Progress 02/01/24 to 01/31/25
Outputs Target Audience: Biological and agricultural engineers, mechanical engineers, materials scientists, robotics researchers, poultry science researchers, and the poultry processing industry. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided? Wang's group had 1 Biological and Agricultural Engineering Ph.D. student, 1 M.S. student, and 1 mechanical engineering M.S. student involved in this project for the development of a reliable 3D imaging system, deep learning based chicken detection algorithms, and an imitation learning method for robotic control. Shou's group had 2 Ph.D. students and 1 M.S. student involved in this project for sensor fabrication and robotic hand development, and 2 undergraduate students involved in robotic hand design and fabrication. She's group had 1 Ph.D. student and 1 undergraduate student involved in this project for setting up a dual-arm robotic manipulation system, collecting data, and training policies for whole chicken manipulation tasks.
How have the results been disseminated to communities of interest? Shou's group presented a poster at the ASME IMECE conference, and the resulting robotic hand was showcased during a summer camp organized by the University of Arkansas for K-12 education. Wang's group presented research results on instance segmentation, synthetic data generation, and novel gripper designs at the ASABE Annual International Meeting and state section meetings. Research results have also been communicated to companies, including Siemens and Mayekawa, during IPPE conferences. She's group presented research results on imitation learning for artificial chicken grasping and manipulation at the ASABE Annual International Meeting and state section meetings in Anaheim, CA. In addition, She's research on tactile-reactive controller design was presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) in Abu Dhabi, UAE.
What do you plan to do during the next reporting period to accomplish the goals? For the tactile sensor embedded robotic hand, we will refine the sensor array fabrication and wire connections, and update the hand design and integration strategy for easy system integration. We will further innovate the laser direct printing configuration to manufacture sensitive and robust tactile sensor arrays that are easy to assemble and integrate with robotic hands. Several sensor array designs will be evaluated and characterized, then integrated with the robotic hands (both an industrial robotic gripper and home-developed hand prototypes). The robotic hand is currently actuated by a pneumatic mechanism; we will look into other mechanisms, such as a tendon-driven mechanism, to enhance the payload. On the vision side, using the generated depth dataset, a deep learning based stereo matching method will be developed to improve 3D reconstruction accuracy and achieve high-resolution, high-accuracy imaging. In addition, Blender-based synthetic data generation will be evaluated with different deep learning models, and Blender-generated depth information will be explored further. For chicken grasping and manipulation, building on the initial success of using imitation learning for simple autonomous robotic manipulation of artificial chickens, our next step is to expand the complexity and realism of these tasks. We will extensively evaluate the diffusion policy on chicken rehanging with the customized gripper (the core training step behind this approach is sketched below). In addition, multi-modality state inputs, including vision and tactile sensing, will be tested for performance comparison, aiming to improve the finesse of chicken handling by providing sensitive feedback on grip forces and slip.
Our initial investigation has shown promising results in leveraging imitation learning for whole chicken manipulation. By the next reporting period, we hope to achieve more intricate procedures such as chicken picking, lifting, placing, and hanging.
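As background for the planned diffusion policy evaluation, the following is a minimal sketch of the DDPM-style training step that diffusion policy builds on: Gaussian noise is added to a demonstrated action sequence and a conditional network learns to predict that noise. The network architecture, dataset shapes, and noise schedule here are illustrative placeholders, not the project's actual implementation.

```python
import torch
import torch.nn as nn

# Illustrative dimensions: 6-DOF arm + 2 gripper DOFs, predicted over a short horizon.
ACT_DIM, HORIZON, OBS_DIM, N_STEPS = 8, 16, 64, 100

# Simple linear noise schedule (the published diffusion policy uses squared-cosine).
betas = torch.linspace(1e-4, 2e-2, N_STEPS)
alpha_bars = torch.cumprod(1.0 - betas, dim=0)

# Placeholder denoiser: predicts the injected noise from the noisy action
# sequence, the diffusion step index, and an observation embedding.
denoiser = nn.Sequential(
    nn.Linear(HORIZON * ACT_DIM + 1 + OBS_DIM, 256),
    nn.ReLU(),
    nn.Linear(256, HORIZON * ACT_DIM),
)
opt = torch.optim.Adam(denoiser.parameters(), lr=1e-4)

def training_step(actions, obs_emb):
    """One DDPM training step on a batch of demonstrated action sequences."""
    b = actions.shape[0]
    t = torch.randint(0, N_STEPS, (b,))                 # random diffusion step
    noise = torch.randn_like(actions)                   # target the net must predict
    a_bar = alpha_bars[t].view(b, 1, 1)
    noisy = a_bar.sqrt() * actions + (1 - a_bar).sqrt() * noise
    inp = torch.cat(
        [noisy.flatten(1), t.float().unsqueeze(1) / N_STEPS, obs_emb], dim=1
    )
    loss = nn.functional.mse_loss(denoiser(inp), noise.flatten(1))
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

# Example call with random stand-in data.
print(training_step(torch.randn(32, HORIZON, ACT_DIM), torch.randn(32, OBS_DIM)))
```

At inference time, the policy iteratively denoises a random action sequence conditioned on the current observation; the actual evaluation will use the published diffusion policy implementation rather than this toy setup.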
Impacts What was accomplished under these goals?
For sensor fabrication: we have successfully fabricated microscale electrodes (lines) for tactile sensor array assembly using the laser direct writing technique. The electrodes have a linewidth of ~50 microns; at the given lengths (10-20 cm), their resistances are in the single-digit Ω range. We have successfully fabricated functional 8x8 tactile sensor arrays, as well as robotic hands actuated by compressed gas. We are now optimizing the wire connections, integrating the tactile sensor arrays with grippers and robotic hands, and developing control strategies for the robotic hands.
For the computer vision system, a low-cost dual-laser active scanning system composed of line lasers, galvanometers, and a color camera has been set up. The low-cost system achieves promising depth accuracy compared to commercial Intel RealSense cameras. Using the customized laser scanning system, 160 RGB-depth images of whole chicken carcasses were collected. Trained with 70% of the data (with 20% held out for validation), the depth-aware semantic RGB-D Mask R-CNN model achieves 0.68 mAP for pixel segmentation and an 8.99-pixel centroid prediction shift for the topmost chicken on the test dataset, outperforming Mask R-CNN applied directly to the RGB images. The depth-aware model also mitigates instance segmentation errors for stacked carcasses with heavy occlusion. To further improve segmentation performance, the team used Blender, an open-source 3D creation tool, to automatically generate synthetic datasets, with customized Python code controlling the camera positions; ground truth labels are generated automatically during rendering (a minimal scripting sketch appears at the end of this section). Using this approach, we generated 1000 synthetic samples, each containing an RGB image and instance masks. The synthetic dataset provides robust training data, helping the model learn to handle overlapping and closely positioned carcasses effectively.
On the robotic control side, a customized robotic end-effector with dual two-finger grippers was designed; it is controlled via an Arduino board to grasp the two leg joints of a chicken carcass and rehang the carcass onto shackles. A diffusion policy model was trained on the 6 degrees of freedom of the robotic arm and the 2 degrees of freedom of the gripper fingers to achieve end-to-end robot control. Another primary accomplishment has been the development and publication of LeTac-MPC (Learning Model Predictive Control), a novel learning-based tactile-reactive control framework. The framework integrates high-resolution tactile sensing with advanced control algorithms to achieve robust, adaptive grasping, which can be applied to chicken grasping and rehanging and addresses the specific challenges posed by the irregular shapes and delicate nature of poultry. The controller incorporates a differentiable MPC layer combined with a neural network to process tactile feedback and make real-time adjustments during grasping. This design enables robust performance when handling objects with varying physical properties and under dynamic or force-interactive scenarios. Trained on standardized blocks with diverse physical properties, LeTac-MPC has demonstrated strong generalizability, making it suitable for handling objects of different shapes, materials, and textures.
This adaptability is critical for ensuring precise, secure handling of poultry while minimizing damage during grasping and rehanging. Experimental results have validated the controller's robustness and its ability to maintain stable grasps under varying external forces, demonstrating its potential for real-world poultry processing applications. By dynamically adjusting grip force based on real-time tactile feedback, the system supports the project's goal of integrated and autonomous poultry processing.
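The Blender pipeline described above was driven by custom Python; the sketch below illustrates the camera-randomization idea using Blender's bpy API. It assumes a .blend file already containing the carcass models and a camera, the output path is a placeholder, and instance masks would be extracted from the object-index pass, which is only enabled here.

```python
# Run inside Blender: randomize the camera pose over piled carcass models
# and render RGB frames; the object-index pass supplies per-instance masks.
import math
import random
import bpy

scene = bpy.context.scene
cam = scene.camera  # assumes the .blend file already contains a camera

# Enable the object-index pass ("ViewLayer" is the default view-layer name).
scene.view_layers["ViewLayer"].use_pass_object_index = True

for i in range(1000):  # 1000 synthetic samples, as reported
    # Sample a camera position on a hemisphere above the scene origin.
    r = random.uniform(1.2, 2.0)
    theta = random.uniform(0.0, 2.0 * math.pi)
    phi = random.uniform(math.radians(30), math.radians(80))
    cam.location = (
        r * math.sin(phi) * math.cos(theta),
        r * math.sin(phi) * math.sin(theta),
        r * math.cos(phi),
    )
    # Aim the camera at the origin (valid for the default 'XYZ' rotation mode).
    cam.rotation_euler = (phi, 0.0, theta + math.pi / 2)

    scene.render.filepath = f"//synthetic/rgb_{i:04d}.png"  # placeholder path
    bpy.ops.render.render(write_still=True)
```

Randomizing the viewpoint (and, in the full pipeline, lighting and carcass poses) is what lets 1000 rendered samples cover the overlapping and closely stacked configurations that are hard to capture and label manually.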
Publications
- Type:
Peer Reviewed Journal Articles
Status:
Published
Year Published:
2024
Citation:
Ali, M. A., Wang, D., & Tao, Y. (2024). Active Dual Line-Laser Scanning for Depth Imaging of Piled Agricultural Commodities for Itemized Processing Lines. Sensors, 24(8), 2385
- Type:
Peer Reviewed Journal Articles
Status:
Published
Year Published:
2024
Citation:
Mahmoudi, S., Davar, A., Sohrabipour, P., Bist, R. B., Tao, Y., & Wang, D.# (2024). Leveraging Imitation Learning in Agricultural Robotics: A Comprehensive Survey and Comparative Analysis. Frontiers in Robotics and AI, 11, 1441312.
- Type:
Peer Reviewed Journal Articles
Status:
Published
Year Published:
2024
Citation:
Tushar, N., Wu, R., She, Y., Zhou, W. and Shou, W., 2024. Desktop-scale robot tape manipulation for additive manufacturing. Device.
- Type:
Peer Reviewed Journal Articles
Status:
Published
Year Published:
2024
Citation:
Zhengtong Xu and Yu She, LeTO: Learning Constrained Visuomotor Policy with Differentiable Trajectory Optimization, IEEE Transactions on Automation Science and Engineering (TASE)
- Type:
Peer Reviewed Journal Articles
Status:
Published
Year Published:
2024
Citation:
Zhengtong Xu and Yu She, LeTac-MPC: Learning Model Predictive Control for Tactile-reactive Grasping, IEEE Transactions on Robotics (TRO)
- Type:
Other Journal Articles
Status:
Under Review
Year Published:
2024
Citation:
Xu, Z., Uppuluri, R., Zhang, X., Fitch, C., Crandall, P.G., Shou, W., Wang, D. and She, Y., 2024. UniT: Unified tactile representation for robot learning. arXiv preprint arXiv:2408.06481. IEEE Robotics and Automation Letters, under review
DOI: 10.48550/arXiv.2408.06481
- Type:
Other Journal Articles
Status:
Under Review
Year Published:
2024
Citation:
Bryan, C., Crandall, P., McFadden B., Wang, D., Obe T., Joulroyd J., Sawyer J., Feng Y. 2024. Workspace interventions to mitigate work related musculoskeletal disorders in meat processing plant workers: current knowledge and future prospects. Safety and Health at Work, under review.
- Type:
Other Journal Articles
Status:
Under Review
Year Published:
2024
Citation:
Sohrabipour, P., Pallerla, C., Davar, A., Mahmoudi, S., Feng, Y., Bist, R., Crandall, P., Shou, W., She, Y., Wang, D. Cost-effective active laser scanning system for depth-aware deep-learning-based instance segmentation for poultry processing. Submitted to AgriEngineering, 2025.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2024
Citation:
Feng Y., Wang D., (2024) Synthetic Data Augmentation for Chicken Carcass Instance Segmentation with Mask Transformer. In 2024 American Society of Agricultural and Biological Engineers (ASABE) Annual International Meeting. Anaheim, CA
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2024
Citation:
Ariadna Ramirez, Youwen Liang, Wan Shou. Dexterous Soft Robotic Hand With Rich Sensing Feedback, ASME IMECE, Portland, Oregon, 2024.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2024
Citation:
Sohrabipour P., Shou W., She Y., Wang D., (2024) Depth image guided Mask-RCNN model for chicken detection in poultry processing line. In 2024 American Society of Agricultural and Biological Engineers (ASABE) Annual International Meeting. Anaheim, CA
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2024
Citation:
Sohrabipour P., Mahmoudi S., She Y., Shou W., Pallerla C., Schrader L., Wang D. (2024), Advanced Poultry Automation: Integrating 3D Vision Reconstruction and Mask R-CNN for Efficient Chicken Handling. In 2024 the Third Annual Artificial Intelligence in Agriculture Conference. College Station, TX [First place winner]
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2024
Citation:
Wang D., Shou W., She Y., (2024) Multimodal sensors guided robotic chicken grasping and rehanging for integrated and autonomous poultry processing. In NSF National Robotics Initiative (NRI) Principal Investigators meeting. Baltimore, MD
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2024
Citation:
Davar, A., Xu, Z., Mahmoudi, S., Pallerla, C., She, Y., Shou, W., Wang, D., (2024) Diffusion Policy-Based Imitation Learning for Gripping and Manipulation of Delicate and Irregularly Shaped Objects with a Customized Dual-Fingered End Effector. In 2024 ASABE Arkansas Section Meeting.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2024
Citation:
Yuhao Zhou, Pokuang Zhou, Shaoxiong Wang, Yu She, In-Hand Singulation and Scooping Manipulation with a 5 DOF Tactile Gripper, 2024 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), October 14-18, 2024, Abu Dhabi, UAE.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2024
Citation:
Zhengtong Xu, Raghava Uppuluri, Wan Shou, Dongyi Wang, and Yu She, Whole Chicken Manipulation via Imitation Learning, American Society of Agricultural and Biological Engineers (ASABE) 2024 Annual International Meeting, July 28-31, 2024, Anaheim, California, USA
Progress 02/01/23 to 01/31/24
Outputs Target Audience: Biological and agricultural engineers, mechanical engineers, materials scientists, robotics researchers, poultry science researchers, and the poultry processing industry. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided? Wang's group had 1 Biological and Agricultural Engineering Ph.D. student, 1 M.S. student, and 1 undergraduate student involved in this project for quantifying work-related musculoskeletal disorders in poultry rehanging and for the development of high-speed, high-resolution 3D imaging sensors. Wang also founded the agricultural robotics club at the University of Arkansas. An undergraduate group was supported to compete in the ASABE 2023 robotics competition, and two groups are planning to attend the ASABE 2024 robotics competition. Shou's group had 1 Ph.D. student and 1 M.S. student involved in this project for sensor fabrication, and 6 undergraduate students involved in robotic hand design and fabrication. In addition, a high school student in Dr. Shou's group worked on the development of a soft robotic finger controller. She's group had 1 Ph.D. student, 1 M.S. student, and 2 undergraduate students involved in this project for setting up a dual-arm robotic manipulation system, collecting data, and training policies for whole chicken manipulation tasks. Owens' group had 1 Poultry Science M.S. student involved in the project for preparing poultry samples. The team also organized several poultry processing plant visits (Simmons Foods, Siloam Springs, AR, and Pilgrim's, De Queen, AR) to help students better understand the chicken rehanging task. How have the results been disseminated to communities of interest?
Nothing Reported
What do you plan to do during the next reporting period to accomplish the goals? For the tactile sensor embedded robotic hand, we will refine our hand design and manufacture prototypes. We will further optimize the laser direct writing parameters to manufacture tactile sensor arrays. Several sensor array designs will be assembled and evaluated, then integrated with the robotic hand prototypes. The robotic hand is currently actuated by a pneumatic mechanism; we will look into other mechanisms, such as a tendon-driven mechanism, to enhance the payload. On the vision side, several improvements are planned for the following years: improving the detection accuracy of chicken boundaries, improving the model's generalization across different backgrounds, and adding more training images for segmenting stacked chickens. An additional laser will also be added to the current 3D scanning system to avoid obstruction issues. For chicken grasping and manipulation, our initial investigation shows promising results in leveraging imitation learning for whole chicken manipulation. Building on this initial success with simple autonomous robotic manipulation of artificial chickens, our next step is to expand the complexity and realism of these tasks. By the next reporting period, we hope to achieve more intricate procedures such as chicken picking, lifting, placing, and hanging. Integrating tactile sensing into the robotic fingers is another significant focus for next year, aiming to improve the finesse of chicken handling by providing sensitive feedback on grip forces and slip (a simple illustrative feedback loop is sketched below).
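As a simple illustration of the planned tactile feedback loop, the sketch below ramps pneumatic pressure while monitoring an 8x8 tactile array and stops once a force threshold is crossed. The driver functions, threshold, and pressure limit are hypothetical stand-ins for the actual sensor and actuator interfaces (the sensor readout is simulated so the sketch runs as-is).

```python
import time
import numpy as np

FORCE_THRESHOLD = 2.0   # N, illustrative contact threshold
MAX_PRESSURE = 60.0     # kPa, illustrative actuation limit

def read_tactile_array() -> np.ndarray:
    """Hypothetical driver call; simulated here with small random taxel forces (N)."""
    return np.abs(np.random.normal(0.1, 0.05, size=(8, 8)))

def set_gripper_pressure(kpa: float) -> None:
    """Hypothetical driver call commanding the pneumatic actuator (no-op here)."""
    pass

def close_until_contact(step_kpa: float = 2.0, period_s: float = 0.02) -> float:
    """Ramp pressure until the peak taxel force crosses the contact threshold."""
    pressure = 0.0
    while pressure < MAX_PRESSURE:
        forces = read_tactile_array()
        if forces.max() >= FORCE_THRESHOLD:
            return pressure          # firm contact established; hold this pressure
        pressure += step_kpa
        set_gripper_pressure(pressure)
        time.sleep(period_s)
    return pressure                  # limit reached without crossing the threshold

print(f"held at {close_until_contact():.1f} kPa")
```

Slip detection would extend this loop by watching for sudden drops or shear-pattern changes across the taxels and increasing grip force in response.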
Impacts What was accomplished under these goals?
In our preliminary study of sensor fabrication, we successfully fabricated microscale electrodes (lines) for tactile sensor array assembly. Four lines with a width of ~50 microns were made by laser sintering of silver nanoparticles and show decent conductivity (~25 Ω/cm). This writing process is ready for tactile sensor array fabrication. We have designed several versions of robotic hands and manufactured some components. Meanwhile, we have fabricated several tactile sensor arrays through manual assembly, which are integrated with robotic grippers to provide force feedback. Our tactile sensor-embedded gripper shows better control of the gripper with a set force threshold.
For the computer vision system setup, we collected around 300 chicken images with corresponding manually labeled segmentation ground truth masks. A Mask R-CNN model was trained as the baseline with 80% of the data, and the remaining 20% was used for model validation. Over 100 training epochs, the best validation loss reached 0.04. The instance segmentation results can provide chicken posture information for guiding robotic movement. To acquire the 3D posture information of the chicken, an active laser scanning system was developed, composed of a line laser, a galvanometer, and a color camera. The galvanometer mirror is controlled by an Arduino that is synchronized with image acquisition. Meanwhile, we initially employed the RGB-depth Intel RealSense camera to quantify the laborers' body movements during the chicken hanging process. The system uses machine learning for human pose tracking, focusing primarily on the workers' shoulder, elbow, and wrist movements while rehanging chickens from tables onto shackles (a minimal tracking sketch appears after this section). To quantify the accuracy of the machine learning-based MediaPipe joint detection model in this application, markers were manually placed on the three joint key points as ground truths. Overall, the MediaPipe framework demonstrates a high degree of accuracy in tracking human joints.
On the robotic control side, we used the Aloha bimanual robot hardware platform to develop and test algorithms for autonomous whole chicken grasping and manipulation. The Aloha system provides a low-cost, easy-to-operate teleoperation setup for collecting many high-quality robot demonstrations of manipulation tasks. It consists of two pairs of robot arms, each with a leader and a follower. The leader and follower arms are kinematically equivalent, so teleoperation simply syncs the joint states between them, which makes the collection of human demonstrations very convenient. Four cameras on the Aloha system capture the state of the chicken placed on the table, and the image data serve as input to our training algorithms. We evaluated two state-of-the-art imitation learning methods, diffusion policy and Long Short-Term Memory with a Gaussian Mixture Model output (LSTM-GMM), on the whole chicken pushing task, demonstrating that imitation learning can acquire whole chicken manipulation skills. The task involved the robot's end-effector moving to a position suitable for pushing the whole chicken and then pushing it into a designated region. We used a rubber chicken toy as our experimental object, which possesses reasonably realistic characteristics of a chicken. The initial position and orientation of the chicken varied slightly in each test.
The irregular shape of the whole chicken added to the challenge of the task. If the robotic arms push at an improper location, or fail to adjust their actions and coordinate in real time based on observations, the task can ultimately fail.
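The MediaPipe-based joint tracking described above follows the standard MediaPipe Pose API; a minimal sketch is shown below. The video path is a placeholder, and the actual study additionally compared detections against manually placed markers.

```python
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
JOINTS = [
    mp_pose.PoseLandmark.RIGHT_SHOULDER,
    mp_pose.PoseLandmark.RIGHT_ELBOW,
    mp_pose.PoseLandmark.RIGHT_WRIST,
]

cap = cv2.VideoCapture("rehang_video.mp4")  # placeholder recording
with mp_pose.Pose(static_image_mode=False) as pose:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV decodes to BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks is None:
            continue
        # Normalized (x, y) image coordinates for the three tracked joints.
        coords = {
            j.name: (results.pose_landmarks.landmark[j].x,
                     results.pose_landmarks.landmark[j].y)
            for j in JOINTS
        }
        print(coords)
cap.release()
```

Converting the normalized coordinates to pixels (by multiplying by frame width and height) gives the joint trajectories used to characterize shoulder, elbow, and wrist motion during rehanging.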
Publications
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2023
Citation:
Wang D. (2023), Robust vision intelligence for food quality evaluation. In USDA AMS seminar
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2023
Citation:
Feng, Y., Mergner F., Crandall P., Pallerla C., Wang D.# (2023), Cost-Effective Video Camera System to Estimate Workers' Risk of Work-Related Musculoskeletal Disorders (WRMSD). In 2023 American Society of Agricultural and Biological Engineers (ASABE) Annual International Meeting.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2023
Citation:
Wang D. (2023), Vision Intelligence for Smart Food Manufacturing. In 2023 American Society of Agricultural and Biological Engineers (ASABE) Annual International Meeting.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2023
Citation:
Wang D. (2023), Vision Intelligence for Smart Food Manufacturing. In 2023 ASABE Arkansas Section Annual Meeting.
- Type:
Conference Papers and Presentations
Status:
Submitted
Year Published:
2024
Citation:
Yihong Feng, Wan Shou, Yu She and Dongyi Wang. Synthetic Data Augmentation for Chicken Carcass Instance Segmentation with Mask Transformer. Submitted to 2024 American Society of Agricultural and Biological Engineers (ASABE) Annual International Meeting.
- Type:
Conference Papers and Presentations
Status:
Submitted
Year Published:
2024
Citation:
Zhengtong Xu, Raghava Uppuluri, Wan Shou, Dongyi Wang, and Yu She, Whole Chicken Manipulation via Imitation Learning, submitted to 2024 American Society of Agricultural and Biological Engineers (ASABE) Annual International Meeting.
- Type:
Journal Articles
Status:
Published
Year Published:
2023
Citation:
Xiang, L., & Wang, D.* (2023). A Review of Three-dimensional Vision Techniques in Food and Agriculture Applications. Smart Agricultural Technology, 100259.
- Type:
Journal Articles
Status:
Published
Year Published:
2023
Citation:
Sheeraz Athar, Gaurav Patel, Zhengtong Xu, Qiang Qiu, and Yu She. VisTac: Multi-Modal Sensing Finger for Robot Manipulation. IEEE Sensors Journal. September 2023, DOI: 10.1109/JSEN.2023.3310918.
- Type:
Journal Articles
Status:
Under Review
Year Published:
2024
Citation:
Tushar, N., Wu, R., She, Y., Zhou, W. and Shou*, W., 2024. Robot Tape Manipulation for 3D Printing. arXiv preprint arXiv:2401.08982. Under review, Advanced Intelligent Systems.
- Type:
Journal Articles
Status:
Under Review
Year Published:
2024
Citation:
Zhengtong Xu and Yu She, LeTO: Learning Constrained Visuomotor Policy with Differentiable Trajectory Optimization, IEEE Robotics and Automation Letters. (Under Review).
- Type:
Journal Articles
Status:
Submitted
Year Published:
2024
Citation:
Mahmoudi, S., Wang, D.* (2023) Leveraging Imitation Learning in Agricultural Robotics: A Comprehensive Survey and Comparative Analysis.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2023
Citation:
Wang D., Shou W., She Y., (2023) Multimodal sensors guided robotic chicken grasping and rehanging for integrated and autonomous poultry processing. In NSF National Robotics Initiative (NRI) Principal Investigators meeting.