Progress 01/01/18 to 12/31/21
Outputs Target Audience:Our target audience includes local seafood industry partners and business associates in the Chesapeake Bay area. We have developed highly collaborative partnerships with local crab meat picking facilities for the adoption of computer-vision-guided processing machines. The target audience also encompasses academics and other scientific communities interested in the smart food manufacturing and automation field. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided? How have the results been disseminated to communities of interest? We disseminated our results in several ways. Firstly, we demonstrated our results by making videos and showcasing our work to Chesapeake Blue crab processing companies such as the J.M. Clayton Seafood company. Secondly, we presented our work at conferences, in peer-reviewed journal publications, and in demonstrations to peers at the Maryland Robotics Center. Thirdly, we presented our work to the government agencies that fund our project in venues such as the NSF/NIFA annual PI meetings. We also had the opportunity to show our prototype machine's capabilities to the NIFA national program leader during his in-person lab visits. What do you plan to do during the next reporting period to accomplish the goals?
Nothing Reported
Impacts What was accomplished under these goals?
Through intensive research and experimentation, we developed a working prototype that utilizes smart food processing techniques to aid in extracting meat from Chesapeake Blue crabs. We have furthered the existing body of scientific knowledge of crab morphology, robotic control, and deep learning algorithms. This research is part of our efforts to establish the three foundational components of our robotic system: a mechanical system to hold the crab in a known and indexable position and move it through the system, a vision system capable of capturing necessary information about crab morphology and predicting the location of meat compartments, and a robotic control system capable of using a high-pressure waterjet knife to cut open the crab in preparation for picking.

Starting with the mechanical positioning system, our team used fresh Chesapeake Bay crabs to collect information on the size and shape of the crabs' inner compartments. This data was used to design several iterations of a mechanical holding apparatus that complements the shape and strength of the crab's structure. Each design was further iterated across various surface textures and physical geometries until a design optimized for strength, capability, and manufacturability emerged. This mechanical system serves as a handling system for the crabs throughout the rest of the stations. The finalized design consists of a magnetically coupled holder that attaches to the underlying conveyor belt. The holder is made from stainless steel and is attached via robotic manipulators. Using the conveyor belt geometry and its downward turn downstream of processing, the holder automatically detaches and releases the crabs.

In conjunction with the mechanical hold-down system, we developed a three-station system that works in parallel on a conveyor belt in a FIFO manner. The first station consists of a vision system capable of capturing the required information. A CMOS camera captures an image of each crab's inner morphology. A bespoke deep learning algorithm was then developed to create a general model capable of taking three geometric feature points from each crab and transposing an optimized cut template onto it. Each transposed template contains geometric information from which a cut line can be generated for each crab. Ultimately, the cut lines trace around the edges of each outer meat compartment in order to minimize the effort needed to extract the crab meat. In the second station, the cut lines are passed to a robotic XY gantry, which uses the resulting motion plan to guide the waterjet knife cutting system in executing the cuts. Our team increased the performance of the proposed system by adding a third station. A Z-axis water knife is attached to a linear motor with submillimeter resolution. This motor can be moved in the Z direction to peel off additional sections of chitin from atop the meat compartments, ultimately exposing the crab's Jumbo Lump and chamber meats. This greatly reduces the time and effort it takes a human to extract meat from each compartment.

In summary, we have built the three cornerstones of the project. We have developed a system capable of holding each crab in a predictable position, created an algorithm to image and fit each crab to a template, and built a robotic gantry/water knife system capable of dissecting each crab open to minimize the time and effort required to remove the meat.
Our system is capable of processing crabs at speeds of up to 30 crabs/min/lane, which is a 15-fold increase in productivity. This project enables the synergy of a robot-human team that will make the complex and labor-intensive crab meat harvesting job much easier, safer, and more efficient.
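To make the cut-template transposition step described above more concrete, the following minimal sketch (our illustration, not the production code; the feature points, stored cut-line template, and numbers are assumptions) estimates an affine transform from three detected feature points with OpenCV and applies it to a stored cut-line template:

```python
# Hypothetical sketch: transpose a stored cut-line template onto a crab image
# using three detected geometric feature points. Names and values are
# illustrative assumptions, not the project's actual model or data.
import cv2
import numpy as np

# Canonical feature points on the reference (template) crab, in template pixels.
TEMPLATE_POINTS = np.float32([[120, 80], [320, 80], [220, 260]])

# Cut-line template: an N x 2 polyline traced around the outer meat
# compartments of the reference crab, in the same template coordinates.
TEMPLATE_CUTLINE = np.float32([[140, 100], [200, 90], [260, 110],
                               [300, 160], [240, 220], [160, 200]])

def transpose_cutline(detected_points):
    """Map the reference cut line onto a new crab.

    detected_points: 3 x 2 array of the same three feature points located
    (e.g., by the deep learning model) in the new crab image.
    Returns the cut line as an N x 2 array in the new image's pixel frame.
    """
    # An affine transform is fully determined by three point correspondences.
    A = cv2.getAffineTransform(TEMPLATE_POINTS, np.float32(detected_points))
    ones = np.ones((len(TEMPLATE_CUTLINE), 1), dtype=np.float32)
    homogeneous = np.hstack([TEMPLATE_CUTLINE, ones])   # N x 3
    return homogeneous @ A.T                            # N x 2

# Example: feature points found on an incoming crab image (made-up values).
new_points = np.float32([[130, 95], [335, 88], [235, 275]])
cutline_px = transpose_cutline(new_points)
print(np.round(cutline_px, 1))
```

In a setup like this, the transposed pixel cut line would still need to be converted into gantry coordinates by a separate calibration before cutting.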
Publications
- Type:
Theses/Dissertations
Status:
Published
Year Published:
2021
Citation:
Dongyi Wang. 2021. ADVANCED VISION INTELLIGENT METHODS FOR FOOD, AGRICULTURAL, AND HEALTHCARE APPLICATIONS - VISION GUIDED AUTOMATED CHESAPEAKE BAY BLUE CRAB PROCESSING. Dissertation. Graduate School, University of Maryland Library.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2021
Citation:
Wang, D., Ali, M., Cobau, J., Tao, Y. (2021). Designs of a customized active 3D scanning system for food processing applications. ASABE Paper No. 2100388.
- Type:
Journal Articles
Status:
Published
Year Published:
2020
Citation:
Tao, Y., Wang, D. 2020. Recent Advances of Artificial Intelligence Applications in Food. IUFoST Scientific Information Bulletins (SIB), 2020(9). DOI: 10.13140/RG.2.2.25942.27203.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2021
Citation:
Tao, Y. 2021. Application of Imaging and Vision-Guided Intelligence in Developing Automated Oyster Sorting and Crabmeat Picking Systems. Invited Speaker at the 2021 Atlantic and Gulf Coast Seafood Technology Conference (AGSTC). June 21-22, 2021.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2021
Citation:
Tao, Y., D. Wang, M. Ali. 2021. Machine Vision-Guided Food Manufacturing. 2021 USDA NIFA PI Meeting Presentation. September 29, 2021.
Progress 01/01/20 to 12/31/20
Outputs Target Audience:In year 3, we have continued the research and experiments based on the year 2 progress. An active structural laser imaging system was set up to obtain full-size crab 3D morphologies. The system includes a line laser and a single-axis galvanometer. The galvanometer is controlled by an NI DAQ board, and the reflected laser line position was calibrated with the camera. The scanning angular resolution is 0.0134 degrees, and the accuracy of the 3D imaging system is 0.25 mm. The result can be better matched to the off-line micro-CT model for internal crab cartilage predictions. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided?
Nothing Reported
How have the results been disseminated to communities of interest?
Nothing Reported
What do you plan to do during the next reporting period to accomplish the goals?Our current work plans to employ a fourth station (first station: imaging; second station: leg cutting to expose the crab jumbo meat; third station: Z-cut to expose the crab chamber meat), which consists of a six-degree-of-freedom robotic arm and a meat-picking end effector. The robotic movement is generated by taking the inverse kinematics of a previously learned 3D path, as sketched below. The end effector extracts the jumbo lump meat from the crabs' posterior chambers in a scooping manner.
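As a rough sketch of this planned inverse-kinematics approach, the example below assumes a simplified three-joint positioning model with made-up link lengths and a made-up recorded scoop path; the actual six-degree-of-freedom arm would use its own kinematics, learned path, and controller:

```python
# Illustrative sketch only: numerically invert a toy 3-joint forward-kinematics
# model to follow a recorded 3D "scoop" path. Link lengths and the path are
# assumptions, not project specifics.
import numpy as np
from scipy.optimize import least_squares

L1, L2, L3 = 0.30, 0.30, 0.20  # assumed link lengths in meters

def forward_kinematics(q):
    """End-effector position of a base-yaw + two-pitch-joint arm."""
    yaw, p1, p2 = q
    r = L2 * np.cos(p1) + L3 * np.cos(p1 + p2)
    z = L1 + L2 * np.sin(p1) + L3 * np.sin(p1 + p2)
    return np.array([r * np.cos(yaw), r * np.sin(yaw), z])

def inverse_kinematics(target, q0):
    """Solve for joint angles that reach `target`, warm-started from q0."""
    sol = least_squares(lambda q: forward_kinematics(q) - target, q0)
    return sol.x

# A short recorded scoop path (x, y, z in meters) through a meat chamber.
scoop_path = np.array([[0.35, 0.05, 0.20],
                       [0.36, 0.05, 0.17],
                       [0.38, 0.05, 0.16],
                       [0.40, 0.05, 0.18]])

q = np.zeros(3)
for waypoint in scoop_path:
    q = inverse_kinematics(waypoint, q)  # reuse the last solution as the seed
    print(np.round(np.degrees(q), 1))
```

Warm-starting each solve from the previous waypoint's solution keeps consecutive joint configurations close together, which is what a smooth scooping motion needs.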
Impacts What was accomplished under these goals?
An online 3D scanning system was designed to better describe the crab's 3D surface morphology. The scan can be mapped to off-line crab internal 3D bone structures and better guide the robotic movement to expose the crab chamber meat.
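For illustration, the sketch below shows one laser-line scan step in the same spirit: detect the laser stripe in a (synthetic) camera frame and convert its row displacement into surface height through a pre-fitted linear calibration. The calibration constants and the frame are assumptions, not the system's actual calibration:

```python
# Rough sketch of one laser-line scan step: locate the laser stripe per image
# column and convert its displacement from a reference plane into height (mm).
# All constants and the synthetic frame are illustrative assumptions.
import numpy as np

MM_PER_PIXEL = 0.20      # assumed height change per pixel of stripe shift
REFERENCE_ROW = 400.0    # assumed stripe row on the empty (zero-height) plane

def laser_line_rows(frame):
    """Return the brightest row index for each image column (stripe center)."""
    return np.argmax(frame, axis=0).astype(float)

def rows_to_height_mm(rows):
    """Convert stripe rows to height above the reference plane."""
    return (REFERENCE_ROW - rows) * MM_PER_PIXEL

# Synthetic 480x640 frame with a bright stripe simulating a crab profile.
frame = np.zeros((480, 640), dtype=np.uint8)
cols = np.arange(640)
stripe = (REFERENCE_ROW - 80.0 * np.exp(-((cols - 320) / 150.0) ** 2)).astype(int)
frame[stripe, cols] = 255

profile_mm = rows_to_height_mm(laser_line_rows(frame))
print(f"peak height ~ {profile_mm.max():.1f} mm at column {profile_mm.argmax()}")
```

In practice, the galvanometer angle for each scan line would index these per-line profiles into a full 3D surface before matching against the micro-CT model.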
Publications
- Type:
Conference Papers and Presentations
Status:
Under Review
Year Published:
2021
Citation:
Wang, D., Ali, M., Cobau, J., Tao, Y. 2021. 3D Machine Vision for Sensing and Automation. ASABE 2021 Annual Meeting.
- Type:
Conference Papers and Presentations
Status:
Under Review
Year Published:
2021
Citation:
Ali, M., Wang, D., Tao, Y. 2021. Robotic Extraction of Chesapeake Crab Meat. ASABE 2021 Annual Meeting.
Progress 01/01/19 to 12/31/19
Outputs Target Audience:In this project, we reached academic and industrial audiences. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided?
Nothing Reported
How have the results been disseminated to communities of interest?
Nothing Reported
What do you plan to do during the next reporting period to accomplish the goals?More research and experiments need to be conducted to optimize the design of the crab hold-down mechanism, the fine adjustment of the Z-direction cut, the third gantry for meat extraction, and other related tasks.
Impacts What was accomplished under these goals?
In year 2, we have continued the research and experiments based on the year 1 progress. New accomplishments include:
- Improved the calibration between the camera and the gantry stations to ensure accuracy between the vision system and the robotic waterjet knife cut. Tests showed the system can achieve crab leg removal and crab core extraction at Gantry Station One with <1 mm accuracy (a schematic example of this pixel-to-gantry calibration is sketched after this list).
- Enabled real-time, on-line, continuous batch processing for Gantry Station One core extraction. The on-line rate for waterjet crab core extraction has reached approximately 15 crabs per minute, about 10 times faster than manual operation.
- Explored crab internal 3D morphological bone structures based on off-line micro-CT images to identify the middle deck for robotic separation between crab chamber layers and expose the lump and chamber meat. A crab internal bone morphology model has been established for guiding the crab disassembly.
- Installed a structural laser imaging system to obtain crab height information. It is linked to the off-line micro-CT model, which relates the corresponding crab fatness to the internal crab deck bone structures.
- Connected and synchronized the second gantry station, with Linmot servo control, to the other stations (vision, conveyor, and Gantry Station One). The synchronized servo controls allow a precision internal Z-deck cut to reveal the lump and chamber meat.
In summation, we have prototyped the vision-guided intelligent robotic crab picking machine equipped with a color camera, a laser for 3D imaging, and two gantry stations for crab leg removal, crab core extraction, and crab chamber bone removal. The current prototype has been tested on-line in the lab and is able to perform the main functions of crab disassembly.
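The schematic sketch below illustrates the pixel-to-gantry calibration idea referenced in the first item: fit an affine map from camera pixels to gantry coordinates with least squares and inspect the residual error. The fiducial correspondences and numbers are made up for illustration, not the machine's actual calibration data:

```python
# Hedged sketch of camera-to-gantry calibration: fit an affine map from pixel
# coordinates to gantry (X, Y) in mm using fiducial correspondences, then check
# the residual error. The fiducial data below are made up.
import numpy as np

# Pixel locations of calibration fiducials and their measured gantry positions.
pixels = np.array([[102, 88], [598, 92], [110, 455], [605, 460], [350, 270]], float)
gantry_mm = np.array([[0.0, 0.0], [250.0, 0.0], [0.0, 180.0],
                      [250.0, 180.0], [125.0, 90.0]])

# Solve gantry = [u, v, 1] @ M for the 3x2 affine matrix M (least squares).
A = np.hstack([pixels, np.ones((len(pixels), 1))])
M, *_ = np.linalg.lstsq(A, gantry_mm, rcond=None)

def pixel_to_gantry(uv):
    """Map a pixel coordinate (u, v) to gantry coordinates in mm."""
    return np.append(uv, 1.0) @ M

residual_mm = np.linalg.norm(A @ M - gantry_mm, axis=1)
print("max calibration residual: %.3f mm" % residual_mm.max())
print("pixel (356, 240) ->", np.round(pixel_to_gantry([356, 240]), 2))
```

Checking the worst-case residual against the <1 mm target is how a fit like this would be accepted or rejected before running cuts.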
Publications
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2019
Citation:
Tao, Y. Machine Intelligence Embedded in Food Processing Automation Systems. 2019. IFT Annual Meeting, June 2-4, 2019. New Orleans.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2019
Citation:
Tao, Y. Machine Vision-Guided Food Manufacturing. 2019. IFT Advanced Food Manufacturing program. PI meeting. June 2, 2019. New Orleans.
Progress 01/01/18 to 12/31/18
Outputs Target Audience:In this phase of the project, we reached both industrial and academic audiences. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided?
Nothing Reported
How have the results been disseminated to communities of interest?Yes, to stakeholders in the seafood industry. What do you plan to do during the next reporting period to accomplish the goals?We will continue our research and development along the current path. Our team will work to further improve the results from this reporting period. We will further collaborate with the seafood industry, including demos and Q&A sessions, to optimize the system and process.
Impacts What was accomplished under these goals?
Through intensive research and experimentation, we have furthered the existing body of scientific knowledge of crab morphology, robotic control, and deep learning algorithms. This research was part of our efforts to establish the three foundational components of our robotic system: a mechanical system to hold the crab in a known and indexable position and move it through the system, a vision system capable of capturing necessary information about crab morphology and predicting the location of meat compartments, and a robotic control system capable of using a high-pressure waterjet knife to cut open the crab in preparation for picking.

Starting with the mechanical positioning system, our team used fresh Chesapeake Bay crabs to collect information on the size and shape of the crabs' inner compartments. This data was used to design several iterations of a mechanical holding apparatus that complements the shape and strength of the crab's structure. Each design was further iterated across various surface textures and physical geometries until a design optimized for strength, capability, and manufacturability emerged. This mechanical system serves as a handling system for the crabs through the rest of the stations.

In conjunction with the mechanical system, we developed a vision system capable of capturing the required information. The CCD camera captures an image of each crab's inner morphology. A bespoke deep learning algorithm was developed to create a general model capable of taking three geometric feature points from each crab and transposing an optimized cut template onto it. Each transposed template contains geometric information from which a cut line can be generated for each crab. Each cut line the vision system generates traces around the edges of each outer meat compartment in order to minimize the effort needed to extract the crab meat. The cut line is turned into coordinate points and passed to the robotic waterjet knife cutting system. Our team increased the performance of the proposed system by adding a third axis (Z) to the initially tested X and Y. To accomplish this, our team added a water knife attached to a linear motor. This motor can be moved in the Z direction to peel off additional sections of chitin from the meat compartments, which reduces the amount of time it takes a human to extract meat from the compartments.

In summation, we have built the three cornerstones of the project. We developed a system capable of holding each crab in a predictable position, created an algorithm to image each crab and fit it to a template, and built a robotic gantry/water knife system capable of cutting each crab open to minimize the time and effort required to remove the meat.
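To illustrate how a generated cut line can be turned into coordinate points for the gantry, the sketch below (an assumption-laden illustration, not the machine's actual motion code) maps a pixel-space polyline into gantry millimeters with a given affine transform and resamples it into evenly spaced waypoints:

```python
# Illustrative sketch: turn a pixel-space cut line into evenly spaced gantry
# waypoints for the waterjet knife. The transform, cut line, and command format
# are assumptions; the real machine uses its own calibrated mapping and motion
# controller.
import numpy as np

# Assumed pixel -> gantry(mm) affine transform (e.g., from a prior calibration).
M = np.array([[0.50, 0.00],
              [0.00, 0.50],
              [-50.0, -40.0]])   # rows correspond to u, v, 1

cutline_px = np.array([[140, 100], [200, 90], [260, 110],
                       [300, 160], [240, 220], [160, 200]], float)

def to_gantry_mm(points_px):
    """Apply the affine map to an N x 2 array of pixel points."""
    ones = np.ones((len(points_px), 1))
    return np.hstack([points_px, ones]) @ M

def resample(path_mm, step_mm=10.0):
    """Resample a polyline to roughly uniform spacing for smooth motion."""
    seg = np.linalg.norm(np.diff(path_mm, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    s_new = np.arange(0.0, s[-1], step_mm)
    return np.column_stack([np.interp(s_new, s, path_mm[:, i]) for i in range(2)])

waypoints = resample(to_gantry_mm(cutline_px))
for x, y in waypoints:
    print(f"MOVE X{x:.2f} Y{y:.2f}")   # placeholder motion command format
```

Uniform waypoint spacing is one simple way to keep the cutting speed steady as the gantry traces the compartment edges.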
Publications
- Type:
Journal Articles
Status:
Published
Year Published:
2018
Citation:
Wang, D, R. Vinson, M. Holmes, G. Seibel, Y. Tao. 2018. Convolutional Neural Network Guided Blue Crab Knuckle Detection for Autonomous Crab Meat Picking Machine. J. of Optical Engineering. Vol.57(4), 043103.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2018
Citation:
Wang, D., Holmes, M., Vinson, R., Seibel, G., & Tao, Y. (2018). Machine Vision Guided Robotics for Blue Crab Disassembly---Deep Learning Based Crab Morphology Segmentation. In 2018 ASABE Annual International Meeting. ASABE Paper No. 1800570.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2018
Citation:
Wang, D., Holmes, M., Vinson, R., Seibel, G., & Tao, Y. (2018). Machine Vision Guided Robotics for Blue Crab Disassembly---Deep Learning Based Crab Morphology Segmentation. Presented at the Northeast Agricultural and Biological Engineering Conference (NABEC), Morgantown, WV. July 29-Aug 1, 2018.