Source: UNIV OF PENNSYLVANIA submitted to
ROBOT SWARMS AND HUMAN SCOUTS FOR PERSISTENT MONITORING OF SPECIALTY CROPS
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
TERMINATED
Funding Source
Reporting Frequency
Annual
Accession No.
1007026
Grant No.
2015-67021-23857
Project No.
PENW-2015-08504
Proposal No.
2015-08504
Multistate No.
(N/A)
Program Code
A7301
Project Start Date
Sep 1, 2015
Project End Date
Feb 28, 2019
Grant Year
2015
Project Director
KUMAR, R.
Recipient Organization
UNIV OF PENNSYLVANIA
(N/A)
PHILADELPHIA, PA 19104
Performing Department
1306 - Mechanical Engineering
Non Technical Summary
Unmanned Aerial Vehicles provide outstanding opportunities to have a transformative impact on farming technology. This proposal outlines research designed to help farmers improve the productivity and profitability of their farms, thus strengthening rural economies and increasing the economic competitiveness of U.S. agriculture.

The motivation for this project comes from the challenges faced by the $60B (US only) specialty crop industry (over $600B worldwide). While the three target crops in this proposal account for a small fraction of this industry ($6B in the US), they are representative crops, and solutions developed for them can be extended and applied to the entire industry. Florida had an estimated production value of $76.4 million on 4,300 acres of blueberries and $1.27 billion on 476,300 acres of citrus in 2014; the tomato production value was $455.9 million on 34,000 acres in 2013. The three problems addressed in this proposal - estimating yield, detecting crop stress, and monitoring pests - are likewise representative of the problems faced by this industry.

Estimating yield prior to harvest is crucial for growers to better manage their resources. Yield estimation of specialty crops is especially important because of the high economic value of these crops. Early yield prediction gives growers more opportunities to set a market price and plan post-harvest logistics. Accurate counts of immature (green) fruit on all trees across the orchard will indicate the quantity of fruitlets on individual trees and increase efficiency in the cataloging and marketing of product. In addition, this information will guide advance manual thinning of the initial fruit set, permitting the grower to optimize the size and quantity of fruit.

Detecting crop stress will enable corrective action that includes irrigation and application of fertilizer.
Abiotic stresses such as water stress or nutrient deficiency are common problems in crop production, and their symptoms sometimes resemble those of biotic stresses. One objective of this study is to investigate methods of differentiating abiotic stresses from biotic ones for more accurate decision-making.

Diseases and pest infestations are a common threat to crop production. In blueberry production, major diseases include mummy berry, phytophthora root rot, botrytis blight, and septoria leaf spot; they are common in both Florida and Pennsylvania. The onset of many of these diseases is rapid, and the time from infection to symptom appearance and to death of the plant is short. Early detection is essential because these diseases spread rapidly if treatment is not timely. Finally, some diseases and pests are not exclusive to blueberries, tomatoes, or citrus; because they have many hosts and are highly mobile, they can become a problem at any time during the season, so persistent scouting is necessary.

It is common practice for human scouts to inspect farms periodically to estimate yield, detect crop stress, and monitor pest density. Although the expert knowledge of a human scout is invaluable, it is impractical for humans to conduct thorough inspections and collect the quantitative data necessary for precision farming. A swarm of low-flying co-robots working synergistically with one or more human scouts can enable efficient information gathering to optimize the use of resources such as water, labor, and fertilizers. A swarm can collect information either opportunistically or deliberately over several fields in a small time interval. This ability to conduct operations in a small, prescribed time window is particularly useful for gathering consistent stress data. A swarm can alert the human scout and direct her to possible "trouble spots" in a field with low vigor or yield. Typically, a human scout uses a truck to conduct inspections.
The swarm can leverage automated charging stations on the truck to recharge, allowing for optimal coverage while adapting to the paths chosen by the scout as well as the intelligence the scout gathers. Thus a human scout assisted by a swarm of co-robots and a decision support system can monitor large areas and gather actionable intelligence at an unprecedented scale. UAVs are game changers: they will have a transformative impact on farming technology.

Beyond the broader impacts of the research, we will work closely with national agricultural groups to disseminate the results of the research conducted. The technology described in this proposal also provides a unique opportunity to connect STEM education with agriculture. As described above, we will do this by developing courseware at all levels, ranging from middle school students to graduate students. While our work in Philadelphia will help us target a primarily low-income and underrepresented minority population, the work in Florida will help us reach out to growers, farmers, and crop consultants to implement these units. We will also reach out to the general public through demonstrations, public lectures, and websites.
Animal Health Component
0%
Research Effort Categories
Basic
70%
Applied
20%
Developmental
10%
Classification

Knowledge Area (KA) | Subject of Investigation (SOI) | Field of Science (FOS) | Percent
216 | 2499 | 2020 | 30%
402 | 7310 | 2020 | 40%
203 | 1120 | 1020 | 10%
203 | 1460 | 1020 | 10%
203 | 0999 | 1020 | 10%
Goals / Objectives
In an effort to reduce the amount of food lost annually between production and consumption by rendering the agricultural systems that generate food, fiber, and feed smarter and more efficient, this project brings together research groups from the University of Pennsylvania and the University of Florida to address the problems of:
(a) obtaining timely estimates of yield;
(b) diagnosing crop stress; and
(c) detecting pests,
to enable efficient utilization of resources such as labor, water, fertilizers, and pesticides, as well as risk management and financial planning for specialty crops.

Our focus is on three representative crops - blueberries, tomatoes, and citrus - each with a different approach to farming (trellis-based, close-to-ground rows, and trees, respectively), but representative of other crops such as apples, peaches, and strawberries. We propose the synergistic use of a swarm of Unmanned Aerial Vehicles (UAVs) that operate with human scouts. Customized sensors carried by low-flying co-robots working synergistically with human scouts will enable production of high-resolution, multidimensional maps, which can be used to monitor large farms at unprecedented spatio-temporal resolutions, greatly improving the efficiency and yield of farm operations.
Although UAVs have been used successfully for precision agriculture before, we will, for the first time, investigate autonomous flight at low altitudes to enable detailed, multidimensional maps from side views. We will develop the framework and algorithms to deploy multiple UAVs that can collaborate with and be controlled by a single human scout. We will also develop a decision support system that will enable a human scout and the swarm of co-robots to operate in concert over extended periods while accommodating constraints on sensing, navigation speeds, and power consumption.

In addition to exploring new frontiers in the use of UAV swarms and co-robots, the proposed research will contribute to agricultural science along three directions:
(1) developing automatic stress monitoring for agriculture using inter-row sensing with a swarm of UAVs;
(2) investigating the potential for differentiating water-stressed trees from disease-infected plants or trees by comparing and combining the discrimination results from thermal and hyperspectral sensing; and
(3) investigating the feasibility of pest density monitoring and mapping using RGB images and physical samples obtained from the UAV swarm.
Most importantly, we will develop a novel approach to precision agriculture that provides growers with a data-driven deployment strategy making synergistic use of a networked robotic system working interactively with a human scout.
Project Methods
We propose a swarm of co-robots (UAVs) that assist a human scout (farmer) by providing the information and actionable intelligence required to manage the farm. Specifically, reliable yield estimates will allow the farmer to plan the logistics of harvesting and distribution; monitoring crop stress will allow optimized irrigation and scheduling of nutrient delivery; and detecting pests will permit the farmer to target diseased plants for treatment before pests overrun the farm. The swarms are co-robots: they coordinate with the scout, extending the scout's reach to the entire farm, and they collaborate with the scout, providing information that is complementary and synergistic with the information that can be collected manually.

The scout uses a truck to drive around the farm for inspection. The truck will be instrumented with GPS and equipped with charging stations for the UAVs. As shown in our previous work, each UAV will be capable of landing autonomously: it will use GPS for an initial estimate of the truck's position and onboard sensors to precisely locate its landing/charging station. Landing positions can differ from launch positions, allowing the scout to move independently of the UAVs. We will develop a Decision Support System that will advise the scout of routes likely to yield maximal information and warn of areas out of reach of the robots due to battery limitations.

A semantic map is a metric map composed of regions with labels such as "high crop-yield", "nutrient-stressed", "water-stressed", "high pest-pressure", and "unknown". The map is designed to be easy to interpret. The proposed research will use principled methods from machine learning and planning algorithms to generate an optimal deployment plan for the scout and the swarm to generate the map.

"Efforts": We will work closely with national agricultural groups to disseminate the results of the research conducted.
The technology described in this proposal also provides a unique opportunity to connect STEM education with agriculture. We propose the creation of middle school science units that ask students to study food security issues, challenges faced by farmers, and how robots can help solve those problems. We will work with the Philadelphia School District, which serves a primarily low-income and underrepresented minority population, to implement these units. We will pursue the development of UAV swarms and instructional material for use in undergraduate courses at the University of Pennsylvania. We will also engage the general public through an extensive schedule of demonstrations, including lab tours given to 1,200 K-12 students per year, Philadelphia Science Festival activities that reach thousands of children and adults, Science After Hours presentations at museums in Philadelphia, and the Citrus Show and various field days in Florida.

"Evaluation": The experiments will be carried out in three stages over three years. Through year one, we will develop six UAVs, test the automated recharging system, develop AgDSS, test the multi-spectral sensor package using potted trees, and test UAV control and navigation algorithms, part of which will be carried out at the netted outdoor flight facility at the University of Pennsylvania. In Florida, during year one, we will investigate the spectral reflectance characteristics of major diseases affecting tomatoes and blueberries and create a library of spectral signatures. We will conduct band selection on the spectral data to identify key bands for detection of different diseases; these bands will later be used to select filters for multi-band cameras onboard the UAVs. During year two, we will carry out field testing of our system at blueberry, tomato, and citrus farms in Florida, working in close collaboration with growers. Year three will be used to refine the system and explore opportunities for technology transfer with industry.
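The band-selection step described above can be sketched as a univariate ranking of wavelength bands by class separability. The sketch below is illustrative only and is not the project's code: it uses scikit-learn's ANOVA F-test ranking on synthetic reflectance data, and the band count, class labels, and choice of `f_classif` are our assumptions.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif

# Synthetic stand-in for a spectral library: 60 samples x 50 bands.
rng = np.random.default_rng(0)
n_samples, n_bands = 60, 50
X = rng.normal(size=(n_samples, n_bands))
y = rng.integers(0, 2, size=n_samples)   # 0 = healthy, 1 = diseased

# Make two hypothetical bands genuinely informative for the disease class.
X[y == 1, 10] += 2.0
X[y == 1, 30] += 1.5

# Rank bands by ANOVA F-score and keep the top k as "key bands".
selector = SelectKBest(score_func=f_classif, k=5)
selector.fit(X, y)
key_bands = np.flatnonzero(selector.get_support())
print("selected bands:", key_bands)   # bands 10 and 30 should appear
```

In practice the retained bands would then drive the choice of filters for the multi-band cameras, as the text describes.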
The specific evaluation plan for the three major goals contributing to agricultural science is given next.

Yield estimation - Appropriate sampling strategies for each fruit will be evaluated. High-resolution images will be used for automated fruit counting with machine vision algorithms. For citrus, UAVs will carry out close-range flights between trees. For blueberry vines, UAV flights will be carried out between trellises at an altitude of about five feet. Since some blueberry fruits may be hidden by the canopy structure, we will evaluate sampling strategies to estimate yield from close-range images when only a subset of fruits on the canopy is visible. Tomatoes grow in row structures at ground level, presenting a challenge, and acquisition of ground-truth yield data may require more human supervision. We will correlate the fruit count obtained from select locations with multispectral imagery and canopy characteristics. In each case, estimation accuracy will be evaluated by comparing predicted yield with ground-truth yield at a random set of locations in the farm.

Measuring crop stress and disease - The evaluation will consist of two phases. In the first phase, we will develop a model that detects and differentiates stress and diseases using samples of stressed, diseased, and healthy trees as ground truth. The second phase will use this model to test data-driven sampling strategies that generate maps of stress and disease hotspots. Analysis of variance (ANOVA) will be performed on the spectral data of water-stressed, diseased, and healthy plants at the 0.01 level of significance using Tukey's studentized range test. Several data mining techniques will be applied to select the hyperspectral wavelengths that best discriminate among the studied classes.
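The ANOVA step above can be sketched in a few lines. This is a hedged illustration under assumptions of our own (single-band reflectance values, synthetic group means), not the project's analysis code; it tests whether reflectance differs among the three plant classes and, if so, runs Tukey's studentized range test at the stated 0.01 level.

```python
import numpy as np
from scipy.stats import f_oneway, tukey_hsd

# Synthetic reflectance values in one spectral band for three classes.
rng = np.random.default_rng(0)
healthy  = rng.normal(0.45, 0.02, 20)
stressed = rng.normal(0.38, 0.02, 20)
diseased = rng.normal(0.30, 0.02, 20)

# One-way ANOVA across the three groups.
f_stat, p = f_oneway(healthy, stressed, diseased)
print(f"ANOVA: F = {f_stat:.1f}, p = {p:.2g}")

# Tukey's studentized range test for pairwise group differences,
# applied only when the ANOVA is significant at the 0.01 level.
if p < 0.01:
    result = tukey_hsd(healthy, stressed, diseased)
    print(result)
```

The same pattern would be repeated per wavelength when screening many bands.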
The leave-one-out approach for cross-validation will be used to assess the classification accuracy of each model (multilayer perceptron and radial basis function). We will compare and evaluate the discrimination results for disease detection in three settings: (i) thermal data only; (ii) hyperspectral data only; and (iii) the combination of both. Combining both data sources should reduce the misclassifications obtained when thermal or spectral measurements are used independently. Classification accuracy will be evaluated using ROC analysis, after which we will determine whether both a thermal camera and a spectral camera are necessary, or a single system alone suffices, to properly classify water-stressed, diseased, or healthy plants. Following the training and evaluation of the above models, we will carry out data-driven experiments to generate maps of stress and disease for the whole farm during phase two. Evaluation will be carried out through ground-truth data collection by human scouts.

Pest density estimation - We will evaluate two approaches: (i) pest sticky-tapes mounted on elevated posts in the farm and photographed by UAVs; and (ii) sticky-tapes mounted directly on the UAVs, with a data-driven sampling methodology to guide them to locations of high expected pest density. For the first approach, the relationship between flight altitude and counting and detection accuracy will be determined by carrying out multiple flights over the same site, varying the altitude from 10 ft to 100 ft in increments of 10 ft. For the second approach, the UAV tours will be planned to obtain covariates of pest density, followed by landing at a location where the UAVs stay for a pre-defined dwell time. After automatic or manual retrieval of the UAVs, ex-situ pest counting will be carried out in a lab setting to obtain ground-truth estimates of pest density.
The image processing techniques used in the first approach will also be investigated for automatic onboard pest density counting. The two approaches will be evaluated by comparing the estimated pest density map with manually collected ground-truth data.
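The final comparison step above reduces to scoring an estimated pest density map against a ground-truth map. A minimal sketch, with made-up counts and root-mean-square error as an assumed metric (the project does not specify one):

```python
# Estimated pest counts per grid cell (from UAV imagery) versus
# manually collected ground-truth counts for the same cells.
estimated = [[3, 7, 1],
             [0, 5, 9],
             [2, 2, 4]]
truth     = [[2, 8, 1],
             [0, 4, 10],
             [3, 2, 4]]

# Root-mean-square error over all cells of the density map.
sq_errors = [(e - t) ** 2 for est_row, tru_row in zip(estimated, truth)
             for e, t in zip(est_row, tru_row)]
rmse = (sum(sq_errors) / len(sq_errors)) ** 0.5
print(f"RMSE = {rmse:.2f} pests/cell")
```

A per-cell error map computed the same way would also highlight where the sampling strategy under- or over-counts.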

Progress 09/01/15 to 02/28/19

Outputs
Target Audience: Agronomists, growers, industry, robotics researchers, machine learning researchers.

Changes/Problems: Nothing Reported

What opportunities for training and professional development has the project provided?

Kumar, University of Pennsylvania: Over the course of this project, more than 40 students, ranging from high school through undergraduate, Masters, and PhD levels, participated in a variety of ways. The project served as an initial exposure to robotics, machine learning, and research for many of these students, and provided them with key technical skills to help them succeed in their future endeavors. Alumni of the group have gone on to pursue further education or careers in robotics, with many working at agricultural robotics and machine learning companies and startups. Two postdoctoral researchers associated with the award are now faculty members at Virginia Tech and Arizona State University. Participants included: work-study student Spencer Fox; UPenn VIPER undergraduates Delaney Kaufman and Daniel Orol; Michael Hughes, a Master of Environmental Studies student, who worked on a Capstone thesis project on innovative technologies for environmental monitoring and sustainability; Xin Wang, a Masters student in Environmental Studies (2016) at the University of Pennsylvania, who did her Capstone Project in part with the Penn Agriculture group, accompanied the group to Florida in March 2016, and interviewed growers; and UPenn M&T undergraduate students Sophie Thorel and Jai Ashar.

Field experiments: A joint field experiment with Dr. Jim Schupp at Penn State University's Fruit Research and Extension Center (FREC), Biglerville, and AS&E Inc. in October 2015 demonstrated for the first time the use of backscatter X-ray imaging for detection of apples occluded by canopy. The experiment was carried out at FREC's apple orchards. Two trips were made to Florida in collaboration with Prof.
Ehsani to Driscoll's and Lipman Produce, in December 2015 and March 2016, for collection of visual and backscatter X-ray data. The March 2016 backscatter X-ray scans of strawberries and tomatoes demonstrated a possible application to counting fruits that are significantly occluded by canopy. Two data collection trips were made to Washington state with the Washington Tree Fruit Research Commission in August and October 2015. Nighttime data with active illumination on both green and red apples were collected, along with human-counted ground truth; our counting algorithm applied to these data achieved 95% accuracy against ground truth. The University of Florida carried out a series of data collection experiments with a spectrometer and controlled illumination from a halogen lamp (broad spectrum) to study the ripeness of tomatoes and compare leaves and fruits. Based on these results, the median NDVI value for tomato leaves was determined to be 0.15; NDVI increased almost linearly to 0.17 for raw green, 0.19 for mature green, and 0.22 for red tomato samples. Thus, calculating NDVI simplifies the detection and classification process for tomato samples.

How have the results been disseminated to communities of interest?

Publications (Kumar, UPenn): Xu Liu, Steven W. Chen, Chenhao Liu, Shreyas S. Shivakumar, Jnaneshwar Das, Camillo J. Taylor, James Underwood, and Vijay Kumar, "Monocular Camera Based Fruit Counting and Mapping with Semantic Data Association", IEEE Robotics and Automation Letters (RA-L), 2019. M. Kalischuk, M. L. Paret, J. H. Freeman, D. Raj, S. Da Silva, S. Eubanks, D. J. Wiggins, M. Lollar, J. J. Marois, H. C. Mellinger, and J. Das, "An Improved Crop Scouting Technique Incorporating Unmanned Aerial Vehicle-Assisted Multispectral Crop Imaging into Conventional Scouting Practice for Gummy Stem Blight in Watermelon", Plant Disease, 2019. Xu Liu, Steven W. Chen, Shreyas Aditya, Nivedha Sivakumar, Sandeep Dcunha, Chao Qu, Camillo J.
Taylor, Jnaneshwar Das, and Vijay Kumar, "Robust Fruit Counting: Combining Deep Learning, Tracking, and Structure from Motion", accepted to the International Conference on Intelligent Robots and Systems (IROS), 2018. Daniel Orol, Jnaneshwar Das, Lucas Vacek, Isabella Orr, Mathews Paret, Camillo J. Taylor, and Vijay Kumar, "An Aerial Phytobiopsy System: Design, Evaluation, and Lessons Learned", International Conference on Unmanned Aircraft Systems (ICUAS), 2017. Lucas Vacek, Edward Atter, Pedro Rizo, Brian Nam, Ryan Kortvelesy, Delaney Kaufman, Jnaneshwar Das, and Vijay Kumar, "SUAS for Deployment and Recovery of an Environmental Sensor Probe", International Conference on Unmanned Aircraft Systems (ICUAS), 2017. Steven W. Chen, Shreyas Skandan, Sandeep Dcunha, Jnaneshwar Das, Chao Qu, Camillo J. Taylor, and Vijay Kumar, "Counting Apples and Oranges with Deep Learning: A Data-Driven Approach", IEEE Robotics and Automation Letters (RA-L), presented at the IEEE International Conference on Robotics and Automation (ICRA), 2017. Jnaneshwar Das, Gareth Cross, Chao Qu, Anurag Makineni, Pratap Tokekar, Yash Mulgaonkar, and Vijay Kumar, "Devices, Systems, and Methods for Automated Monitoring Enabling Precision Agriculture", IEEE International Conference on Automation Science and Engineering (CASE), 2015.

Patents: Systems, Devices, and Methods for Robotic Remote Sensing for Precision Agriculture. V. Kumar, G. Cross, C. Qu, J. Das, A. Makineni, Y. Mulgaonkar (US20170372137). Systems, Devices, and Methods for Agricultural Sample Collection. D. Orol, L. Vacek, D. Kaufman, J. Das & V. Kumar (US patent filed, May 2018).

Jointly with Ehsani (Penn + UFL): Suproteem K. Sarkar, Jnaneshwar Das, Reza Ehsani, and Vijay Kumar, "Towards Autonomous Phytopathology: Outcome and Challenges of Citrus Greening Disease Detection Through Close-range Remote Sensing", IEEE International Conference on Robotics and Automation (ICRA), 2016.
Reza Ehsani, Dvoralai Wulfsohn, Jnaneshwar Das, and Ines Zamora Lagos, "Yield Estimation: A Low-Hanging Fruit for Application of Small UAS", ASABE Resource: Engineering & Technology for a Sustainable World, 2016. Reza Ehsani and Jnaneshwar Das, "Yield Estimation in Citrus with sUAVs", Citrus Extension Trade Journals, 2016.

Ehsani, University of Florida / UC Merced: Liu, T. H., Ehsani, R., Toudeshki, A., Zou, X. J., & Wang, H. J. (2018). Detection of citrus fruit and tree trunks in natural environments using a multi-elliptical boundary model. Computers in Industry, 99, 9-16. Wan, P., Toudeshki, A., Tan, H., & Ehsani, R. (2018). A methodology for fresh tomato maturity detection using computer vision. Computers and Electronics in Agriculture, 146, 43-50. Liu, T. H., Ehsani, R., Toudeshki, A., Zou, X. J., & Wang, H. J. (2018). Identifying immature and mature pomelo fruits in trees by elliptical model fitting in the Cr-Cb color space. Precision Agriculture, 1-19.

Tokekar, Virginia Tech: Kevin Yu, Ashish Kumar Budhiraja, Spencer Buebel, and P. Tokekar. Algorithms and Experiments on Routing of Unmanned Aerial Vehicles with Mobile Recharging Stations. Submitted to the Journal of Field Robotics, 2018 (revision under review). Kevin Yu, Ashish Kumar Budhiraja, and Pratap Tokekar. Algorithms for Routing of Unmanned Aerial Vehicles with Mobile Recharging Stations. Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2018. A. K. Budhiraja. View Point Planning for Inspecting Static and Dynamic Scenes with Multi-Robot Teams. Masters Thesis, Virginia Tech, 2017. K. Yu, A. K. Budhiraja, and P. Tokekar. Algorithms for Routing of Unmanned Aerial Vehicles with Mobile Recharging Stations. Submitted to ICRA 2018. K. Yu, A. K. Budhiraja, and P. Tokekar. Algorithms for Routing of Unmanned Aerial Vehicles with Mobile Recharging Stations. Poster at the International Symposium on Multi-Robot and Multi-Agent Systems, December 2017. K. Yu, A. K. Budhiraja, and P. Tokekar. Algorithms for Routing of Unmanned Aerial Vehicles with Mobile Recharging Stations and for Package Delivery. International Symposium on Aerial Robotics, 2017 (unrefereed presentation).

Select presentations: Ehsani, APS meeting, 2016. Ehsani, R. Sensor System for Monitoring Horticultural Crops: Challenges and Opportunities. Phenome 2018, Tucson, Arizona, February 14-17, 2018. Ehsani, R. AgTech Research in High Value Crops. UC Merced Ag Tech Day conference, May 4, 2018. Ehsani, R. The Power of Digital Observation in Agriculture. Agrologic Food Tech Summit, San Francisco, July 18-19, 2018.

What do you plan to do during the next reporting period to accomplish the goals? Nothing Reported

Impacts
What was accomplished under these goals?

Through the duration of this award, we developed new methods for mapping specialty crop yield and health using low-cost, lightweight sensors, unpiloted vehicles, and cloud/IoT infrastructure. Apart from the development of new systems and methods for precision agriculture, the innovations resulting from the award have enabled new research directions, such as soil analysis for monitoring dryland rangeland ecosystems and damage assessment after natural disasters.

Kumar, University of Pennsylvania: We developed an agricultural decision support system (AgDSS) for data analytics. This cloud application facilitates rapid annotation of field data acquired by our systems by multiple users, and is now enabling introspection of predictive models [9]. In year 2, the large volume of labels from this tool, together with deep learning algorithms for fruit detection, eliminated the need for controlled illumination and expanded the range of fruits and farm structures we can work with [3]. Through years 2 and 3 we also developed systems for aerial collection of agricultural samples of soil or leaves [4,5], and a patent was filed on this system [b]. Finally, in year 2 we began developing the framework and algorithms to deploy multiple UAVs that can collaborate with and be controlled by a single human scout [6].

Ehsani, University of California, Merced: In 2018, Dr. Ehsani's group redesigned the fruit counting sensor and developed a detection algorithm that significantly improves on the sensor developed in 2016. The previous sensor used a 320 x 240 pixel CCD sensor with an adjustable optical lens and an infrared filter; however, the low image resolution limited the detection success rate. The new sensor uses a 640 x 480 pixel CCD sensor. This fourfold increase in pixel count reduced the image processing speed by a factor of four.
Therefore, the new processor is an ARM Cortex-M7 with a higher clock frequency at the same cost as the previous processor. The additional microcontroller in the previous design was eliminated; image processing and fruit counting are now both handled by a single processor. A notable change in the new system design is an LCD display that removes the need for an attached computer to visualize real-time data, reducing requirements and easing in-field use of the sensor. Finally, the accuracy of yield estimation is enhanced.

Tokekar, Virginia Tech: In year 1, progress was made in formulating and solving the UAV+scout deployment problem. We developed an offline algorithm that deploys teams of UAVs and scouts to visit and sense around a given set of points while minimizing the total travel time. Our model incorporates three types of points: (i) points that can be sensed by a UAV alone; (ii) points that can be sensed by a scout alone; and (iii) points that can be sensed by both scouts and UAVs. It also allows for multiple scouts and possibly heterogeneous teams of UAVs. Our solution is based on a Generalized Traveling Salesperson Problem (GTSP) algorithm and finds the optimal tour for each scout and UAV. A journal paper describing these results will be submitted in September, and the code implementing our algorithm will be released with the paper. Our current work extends this algorithm to the online case, where points appear and disappear according to spatio-temporal Gaussian Processes. The VT subcontract ended on Aug 31, 2017. https://www.youtube.com/watch?v=Cuc3zs8dRm0
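The three-point-type deployment model above can be illustrated with a toy example. This is not the GTSP solver the report describes; it is a brute-force stand-in under assumptions of our own (made-up point coordinates, and a simple rule that "either" points go to the UAV), useful only to show the structure of the problem.

```python
import math
from itertools import permutations

# Sensing points and who can sense them: UAV only, scout only, or either.
points = {
    (0, 2): "uav", (3, 1): "scout", (1, 4): "either",
    (4, 4): "uav", (2, 0): "either",
}

def assign(points):
    # Illustrative assignment rule: "either" points go to the UAV.
    uav = [p for p, who in points.items() if who in ("uav", "either")]
    scout = [p for p, who in points.items() if who == "scout"]
    return uav, scout

def tour_length(order, start=(0, 0)):
    # Closed tour: leave the truck, visit the points, return.
    path = [start, *order, start]
    return sum(math.dist(a, b) for a, b in zip(path, path[1:]))

def best_tour(pts):
    # Exhaustive search; fine for a handful of points, unlike a real farm.
    return min(permutations(pts), key=tour_length)

uav_pts, scout_pts = assign(points)
print("UAV tour:  ", best_tour(uav_pts))
print("scout tour:", best_tour(scout_pts))
```

The actual work replaces both the assignment rule and the exhaustive search with a GTSP formulation that jointly optimizes assignment and routing, and handles recharging constraints.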

Publications

  • Type: Conference Papers and Presentations Status: Accepted Year Published: 2019 Citation: Xu Liu, Steven W. Chen, Chenhao Liu, Shreyas S. Shivakumar, Jnaneshwar Das, Camillo J. Taylor, James Underwood, and Vijay Kumar, "Monocular Camera Based Fruit Counting and Mapping with Semantic Data Association", in IEEE Robotics and Automation Letters, vol. 4, no. 3, pp. 2296-2303, July 2019.
  • Type: Journal Articles Status: Accepted Year Published: 2019 Citation: M Kalischuk, ML Paret, JH Freeman, D Raj, S Silva Da, S Eubanks, DJ Wiggins, M Lollar, JJ Marois, HC Mellinger, J Das, "An Improved Crop Scouting Technique Incorporating Unmanned Aerial Vehicle-Assisted Multispectral Crop Imaging into Conventional Scouting Practice for Gummy Stem Blight in Watermelon.", Plant Disease, 2019.


Progress 09/01/17 to 08/31/18

Outputs
Target Audience: Agronomists, growers, industry, robotics researchers, machine learning researchers.

Changes/Problems: Nothing Reported

What opportunities for training and professional development has the project provided?

This grant has trained and supported 1 undergraduate student, 6 graduate students, and 2 postdoctoral scholars in the last year. Jnaneshwar Das accepted a new position as Alberto Enrique Behar Research Professor in the School of Earth and Space Exploration at Arizona State University in July 2018.

How have the results been disseminated to communities of interest?

Presentations and demonstrations were provided to approximately 1,200 K-12 students. These demonstrations included events such as NanoDay, the Women in Computer Science (WICS) High School Day for Girls, and GEARS Day hosted by Advancing Women in Engineering. On May 31, 2017, the group hosted and co-planned the 7th Annual Philly Robotics Expo, which involved demonstrations, presentations, and mechanical, electrical, and programming design classes for 350 students (included in the numbers above). On April 22, 2018, Kumar hosted and participated in the Be A Pennovator event in partnership with the Philadelphia Science Festival and the Franklin Institute; 200 students from the region participated in robotics-related activities, including a computer vision workshop. During June and July 2018, Kumar was responsible for 4 different K-12 summer programs. The Summer Mentorship Program in Robotics, developed by Kumar, ran for its fifth straight year with funding from the Army Education Outreach Program's UNITE fund. The four-week class included 18 students from underrepresented populations in Philadelphia Public Schools, who worked on programming, engineering design, circuit design, and presentation of work using Arduino-based, Penn-designed TOBI robots. The Engineering Summer Academy at Penn's robotics program, also created by Kumar, ran in July 2017.
This three-week class included 22 students from around the world, who developed walking robots using Arduinos and rapid fabrication techniques. In collaboration with the Steppingstone Scholars organization, the group ran two four-week summer robotics classes for 30 underrepresented students using Mindstorms, Sparki, and Scratch. Finally, the group ran the robotics track of the Girls in Engineering, Math, and Science (GEMS) program during the final week of June; this camp provided an opportunity for 24 middle school girls to learn about robotics in a collaborative, mentored, constructivist environment. Kumar's group also participated in the development of a course, in collaboration with Penn's Graduate School of Education, for in-service teachers focused on computational thinking. This course, called the "Experiences in Applied Computational Thinking Certificate Program", launched in July 2018 with 10 teachers in the pilot class. Finally, Kumar's group actively supports and partners with FIRST Robotics to host FIRST LEGO League (FLL) and FIRST LEGO League Jr. in Southeast PA, including 223 middle school teams, 50 elementary school teams, 10 competitions, 5 expositions, and the regional championships. Researchers served as competition judges, referees, and volunteers at the 2017-2018 FLL Championship hosted by the University of Pennsylvania. This program impacted approximately 2,000 middle school students in the greater Philadelphia area. The group also provided teacher professional development opportunities each summer to prepare teachers and coaches for the FLL competition. As part of this engagement, we provided funding for 45 FLL teams in underserved public schools in Philadelphia and hosted an AmeriCorps VISTA to support these efforts.
What do you plan to do during the next reporting period to accomplish the goals? If a no-cost extension is approved through February, we will work closely with Das (UPenn postdoc, now at Arizona State University), Ehsani at UC Merced, our industry partners Aerial Applications (PA) and Glades Crop Care (FL), and TechnoServe, a nonprofit international organization. Specifically, we will engage in the following focused activities. 1. Apply semantic mapping and counting to a variety of crops in CA. Here we will use the NVIDIA DGX and NVIDIA DGX-1 recently acquired by Penn for large-scale crop mapping. 2. Develop algorithms for tree and fruit size estimation from a monocular camera and IMU. 3. Develop software tools for context-aware object search (in collaboration with Aerial Applications). This will be used for damage assessment in crops after hurricanes. 4. Complete the AgDSS software pipeline. This includes two goals: a. Open-source the pipeline and facilitate deployment. b. Use datasets from UC Merced to test the pipeline. 5. Finally, as part of our mission to develop low-cost tools for precision farming, we will customize the Intel Aero, a low-cost, low-SWaP agile UAV platform, for farm operation. Specifically, we will carry out experiments with the Aero and use it for close-range farm inspection with Glades Crop Care and the University of Florida.

Impacts
What was accomplished under these goals? University of Pennsylvania Through years 2 and 3, we developed an agricultural decision support system (AgDSS) for data analytics. This cloud application facilitates rapid annotation of field data acquired by our systems by multiple users, and is now enabling introspection of predictive models. In year 2, a large volume of labels from this tool, together with the use of deep learning algorithms for fruit detection, eliminated the need for controlled illumination and expanded the range of fruits and farm structures we can work with. Through years 2 and 3 we also developed systems to enable aerial collection of agricultural samples of soil or leaves, and a patent was filed on this system. Finally, in year 2 we started developing a framework and algorithms to deploy multiple UAVs that can collaborate with and be controlled by a single human scout. University of California, Merced In 2018, Dr. Ehsani's group redesigned the fruit counting sensor and developed a detection algorithm with a significant improvement over the sensor developed earlier in 2016. The previous sensor used a 320 x 240 pixel CCD sensor with an adjustable optical lens and an infrared filter; however, the low image resolution limited the detection success rate. The new sensor uses a CCD sensor with a resolution of 640 x 480 pixels. This fourfold increase in pixel count slows image processing by roughly a factor of four, so the new processor is an ARM Cortex-M7 with a higher clock frequency at the same cost as the previous processor. The additional microcontroller in the previous design was eliminated; both image processing and fruit counting are now handled by a single processor. A notable change in the new system design is an LCD display, which eliminates the need for an attached computer to visualize real-time data. This reduces the hardware requirements and eases in-field use of the sensor. Finally, the accuracy of yield estimation is enhanced.
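The detection step on such an embedded sensor amounts to thresholding fruit-colored pixels and counting the resulting connected "signatures." The sketch below illustrates that idea; it is not the project's actual firmware, and the binary mask input, 4-connectivity, and `min_size` noise filter are our assumptions for illustration.

```python
from collections import deque

def count_fruit_blobs(mask, min_size=2):
    """Count connected regions of fruit-colored pixels in a binary mask.

    mask: 2D list of 0/1, e.g. produced by color-thresholding a frame.
    min_size: ignore blobs smaller than this many pixels (noise).
    """
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                # BFS flood fill to measure this blob
                size, q = 0, deque([(y, x)])
                seen[y][x] = True
                while q:
                    cy, cx = q.popleft()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                if size >= min_size:
                    count += 1
    return count

frame = [
    [0, 1, 1, 0, 0, 0],
    [0, 1, 1, 0, 0, 1],   # lone pixel at right is noise
    [0, 0, 0, 0, 0, 0],
    [0, 0, 1, 1, 1, 0],
]
print(count_fruit_blobs(frame))  # 2: two blobs pass the size filter
```

Doubling image resolution in each dimension quadruples the pixels this scan must visit, which is consistent with the fourfold processing slowdown noted above.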

Publications

  • Type: Conference Papers and Presentations Status: Accepted Year Published: 2018 Citation: Xu Liu, Steven W. Chen, Shreyas Aditya, Nivedha Sivakumar, Sandeep Dcunha, Chao Qu, Camillo J. Taylor, Jnaneshwar Das, and Vijay Kumar, "Robust Fruit Counting: Combining Deep Learning, Tracking, and Structure from Motion", accepted in International Conference on Intelligent Robots and Systems (IROS) 2018.
  • Type: Journal Articles Status: Submitted Year Published: 2018 Citation: Xu Liu, Steven W. Chen, Chenhao Liu, Shreyas S. Shivakumar, Jnaneshwar Das, Camillo J. Taylor, James Underwood, and Vijay Kumar, Monocular Camera Based Fruit Counting and Mapping with Semantic Data Association, submitted to IEEE Robotics and Automation Letters (RA-L). September 2018.
  • Type: Journal Articles Status: Published Year Published: 2018 Citation: Liu, T. H., Ehsani, R., Toudeshki, A., Zou, X. J., & Wang, H. J. (2018). Detection of citrus fruit and tree trunks in natural environments using a multi-elliptical boundary model. Computers in Industry, 99, 9-16.
  • Type: Journal Articles Status: Published Year Published: 2018 Citation: Wan, P., Toudeshki, A., Tan, H., & Ehsani, R. (2018). A methodology for fresh tomato maturity detection using computer vision. Computers and Electronics in Agriculture, 146, 43-50.
  • Type: Journal Articles Status: Published Year Published: 2018 Citation: Liu, T. H., Ehsani, R., Toudeshki, A., Zou, X. J., & Wang, H. J. (2018). Identifying immature and mature pomelo fruits in trees by elliptical model fitting in the CrCb color space. Precision Agriculture, 1-19.
  • Type: Journal Articles Status: Published Year Published: 2018 Citation: Kevin Yu, Ashish Kumar Budhiraja and Pratap Tokekar. Algorithms for Routing of Unmanned Aerial Vehicles with Mobile Recharging Stations. Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2018.
  • Type: Journal Articles Status: Under Review Year Published: 2018 Citation: Kevin Yu, Ashish Kumar Budhiraja, Spencer Buebel, and P. Tokekar. Algorithms and Experiments on Routing of Unmanned Aerial Vehicles with Mobile Recharging Stations. Submitted to the Journal of Field Robotics, 2018.
  • Type: Conference Papers and Presentations Status: Other Year Published: 2018 Citation: Ehsani, R. Sensor System for Monitoring Horticultural Crops: Challenges and Opportunities. Phenome 2018, Tucson, Arizona, February 14-17, 2018.
  • Type: Conference Papers and Presentations Status: Other Year Published: 2018 Citation: Ehsani, R. AgTech Research in High Value Crops. UC Merced Ag Tech Day conference. May 4, 2018.
  • Type: Conference Papers and Presentations Status: Other Year Published: 2018 Citation: Ehsani, R. The Power of Digital Observation in Agriculture. Agrologic Food Tech Summit. San Francisco. July 18-19, 2018.


Progress 09/01/16 to 08/31/17

Outputs
Target Audience: Growers: Washington Tree Fruit Research Commission; National Mango Board (field experiments in Puerto Rico). Agriculture researchers: University of Florida, Quincy and Lake Alfred extensions; Penn State, Fruit Research and Extension Center. Industry: Aerial Applications (PA); 3D SensIR (CA); Nouka Technologies (inactive since March 2017). Changes/Problems: In the development of the mobile app for farm inspection, gathering customer requirements will be crucial for adoption within the grower and agronomist community. We are ensuring that a diverse group of growers and agronomists is approached to provide their requirements. For field trials of the phytobiopsy and sensor probe system, the challenges will be in acquiring sufficient prior data, such as NDVI maps, to guide the experiments, and in the experiment design with a scout or human expert in the loop. Flying UAVs additionally poses operational challenges; however, we are working to mitigate this through extensive simulations prior to field trials, and use of the outdoor netted facility at Penn. What opportunities for training and professional development has the project provided? Matthew Schmittle, a University of Delaware rising senior, did his summer internship at Penn and helped develop a simulation framework for swarms of UAVs in an agricultural setting. Farirai Baya, an undergraduate junior at Penn, worked through Fall 2016 and Spring 2017 on active spectroscopy for improved plant health monitoring. Her research with the agriculture group has produced valuable spectroscopy data for tomato plants. Spencer Fox, a rising senior at Penn, did his work-study through the summer, improving the agricultural sample collection systems for aerial leaf-sample collection and environmental probe deployment. Daniel Orol and Lukas Vacek presented their papers at the International Conference on Unmanned Aircraft Systems (ICUAS) 2017 in Miami, Florida in June 2017. 
Michael Hughes, a master of environmental studies student, is working on his Capstone thesis project with the Penn agriculture group on innovative technologies for environmental monitoring and sustainability. How have the results been disseminated to communities of interest? Jnaneshwar Das and Mathews Paret, invited talk at the Florida State Horticultural Society's In-service Training on Application of New Technologies for Improved Management Strategies for Horticultural and Agronomic Crops, June 2017. Jnaneshwar Das, invited talk at AgDroneTech17, a full-day workshop held at the International Conference on Unmanned Aircraft Systems (ICUAS), June 2017. K. Yu, A. K. Budhiraja, and P. Tokekar, Algorithms for Routing of Unmanned Aerial Vehicles with Mobile Recharging Stations and for Package Delivery, 2017; presentation at the International Symposium on Aerial Robotics, 2017. What do you plan to do during the next reporting period to accomplish the goals? We are developing a mobile app to enable growers and agronomists to annotate objects of interest (e.g., fruits, blossoms, disease symptoms), train deep networks on the cloud, and use their mobile devices to carry out counting of fruits and blossoms, or disease detection. In the case of inspection, the data collection can be carried out by scouts using their handheld devices. The mobile app will also enable affordable and scalable deployment on farm machinery and ground or aerial robots. The OpenUAV testbed was developed at Penn through year 2 to define the hardware and for end-to-end UAV swarm simulations in unstructured settings such as orchards. In year 3, we will carry out extensive mixed-initiative simulations, followed by two field experiments at the University of California, Merced for agricultural aerial leaf-sample collection, and sensor probe deployment and recovery. We will work with the University of California, Merced and 3D SensIR to explore low-SWaP dual-wavelength LiDAR technologies for improved crop stress monitoring. 

Impacts
What was accomplished under these goals? In year one (2015-2016), working with UFL, Penn State, industry partners, and growers, Penn prototyped and tested imaging systems for fruit counting and crop health monitoring. Field trials were carried out to collect multi-modal data from citrus, apple, tomato, and strawberry farms for evaluating fruit counting and disease (HLB) detection algorithms. We also started developing two small unmanned aircraft systems (sUAS) for physical sample collection of leaves for plant pathology, since optical methods alone are susceptible to misdiagnosis and manual sample collection is not scalable. These systems, being developed in collaboration with Mathews Paret (plant pathologist, University of Florida) and Reza Ehsani, will augment scouts by enabling autonomous collection of leaf samples for lab analysis, as well as deployment and recovery of an environmental probe for pest sample collection. In year two, progress has been made on three thrusts: yield estimation using deep learning, cloud resources for plant model training, and systems for agricultural sample collection. On the yield estimation front, we demonstrated the use of deep learning algorithms for robust counting of apples and oranges in the field, and published the results in the IEEE Robotics and Automation Letters (RA-L) 2017. After finishing extensive experiments in collaboration with the Washington State Tree Fruit Research Commission in year 1 to quantify the accuracy of our methods compared with manual counts, this year we evaluated our fruit counting algorithms on mangoes (data collected in Puerto Rico in collaboration with the National Mango Board), tomatoes (data collected at Lipman farms in Florida), and strawberries (collected at Driscoll's farms in Florida). The web-based agricultural decision support system (AgDSS) developed in year 1 is now available live at https://annotate.label.ag. 
This tool, developed for annotation and learning of crop features (e.g., fruits), is also being integrated with a cloud-enabled UAV simulation framework (http://openuav.us). The integration effort will enable simulation of UAV swarms in farm settings, with human scouts in the loop interacting with the simulation in a web browser. Finally, we released the annotated fruit dataset used in our RA-L 2017 paper. The first phase of field trials and testing of the phytobiopsy and environmental sample collection systems was completed in year 2, and a provisional patent was filed covering these systems for autonomous collection of agricultural samples. University of Florida In 2017, Reza Ehsani upgraded the sensor that was developed in 2016; the new sensor and an embedded system were developed for detecting fruits distributed on the tree's surface and counting them as part of this project. The sensor is a 320 x 240 pixel CCD sensor with an adjustable optical lens and an infrared filter. The sensor's captured images are processed on an ARM Cortex-M4 processor to calculate the coordinates (x-y) and sizes of fruits with a defined color in real time. A microcontroller is programmed to count the number of signatures detected and processed in the processor and to estimate the number of fruits in the processed image. For 2018, the plan is to test the new camera along with a new active thermography technique to enhance the accuracy of yield estimation. Virginia Tech In year 1, the Virginia Tech team developed an offline algorithm based on the Generalized Traveling Salesperson Problem (GTSP) for deploying a team of robots to sense a given set of points of interest. In year 2, we extended this formulation to additionally account for the limited battery lifetime of Unmanned Aerial Vehicles (UAVs), which can be recharged using mobile recharging stations mounted on Unmanned Ground Vehicles (UGVs). The input to the algorithm is a set of points that must be monitored by the UAVs. 
The algorithm finds the optimal path for the UAVs that visits all the points in the least amount of time. We explicitly consider the case where the UAVs' battery lifetime is too limited to visit all the points on a single charge. We study two optimization problems: (1) where to place the minimum number of stationary recharging stations, and (2) how to find paths for UGVs acting as mobile recharging stations, so as to enable the UAVs to monitor all points of interest in the least amount of time. The algorithm accounts for different speeds of the robots, landing and take-off times, as well as recharging times.
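The battery-constrained routing problem described above can be sketched with a much simpler heuristic than the project's GTSP formulation: a nearest-neighbor tour that detours to a recharging station whenever the next leg, plus the hop needed to reach the station afterwards, would exceed the remaining range. This is a minimal illustration under our own assumptions (a single UAV, one stationary station, Euclidean travel, no landing/take-off or recharge times), not the algorithm from the papers cited below.

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def route_with_recharge(points, station, battery_range):
    """Greedy nearest-neighbor tour over monitoring points.

    Whenever the next leg (plus the return hop to the station) would
    exceed the remaining battery range, detour to the station first.
    """
    route = [station]
    pos, remaining = station, battery_range
    todo = list(points)
    while todo:
        nxt = min(todo, key=lambda p: dist(pos, p))
        if dist(pos, nxt) + dist(nxt, station) > remaining:
            if pos == station and remaining == battery_range:
                raise ValueError("point not reachable on a single charge")
            route.append(station)       # recharge stop
            pos, remaining = station, battery_range
            continue
        remaining -= dist(pos, nxt)
        route.append(nxt)
        todo.remove(nxt)
        pos = nxt
    route.append(station)               # return home
    return route

stops = route_with_recharge([(6, 0), (-6, 0)], station=(0, 0), battery_range=13)
print(stops)  # [(0, 0), (6, 0), (0, 0), (-6, 0), (0, 0)] -- one recharge stop
```

Making the station itself mobile (a UGV with its own path and speed), and minimizing total mission time rather than following a greedy order, is what turns this toy into the optimization problems studied in the actual work.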

Publications

  • Type: Journal Articles Status: Published Year Published: 2017 Citation: Steven Chen, Shreyas Skandan, Sandeep Dcunha, Jnaneshwar Das, Chao Qu, Camillo J. Taylor, Vijay Kumar, "Counting Apples and Oranges With Deep Learning: A Data-Driven Approach," in IEEE Robotics and Automation Letters, vol. 2, no. 2, pp. 781-788, April 2017.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2017 Citation: Lukas Vacek, Edward Atter, Pedro Rizo, Brian Nam, Ryan Kortvelesy, Delaney Kaufman, Jnaneshwar Das, Vijay Kumar, "sUAS for deployment and recovery of an environmental sensor probe," 2017 International Conference on Unmanned Aircraft Systems (ICUAS), Miami, FL, USA, 2017, pp. 1022-1029.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2017 Citation: Daniel Orol, Jnaneshwar Das, Lukas Vacek, Isabella Orr, Mathews Paret, Camillo. J. Taylor, Vijay Kumar, "An aerial phytobiopsy system: Design, evaluation, and lessons learned," 2017 International Conference on Unmanned Aircraft Systems (ICUAS), Miami, FL, USA, 2017, pp. 188-195.
  • Type: Journal Articles Status: Under Review Year Published: 2017 Citation: Wan, P., Toudeshki, A., Tan, H., Ehsani, R. (2017). A methodology for fresh tomato maturity detection using computer vision. Computers and Electronics in Agriculture
  • Type: Theses/Dissertations Status: Published Year Published: 2017 Citation: A. K. Budhiraja. View Point Planning for Inspecting Static and Dynamic Scenes with Multi-Robot Teams. Masters Thesis, Virginia Tech, 2017.
  • Type: Other Status: Published Year Published: 2017 Citation: K. Yu, A. K. Budhiraja, and P. Tokekar, Algorithms for Routing of Unmanned Aerial Vehicles with Mobile Recharging Stations. Poster at the International Symposium on Multi-Robot and Multi-Agent Systems, December 2017.
  • Type: Conference Papers and Presentations Status: Submitted Year Published: 2018 Citation: K. Yu, A. K. Budhiraja, and P. Tokekar, Algorithms for Routing of Unmanned Aerial Vehicles with Mobile Recharging Stations. Submitted to ICRA 2018.


Progress 09/01/15 to 08/31/16

Outputs
Target Audience: Growers: Washington Tree Fruit Research Commission (apple); Driscoll's (blueberry, strawberry); Lipman Produce (tomato). Agriculture researchers: University of Florida, Quincy and Lake Alfred extensions; Penn State University, Fruit Research and Extension Center (FREC), Biglerville, PA. Industry: AS&E Inc. (backscatter X-ray imaging for improved fruit counting); Glades Crop Care; Nouka Technologies. Changes/Problems: A key concern has been fast-changing FAA regulations, and lack of clarity on autonomy for swarms (it is infeasible to have one pilot per robot for field deployments). This impacts the scope of the project and restricts flight planning. Our mitigation plan is the use of the outdoor 100'x50'x50' netted test facility at UPenn, which allows emulating field conditions (with GPS) for realistic tests on a sample set of plants. Ground-truth position information is available from an outdoor motion capture system attached to the structure. What opportunities for training and professional development has the project provided? High-school and undergraduate researchers worked on AgDSS development, design and fabrication of the phytobiopsy and pest-trap systems, as well as development of a smart sensor suite with active illumination. DaVonne Henry, an undergraduate researcher under the NSF REU program, carried out extensive spectroscopy studies, demonstrating the potential efficacy of hyper-spectral imaging for improved fruit counting and disease detection. Delaney Kaufman, an undergraduate rising sophomore, and Steven Chen, a masters student, had the opportunity to attend the 5th anniversary of the National Robotics Initiative held in the Capitol complex in Washington, DC on June 9, 2016. At the event, they interacted with leading robotics researchers and members of Congress, and demonstrated their research on this project. 
Xin Wang, a masters student in Environmental Studies and Environmental Advocacy and Education at UPenn, did her Capstone video project and report, titled "Potentials and limitations of unmanned aerial vehicle (UAV) applications in environmental research," in part based on interviews during a field trip with the project team to Lipman Produce and Driscoll's in Florida in March 2016. Edidiong Okon, an undergraduate junior funded by the NSF Louis Stokes Alliances for Minority Participation (LSAMP) Program, interned on this project during Summer 2016, investigating design-based fruit counting methods for comparison with data-driven deep learning approaches. How have the results been disseminated to communities of interest? Yield estimation in citrus with SUAVs (Citrus Industry article, published 04/01/2016): http://www.crec.ifas.ufl.edu/extension/trade_journals/2016/2016_April_suavs.pdf. http://label.ag is open source; full sources are available at https://github.com/dcunhas/agdss/. Summary video of research (made March 2016): https://www.youtube.com/watch?v=X6W8VVQwCIg. Press: an NSF article on the National Robotics Initiative (NRI) 5th-anniversary event, "Living in the Robotic Age," features a photo and a few lines on this project; the article was also linked from a tweet by the official White House OSTP Twitter account. Conference papers are in preparation for IROS and CASE 2017 submissions. (Virginia Tech) The paper describing the offline solution to the scouts+UAV deployment problem is planned for submission to IEEE Transactions on Robotics in September 2016. The ROS package implementing our algorithm will be released on GitHub. What do you plan to do during the next reporting period to accomplish the goals? Field trials of autonomous aerial phytobiopsy, and autonomous pest trap deployment and recovery. Current tests are being conducted in a space with a motion capture system. For the next cycle we will scale up to field conditions with natural lighting and visual servoing. 
We are designing an experiment in collaboration with Prof. Ehsani at UFL (Co-PI) for Spring 2017 wherein we will compare aerial pest-trap based pest-pressure monitoring with current best practices. AgDSS development: we will continue extending http://label.ag (AgDSS) for visualization of model outputs (trained on labeled data). We are also extending the user interface for use on mobile devices (e.g., iPad, smartphones with multi-touch screens). Finally, we are working on an API that will allow fast search and use of vector labels and images from AgDSS for machine learning. (Virginia Tech) In year 2, there are two main research thrusts on the algorithmic side. (1) We will extend our offline UAV+scout deployment algorithm to the online case, where the points that must be sensed appear/disappear over time. We have already started working on the online model using Gaussian Processes as the underlying framework. (2) We will incorporate additional real-world constraints such as limited energy, unequal speeds, and sensing constraints. Our plan is to extend the GTSP algorithm using the Constrained Markov Decision Processes framework, which allows for multi-objective optimization. (University of Florida) In year 2, we plan to (i) repeat the yield estimation using a low-cost camera under different light conditions and using controllable light sources; (ii) design and build an over-the-row platform equipped with a light source and camera sensor array for field trials in tomato fields; (iii) study the feasibility of combining a low-cost X-ray system with the camera-based yield estimation sensor.

Impacts
What was accomplished under these goals? Crop data collection A joint field experiment in October 2015 with Dr. Jim Schupp at Penn State University's Fruit Research and Extension Center (FREC), Biglerville, and AS&E Inc. demonstrated for the first time the use of backscatter X-ray imaging for detection of apples occluded by canopy. The experiment was carried out at FREC's apple orchards. Two trips were made to Florida in collaboration with Prof. Ehsani to Driscoll's and Lipman Produce in December 2015 and March 2016 for collection of visual and backscatter X-ray data. The March 2016 backscatter X-ray scans of strawberries and tomatoes demonstrated a possible application for counting fruits that are significantly occluded by canopy. Two data collection trips were made to Washington state with the Washington Tree Fruit Research Commission in August and October 2015. Nighttime data with active illumination on both green and red apples were collected, as well as human-counted ground truth. Our counting algorithm was applied to those data and achieved 95% accuracy compared to ground truth. The University of Florida carried out a series of data collection experiments with a spectrometer and controlled illumination from a halogen lamp (broad spectrum) in order to study ripeness of tomatoes and to compare leaves and fruits. Based on these results, the median NDVI value for tomato leaves was determined to be 0.15. NDVI increased almost linearly to 0.17 for raw green, 0.19 for mature green, and 0.22 for red tomato samples. Thus, calculating NDVI simplifies the detection and classification process for the tomato samples. 
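The NDVI-based classification described above can be made concrete with the standard NDVI formula, NDVI = (NIR - Red) / (NIR + Red), and the median values reported in the text. The nearest-median assignment rule and the sample reflectance values below are our illustrative assumptions; the report only states the medians and that NDVI simplifies classification.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

# Median NDVI values reported for tomato samples.
MEDIANS = {"leaf": 0.15, "raw green": 0.17, "mature green": 0.19, "red": 0.22}

def classify(nir, red):
    """Assign a sample to the class with the nearest median NDVI.

    Nearest-median assignment is an illustrative choice, not the
    project's actual classifier.
    """
    v = ndvi(nir, red)
    return min(MEDIANS, key=lambda k: abs(MEDIANS[k] - v))

print(round(ndvi(0.61, 0.39), 2))   # 0.22
print(classify(0.61, 0.39))         # red
```

Because the class medians are only 0.02 apart, a real deployment would need reflectance measurements repeatable to well under that margin, which is why the controlled halogen illumination mattered in these experiments.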
Agricultural Decision Support System (AgDSS) We prototyped and evaluated http://label.ag, an agricultural decision support system for labeling and annotation of fruits and visual symptoms of biotic stresses. The AgDSS has been used to acquire labels for orange and apple data acquired using our sensor suite. Compute resources for deep learning on agricultural big data A 16-core CPU based compute node with two of the latest graphics cards (Nvidia TITAN X Pascal) has been set up for large-scale training on datasets labeled on http://label.ag. Vehicles An aerial robot team is being developed with heterogeneous capabilities; autonomy is under development, and manual tests are being carried out. The capabilities include aerial 'phytobiopsy' or leaf sample collection for ex-situ analysis (e.g., genomics), enabling precision phytopathology, and a smart pest trap for monitoring of pest density, designed for deployment and recovery using autonomous aerial robots. Swarm and Scout Planning and Coordination (Virginia Tech subcontract) In year 1, progress was made in formulating and solving the UAV+scout deployment problem. We have developed an offline algorithm that deploys teams of UAVs and scouts to visit and sense around a given set of points while minimizing the total travel time. Our model incorporates scenarios where there are three types of points: (i) points that can be sensed by UAVs alone; (ii) points that can be sensed by scouts alone; and (iii) points that can be sensed by both scouts and UAVs. It also allows for multiple scouts and possibly heterogeneous teams of UAVs. Our solution is based on a Generalized Traveling Salesperson Problem (GTSP) algorithm and finds the optimal tour for each scout and UAV. A journal paper describing these results will be submitted in September. The code implementing our algorithm will be released along with the paper. Our current work is on extending this algorithm to the online case where the points appear/disappear according to spatio-temporal Gaussian Processes.
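The three point types in the deployment model can be illustrated with a toy version of the problem: each point carries a list of vehicles allowed to sense it, and we search over capability-respecting assignments, scoring each by the slowest vehicle's tour time. The brute-force search and nearest-neighbor tours below are our simplifications; the actual work uses a GTSP formulation with optimal tours.

```python
import itertools
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tour_time(start, pts, speed):
    """Nearest-neighbor tour length from start through pts, divided by speed."""
    t, pos, todo = 0.0, start, list(pts)
    while todo:
        nxt = min(todo, key=lambda p: dist(pos, p))
        t += dist(pos, nxt) / speed
        todo.remove(nxt)
        pos = nxt
    return t

def deploy(points, vehicles):
    """Brute-force the capability-constrained assignment, minimizing makespan.

    points: list of (xy, allowed_vehicle_indices)
    vehicles: list of (start_xy, speed)
    Toy stand-in for the GTSP solver described in the text.
    """
    best = (float("inf"), None)
    for choice in itertools.product(*[allowed for _, allowed in points]):
        groups = [[] for _ in vehicles]
        for (xy, _), v in zip(points, choice):
            groups[v].append(xy)
        makespan = max(tour_time(s, g, spd)
                       for (s, spd), g in zip(vehicles, groups))
        if makespan < best[0]:
            best = (makespan, choice)
    return best

vehicles = [((0, 0), 2.0),    # UAV: starts at origin, speed 2
            ((0, 0), 1.0)]    # scout: starts at origin, speed 1
points = [((4, 0), (0,)),     # UAV-only point
          ((0, 3), (1,)),     # scout-only point
          ((2, 0), (0, 1))]   # point either vehicle can sense
makespan, choice = deploy(points, vehicles)
print(choice, round(makespan, 2))  # (0, 1, 0) 3.0
```

Here the shared point goes to the faster UAV, since routing it through the slow scout would lengthen the worst tour; a GTSP solver makes the same kind of trade-off but over provably optimal tours.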

Publications

  • Type: Conference Papers and Presentations Status: Accepted Year Published: 2016 Citation: S. Sarkar, J. Das, R. Ehsani, V. Kumar, Towards autonomous phytopathology: Outcomes and challenges of citrus greening disease detection through close-range remote sensing, presented at the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 2016, pp. 5143-5148.
  • Type: Other Status: Published Year Published: 2016 Citation: R. Ehsani and J. Das, Yield estimation in citrus with SUAVs, Citrus Extension Trade Journals, pp. 16-18, 2016.
  • Type: Other Status: Published Year Published: 2016 Citation: Reza Ehsani, Arash Toudeshki, Peng Wan, New sensor technology for yield estimation and disease detection, in Tomato Proceedings, September 2016, pp. 15-16.
  • Type: Conference Papers and Presentations Status: Submitted Year Published: 2017 Citation: D. Orol, J. Das, L. Vacek, I. Orr, M. Paret, C. J. Taylor, V. Kumar, "An Aerial Phytobiopsy System: Design, Evaluation, and Lessons Learned," in review, 2017 IEEE International Conference on Robotics and Automation (ICRA).
  • Type: Other Status: Published Year Published: 2016 Citation: Reza Ehsani, Dvoralai Wulfsohn, Jnaneshwar Das, Ines Zamora Lagos, "Yield Estimation: A Low-Hanging Fruit for Application of Small UAS," in ASABE Resource: Engineering & Technology for a Sustainable World, July 2016, pp. 16-18.
  • Type: Journal Articles Status: Submitted Year Published: 2017 Citation: "Counting Apples and Oranges with Deep Learning: A Data-Driven Approach", Steven Chen, Shreyas Skandan, Sandeep Dcunha, Jnaneshwar Das, Chao Qu, Camillo J. Taylor, Vijay Kumar, in review (revise and resubmit) 2017 Robotics and Automation Letters.