Penn Agriculture Project

Revision as of 11:42, 3 March 2018 by Aguser (talk | contribs) (Publications)


We are developing smart robotic systems to improve efficiency and yield of farm operations. Our goal is to provide specialty crop growers with a data-driven deployment strategy that makes synergistic use of a networked robotic system working interactively with a human scout.

First, we have developed a lightweight, self-contained multi-spectral 3-D imaging system that has been deployed on unmanned aerial vehicles (UAVs) and ground vehicles, and carried by a human scout. The acquired data have been used to train statistical models that enable persistent monitoring of crop yield, morphology, and health.

Second, we are developing the framework and algorithms to deploy multiple UAVs that can collaborate with and be controlled by a single human scout.

Finally, an agricultural decision support system (AgDSS) is being developed to facilitate annotation of field data acquired by our systems, and introspection of learned predictive models. Our technology stack will enable a human scout and a swarm of co-robots to operate in concert over extended periods while accommodating constraints on sensing, navigation speeds, and power consumption.

Research thrusts

Data-driven yield estimation

[Figure: Deep-learning fruit counting pipeline.]

[Figures: Deep segmentation on mangoes; deep segmentation on apples.]
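To illustrate the counting step that follows deep segmentation, here is a minimal sketch: given a binary fruit mask (the kind a segmentation network might output), fruits can be estimated by counting connected foreground regions. This is a simplified stand-in for the project's full pipeline, which also uses tracking and structure from motion; the function name and mask are illustrative only.

```python
def count_fruits(mask, min_pixels=1):
    """Count 4-connected foreground regions in a binary mask.

    mask: list of rows of 0/1 values (e.g., a thresholded
    segmentation output). Regions smaller than min_pixels are
    treated as noise and ignored.
    """
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                # Flood-fill the region starting at (r, c).
                stack = [(r, c)]
                seen[r][c] = True
                size = 0
                while stack:
                    y, x = stack.pop()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if size >= min_pixels:
                    count += 1
    return count


# Toy 3x5 mask with three separate "fruit" blobs.
mask = [
    [1, 1, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [0, 0, 1, 0, 0],
]
print(count_fruits(mask))  # 3
```

In practice, per-image counts like this undercount occluded fruit and double-count across frames, which is why the published pipeline adds tracking and 3-D reconstruction on top of segmentation.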

Crop health monitoring

Aerial phytobiopsy

Aerial smart pest trap deployment and recovery

[Video: Autonomous insect-trap recovery with the Penn Aerial Robotics NSF challenge UAV, tested by team leader Lukas Vacek.]

[Video: Autonomous insect-trap deployment at the NSF UAV challenge; the clip shows the Penn Aerial Robotics UAV during one of the scored trials.]

Agricultural decision support system (AgDSS)

Decision support systems are key to data-driven discovery. We have developed an open-source, web-based agricultural decision support system for labeling and annotating fruits and visual symptoms of biotic stresses. The AgDSS can be explored online (registration required).

The source code is available on GitHub.
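The kind of record such an annotation tool produces can be sketched as a simple JSON document: one entry per image, with labeled bounding boxes for fruits or disease symptoms. The field names and file names below are hypothetical illustrations, not the actual AgDSS schema.

```python
import json

# Hypothetical annotation record; keys and values are illustrative,
# not the real AgDSS data model.
annotation = {
    "image": "orchard_row3_frame0042.jpg",
    "labels": [
        # bbox is [x_min, y_min, x_max, y_max] in pixels.
        {"class": "apple", "bbox": [120, 84, 152, 118]},
        {"class": "leaf_spot", "bbox": [300, 210, 340, 260]},
    ],
    "annotator": "scout01",
}

# Round-trip through JSON, as a web client and server would exchange it.
serialized = json.dumps(annotation)
restored = json.loads(serialized)
print(restored["labels"][0]["class"])  # apple
```

Records like this are easy to aggregate into training sets for the segmentation and counting models described above, and to audit when introspecting a learned model's mistakes.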

UAV autonomy

See the OpenUAV project for UAV hardware, software, and end-to-end simulation resources.


Publications

  1. Xu Liu, Steven W. Chen, Shreyas Aditya, Nivedha Sivakumar, Sandeep Dcunha, Chao Qu, Camillo J. Taylor, Jnaneshwar Das, and Vijay Kumar, "Robust Fruit Counting: Combining Deep Learning, Tracking, and Structure from Motion", submitted to International Conference on Intelligent Robots and Systems (IROS) 2018 (PDF).
  2. Daniel Orol, Jnaneshwar Das, Lukas Vacek, Isabella Orr, Mathews Paret, Camillo J. Taylor, Vijay Kumar, "An aerial phytobiopsy system: Design, evaluation, and lessons learned," 2017 International Conference on Unmanned Aircraft Systems (ICUAS), Miami, FL, USA, 2017, pp. 188-195. (PDF)
  3. Lukas Vacek, Edward Atter, Pedro Rizo, Brian Nam, Ryan Kortvelesy, Delaney Kaufman, Jnaneshwar Das, Vijay Kumar, "sUAS for deployment and recovery of an environmental sensor probe," 2017 International Conference on Unmanned Aircraft Systems (ICUAS), Miami, FL, USA, 2017, pp. 1022-1029. (PDF).
  4. Steven W. Chen, Shreyas Skandan, Sandeep Dcunha, Jnaneshwar Das, Chao Qu, Camillo J. Taylor, Vijay Kumar, "Counting Apples and Oranges With Deep Learning: A Data-Driven Approach," in IEEE Robotics and Automation Letters, vol. 2, no. 2, pp. 781-788, April 2017. (PDF)
  5. Reza Ehsani, Dvoralai Wulfsohn, Jnaneshwar Das, Ines Zamora Lagos, "Yield Estimation: A Low-Hanging Fruit for Application of Small UAS," in ASABE Resource: Engineering & Technology for a Sustainable World, July 2016, pp. 16-18.
  6. Reza Ehsani and Jnaneshwar Das, “Yield estimation in citrus with SUAVs,” Citrus Extension Trade Journals, pp. 16-18, 2016.
  7. Suproteem K. Sarkar, Jnaneshwar Das, Reza Ehsani, and Vijay Kumar, "Towards autonomous phytopathology: Outcomes and challenges of citrus greening disease detection through close-range remote sensing," 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, 2016, pp. 5143-5148. (PDF)
  8. Jnaneshwar Das, Gareth Cross, Chao Qu, Anurag Makineni, Pratap Tokekar, Yash Mulgaonkar, Vijay Kumar. "Devices, Systems, and Methods for Automated Monitoring enabling Precision Agriculture," In IEEE International Conference on Automation Science and Engineering (CASE), 2015. (PDF)
  • Patents
    • Systems, Devices, and Methods for Robotic Remote Sensing for Precision Agriculture, V. Kumar, G. Cross, C. Qu, J. Das, A. Makineni, Y. Mulgaonkar (U.S. patent, 2017) US20170372137.
    • Systems, Devices, and Methods for Agricultural Sample Collection, D. Orol, L. Vacek, D. Kaufman, J. Das, and V. Kumar (provisional patent filed, July 2017).

Software and datasets


(Left to right) Jnaneshwar Das, Steven Chen, and Delaney Kaufman at the 2016 Congressional Robotics Caucus at the U.S. Capitol Complex, celebrating the fifth anniversary of the National Robotics Initiative (NRI). The Penn Agriculture Project was one of 10 projects selected from hundreds of NRI-funded projects to showcase the initiative's impact.

Xin Wang, a Master's student in Environmental Studies (2016) at the University of Pennsylvania, did her Capstone Project in part with the Penn Agriculture group. As part of the project, she accompanied the group to Florida in March 2016 and interviewed growers.


  • Undergraduate students




Resources are for academic and educational purposes.

Website maintained by Jnaneshwar Das, University of Pennsylvania, email: djnan [at]