A download tool to explore, filter, and download field and lab data.
- Windows: Download Zip
- All systems: visit our GitHub page for installation instructions
- Video tutorial: YouTube
- A short written manual explaining the query parameters can be found here
A generative adversarial network (GAN) that translates plant images taken in the lab so that they look as if they had been taken in the field. Publication pending; a toy sketch of a typical translation objective follows the preview images below. Preview:
Original image taken in the lab
Translated image (i.e., this image is generated)
Real image of a plant taken in the field
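Since the publication is pending, the exact architecture is not public. The following is only a minimal sketch of a typical unpaired image-to-image translation objective, in the style of CycleGAN, written in PyTorch; every module and size below is an illustrative placeholder, not the actual TerraByte model.

```python
# Minimal sketch of an unpaired lab-to-field translation objective in the
# style of CycleGAN. The actual TerraByte architecture is unpublished;
# every module and size below is an illustrative placeholder.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1),
        nn.InstanceNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class TinyGenerator(nn.Module):
    """Toy encoder-decoder; a real model would use ResNet blocks or a U-Net."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            conv_block(3, 32), conv_block(32, 32),
            nn.Conv2d(32, 3, 3, padding=1), nn.Tanh(),
        )
    def forward(self, x):
        return self.net(x)

class TinyDiscriminator(nn.Module):
    """Toy PatchGAN-style critic that scores local realism."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(conv_block(3, 32), nn.Conv2d(32, 1, 3, padding=1))
    def forward(self, x):
        return self.net(x)

G = TinyGenerator()            # lab -> field
F = TinyGenerator()            # field -> lab
D_field = TinyDiscriminator()

lab = torch.rand(4, 3, 64, 64) * 2 - 1    # stand-in for a batch of lab images
mse, l1 = nn.MSELoss(), nn.L1Loss()

fake_field = G(lab)
pred = D_field(fake_field)
# Adversarial term: generated images should fool the field discriminator.
adv = mse(pred, torch.ones_like(pred))
# Cycle-consistency term: translating back should recover the lab image.
cyc = l1(F(fake_field), lab)
(adv + 10.0 * cyc).backward()
```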
Gantry-Style Imaging Robot
Multispectral 3D Scanner
Hyperspectral Camera
Imaging Drones
Semi-autonomous field-data collector
Transferring models to the real world
Convolutional neural networks generally perform better on data that is similar to the data they were trained on. To ensure that the models we train at TerraByte perform under field conditions, we have different strategies at our disposal. A very common one is transfer learning: a trained model is retrained on data from the new environment while carrying over the already learned filters for low- to high-level features. Another approach is to modify our data so that it appears similar to field data even though the images were taken in the greenhouse. For transfer learning we collect a large amount of (initially unlabeled) field data. For the second approach we use image segmentation and background removal to digitally "plant" our lab data in field-like environments. A minimal sketch of the transfer-learning step follows below.
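As a rough illustration of the transfer-learning step, here is a minimal PyTorch sketch that reuses a pretrained backbone and retrains only the classification head. The backbone choice, class count, and data are placeholder assumptions, not the project's actual setup.

```python
# Minimal transfer-learning sketch: reuse filters learned on one domain and
# retrain only the classification head on (labeled) field images.
# Backbone, class count, and data are illustrative placeholders.
import torch
import torch.nn as nn
from torchvision import models

# Load a backbone pretrained on ImageNet (downloads weights on first use).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained backbone so low- to high-level filters carry over.
for p in model.parameters():
    p.requires_grad = False

# Replace the final layer to match the new task, e.g. 8 crop/weed classes.
num_classes = 8
model.fc = nn.Linear(model.fc.in_features, num_classes)

# Only the new head's parameters are updated during fine-tuning.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

images = torch.rand(16, 3, 224, 224)           # stand-in for a field batch
labels = torch.randint(0, num_classes, (16,))  # stand-in labels
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```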
Applying 3D CNNs to multispectral 3D point models
The multispectral 3D scanning project establishes a framework for generating labeled 3D imaging datasets of real plants and for developing 3D convolutional neural networks that solve detection, classification, and regression problems in agriculture. The scanner used in the project is the Phenospex PlantEye, a unique plant sensor that produces real-time multispectral 3D point models of plants and crops. The scanner's output contains detailed information on more than 14 morphological and spectral parameters, such as plant height, 3D leaf area, leaf inclination, digital biomass, and light penetration depth. PlantEye also allows the 3D point models to be visualized with different spectral indices such as NDVI or PSRI, each of which enables the calculation of several measures related to plant health, performance, and photosynthesis. We create fully automated systems for data preprocessing and analysis by developing 3D convolutional neural networks that learn, interpret, and analyze 3D point models of plants, in order to monitor plant growth and to improve plant health, quality, and yield.
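As an illustration of what such a network could look like, here is a minimal PyTorch sketch of a 3D CNN applied to a voxelized point model. The grid size, the channel choices (e.g. NDVI plus three reflectance bands), and the class count are assumptions made for the example only.

```python
# Minimal sketch of a 3D CNN over a voxelized plant point cloud.
# Assumes points were binned into a 32^3 grid; the 4 channels here
# (e.g. NDVI plus three reflectance bands) are illustrative choices.
import torch
import torch.nn as nn

class Plant3DCNN(nn.Module):
    def __init__(self, in_channels=4, num_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(in_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool3d(2),                    # 32^3 -> 16^3
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool3d(2),                    # 16^3 -> 8^3
        )
        self.head = nn.Linear(32 * 8 * 8 * 8, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.head(x.flatten(1))

# One batch of 2 voxelized scans: (batch, channels, depth, height, width).
voxels = torch.rand(2, 4, 32, 32, 32)
logits = Plant3DCNN()(voxels)
print(logits.shape)  # torch.Size([2, 5])
```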
Getting plants' spectra from IR to UV
Insects see the world differently from humans. While our eyes have only three types of colour-sensitive cells, butterflies, for example, have four types of colour receptors, allowing them to see more colours than we do. These insects have evolved this trait to better find food and mates and to avoid predators. A device with a similar superhuman ability is the hyperspectral camera. It scans the scene at hundreds of different wavelengths in the visible and infrared spectrum. In other words, a hyperspectral camera can distinguish hundreds of colours of light, far more than the three colours, red, green, and blue, that our cell phone cameras capture. Hyperspectral imaging of plants enables scientists to discover plant properties that are invisible to humans. At the University of Winnipeg, we use a hyperspectral camera to find new ways to visually characterise plants.
The following video shows the reflectance of a plant at different wavelengths. The video is colorized such that the color of each image corresponds roughly to that wavelength's color in the visible part of the spectrum. Note that wavelengths beyond 800 nm (infrared) are normally not visible to the human eye; we nevertheless gave them a red hue in this video.
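For illustration, the sketch below tints each band image with an approximate display color for its wavelength, using a common piecewise approximation of the visible spectrum. The colorization actually used for the video may differ; the function and data here are assumptions for the example.

```python
# Sketch of tinting a monochrome band image by its wavelength, using a
# common piecewise visible-spectrum approximation. Wavelengths beyond the
# visible range (infrared) are given a red hue, as in the video.
import numpy as np

def wavelength_to_rgb(nm):
    """Rough RGB triple (0..1) for a wavelength in nm; red hue for IR."""
    if nm < 380:  return (0.5, 0.0, 0.5)              # UV: violet hue
    if nm < 440:  return ((440 - nm) / 60, 0.0, 1.0)  # violet -> blue
    if nm < 490:  return (0.0, (nm - 440) / 50, 1.0)  # blue -> cyan
    if nm < 510:  return (0.0, 1.0, (510 - nm) / 20)  # cyan -> green
    if nm < 580:  return ((nm - 510) / 70, 1.0, 0.0)  # green -> yellow
    if nm < 645:  return (1.0, (645 - nm) / 65, 0.0)  # yellow -> red
    return (1.0, 0.0, 0.0)                            # red and infrared

def colorize_band(band, nm):
    """Tint a 2D reflectance image (values 0..1) with its wavelength color."""
    r, g, b = wavelength_to_rgb(nm)
    return np.stack([band * r, band * g, band * b], axis=-1)

band = np.random.rand(128, 128)    # stand-in for one hyperspectral band
frame = colorize_band(band, 850)   # an 850 nm IR frame, rendered reddish
```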
Stochastic training algorithms
A stochastic neural network (SNN) is an artificial neural network with probabilistic network weights, usually trained under the Bayesian framework. In contrast to a regular neural network, an SNN provides a set of probability distributions for the network weights instead of point estimates. This allows an SNN to reduce the risk of overfitting and of overconfident network predictions. Although developing an efficient training algorithm is not easy, Bayesian statistics provides a good framework for analyzing and training SNNs. My research explores stochastic training algorithms for SNNs using Bayesian methods, solved with numerical techniques such as Markov chain Monte Carlo and sequential Monte Carlo. A toy example of the Markov chain Monte Carlo approach is sketched below.
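As a toy illustration of the Markov chain Monte Carlo idea, the sketch below runs a random-walk Metropolis sampler over the two weights of a one-hidden-unit network. The model, prior, and proposal scale are illustrative assumptions, not the actual research setup.

```python
# Toy random-walk Metropolis sampler over the weights of a tiny network,
# illustrating MCMC-based training of a stochastic neural network.
# Model, prior, and step size are illustrative, not the project's method.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, size=50)
y = np.tanh(1.5 * x) + 0.1 * rng.standard_normal(50)  # synthetic data

def log_posterior(w, sigma=0.1):
    pred = np.tanh(w[0] * x) * w[1]        # one-hidden-unit "network"
    log_lik = -0.5 * np.sum((y - pred) ** 2) / sigma**2
    log_prior = -0.5 * np.sum(w ** 2)      # standard normal prior
    return log_lik + log_prior

w = np.zeros(2)
lp = log_posterior(w)
samples = []
for _ in range(5000):
    w_new = w + 0.05 * rng.standard_normal(2)   # random-walk proposal
    lp_new = log_posterior(w_new)
    if np.log(rng.uniform()) < lp_new - lp:     # Metropolis accept/reject
        w, lp = w_new, lp_new
    samples.append(w.copy())

# The retained samples approximate the posterior over weights: predictions
# can be averaged over them, yielding uncertainty instead of a point estimate.
posterior = np.array(samples[1000:])            # discard burn-in
print(posterior.mean(axis=0), posterior.std(axis=0))
```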
Automated data collection with drones and a rover
We investigate several methods of obtaining field data with different camera systems, from simply mounting several cameras on a tractor up to drones. We are developing a field rover with the goal of fully automating the collection of field data and will start first tests with it in the growing season of 2021. The imaging equipment ranges from normal RGB cameras to multi- and hyperspectral cameras.
Team members become drone pilots
Three of our team members, Cara Godee, Manisha Ajmani, and Michael Beck, received training in flying drones and became officially certified drone pilots. This is another step toward more semi-autonomous field data collection.
Co-organised by Manisha Ajmani
The University of Winnipeg is proud to sponsor Soapbox Science, taking place online Saturday, August 14. Through this event, women researchers get an opportunity to share their research and experiences. UWinnipeg Master of Science in Applied Computer Science student Maryam Bafandkar is one of eight leading female scientists sharing expertise in technology, science, medicine, and engineering. Her presentation, Artificial Intelligence in Agriculture, will provide insight into what Bafandkar has learned through her work with UWinnipeg's cutting-edge TerraByte project, which aims to transform food production in Manitoba and beyond. Through this project, she works closely with Dr. Manisha Ajmani, co-organizer of this year's Soapbox Science event.
A semi-autonomous field imager
We are proud to present our new field rover! We are working closely with our local industry partners R-Tech and Northstar Robotics to get it into the fields, up close and personal with our plants.
| Title | Date |
|---|---|
| A New Dataset of Millions of Labelled Images for Machine Learning Applications in Digital Agriculture | May 2021 |
| Exceptional faculty and staff to be recognized during Spring Convocation | May 2021 |
| Data to drive better food outcomes (video) | May 2021 |
| Basin-scale and seasonal evaluation of automated threshold methods for surface water detection | April 2021 |
| The Future of Work and Learning | April 2021 |
| TerraByte in PLOS' Plant Phenomics & Precision Agriculture Collection | March 2021 |
| Robot generates images to prepare the soil for the future of agriculture | December 2020 |
| Growing the digital agriculture industry | November 2020 |
| Virtual Soapbox Science event celebrates women in STEM | November 2020 |
| UWinnipeg postdoc wins $20k Aquahacking Challenge | October 2020 |
| Globus Aids University in Efforts to Increase Crop Yields through HPC and Machine Learning for Digital Agriculture | September 2020 |
| A Legacy of impact: Dr. Annette Trimbee (pdf) | July 2020 |
| DeepGeo illustrates how entrepreneurs and researchers can work together to create new economic opportunities (pdf) | February 2020 |
| Horizon Manitoba -- Building a brighter future together (pdf) | January 2020 |
| Farmer 4.0: How the coming skills revolution can transform agriculture (pdf) | August 2019 |
| University of Winnipeg dives into agriculture research | August 2019 |
| One $50K donation sparks over $3 million in research funding | June 2019 |
| Accelerating Innovation: Researchers Set to Grow the Digital Agriculture Industry with Intelligent Technologies (pdf) | May 2019 |
| UWinnipeg receives $2.4 million to grow digital agriculture | April 2019 |
| Government of Canada supports growth of the digital agriculture industry in Manitoba | April 2019 |
| UWinnipeg profs are mapping Norway | March 2019 |
| UWinnipeg lands $250,000 Weston Seeding Food Innovation grant | February 2019 |
| Seeding Food Innovation - Awarded Project 2018: Enabling the next revolution in global food production through automatically labelled data sets and machine learning | November 2018 |
| UWinnipeg receives $750,000 in NSERC Grants for research | October 2018 |
| Generous donation supports UWinnipeg computer revolution | June 2018 |
| Title | Authors | Journal/Venue | Year | Links |
|---|---|---|---|---|
| An extensive lab- and field-image dataset of crops and weeds for computer vision tasks in agriculture | M. A. Beck, C.-Y. Liu, C. P. Bidinosti, C. J. Henry, C. M. Godee, M. Ajmani | CyVerse | 2021 | CyVerse |
| Presenting an extensive lab- and field-image dataset of crops and weeds for computer vision tasks in agriculture | M. A. Beck, C.-Y. Liu, C. P. Bidinosti, C. J. Henry, C. M. Godee, M. Ajmani | CVPPA/arXiv | 2021 | arXiv |
| An embedded system for the automated generation of labeled plant images to enable machine learning applications in agriculture | M. A. Beck, C.-Y. Liu, C. P. Bidinosti, C. J. Henry, C. M. Godee, M. Ajmani | PLOS One | 2020 | PLOS One, Winnspace |
| Weed seedling images of species common to Manitoba, Canada | M. A. Beck, C.-Y. Liu, C. P. Bidinosti, C. J. Henry, C. M. Godee, M. Ajmani | Dryad | 2020 | Dryad |
| EAGL-I: Embedded Autonomous Generator of Labeled Images | M. A. Beck | Talk at Phenome 2020 | 2020 | Phenome 2020 |