Computer vision and robotics may be the future of orchard management, but one powerful piece of technology waits, untapped, already in your pocket.
Equipped with an app under development by Washington State University engineers, the camera on your smartphone could be used to assess crop load in apple canopies.
Compared to purpose-built camera and computer systems that scan every tree along the row, the smartphone system trades some accuracy for affordability and accessibility, said Manoj Karkee, who leads the agricultural automation and robotics lab at WSU’s Center for Precision and Automated Agricultural Systems in Prosser, Washington.
“Our goal is to develop something useful even though it’s not 100 percent accurate. It doesn’t have to be 100 percent to be useful,” Karkee said.
He envisions the app eventually selling for something like $10 and being used by growers to take a sampling approach to crop load assessment throughout an orchard block — perhaps capturing variability in production in different soil types or locations.
Tests of his prototype app last year actually proved quite accurate, with estimates reaching about 95 percent of actual packout, but Karkee said it’s more reasonable to expect 85 to 90 percent accuracy, depending on the orchard system. The more two-dimensional the canopy, the more of the apples the computer program can “see.”
But let’s back up a minute.
The app itself is simply an interface; the real innovation is the work Karkee and his team at CPAAS have done for the past several years, teaching computer programs to analyze images of tree canopies in search of fruit, flowers and branches. Those computer vision programs “on the back end” will provide the foundation for eventual automation of everything from pruning to harvest.
“The back end is constantly evolving and improving because of those other projects,” Karkee said. “Robotic harvest is the outcome of imaging and image analysis, analyzing what it is and where it is.”
The backbone is the same, but applications differ when it comes to a key problem facing all camera-driven orchard robotics: not all the apples are visible, due to foliage, branches and other apples. Researchers call this occlusion, and Karkee said it’s actually an easier problem to manage in a harvest application, because picking an apple reveals the ones hiding behind it.
Assessing crop load a month before harvest, however, means the camera can accurately see only a given percentage of the apples, depending on the orchard system. Users of the app will need to calibrate the system by taking images and then counting by hand on several trees, to teach the computer program what percentage of apples are visible in the canopy.
“The question is what percentage can you see,” he said. “Anytime we do these models, there is uncertainty. The less we can see, the more uncertainty.”
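To make the math concrete, here is a minimal sketch of how such a calibration might work, assuming a simple ratio model in which the camera sees a roughly constant fraction of the fruit in a given orchard system. The function names and sample numbers are illustrative, not details of the WSU app.

```python
# Hypothetical calibration for a camera-based crop load estimate.
# Assumption: the share of apples visible to the camera is roughly
# constant across trees trained to the same system.

def visibility_fraction(calibration_trees):
    """Estimate what share of apples the camera can see.

    calibration_trees: (camera_count, hand_count) pairs from trees
    that were both imaged and counted by hand.
    """
    fractions = [cam / hand for cam, hand in calibration_trees]
    return sum(fractions) / len(fractions)

def estimated_crop_load(camera_count, fraction):
    """Scale a raw camera count up to an estimated true count."""
    return camera_count / fraction

# Example: three hand-counted trees suggest the camera sees about
# 70 percent of the fruit, so 84 detected apples imply roughly 120.
calibration = [(70, 100), (68, 95), (75, 110)]
fraction = visibility_fraction(calibration)      # ~0.70
print(round(estimated_crop_load(84, fraction)))  # ~120
```

The ratio model also shows why Karkee’s caveat matters: the smaller the visible fraction, the larger the correction applied to every count, and the more any detection error is magnified.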
A similar calibration process is required by the only commercially available crop load counting system, developed by Intelligent Fruit Vision. The FruitVision system brings a high-powered computer into the orchard to process images and count and size fruit in real time.
The United Kingdom-based company recently partnered with Washington Tractor to sell an orchard-ready version of the system, mounted on an electric utility task vehicle, for $90,000, including an enclosed trailer for transport.
“Now we have a turnkey product,” said Washington Tractor’s Karl Wirth, describing the modifications made to ready the technology for growers: electric power that eliminates camera-shaking vibrations, speed control to improve accuracy, a rotating arm to hold the camera system and a generator in the trailer in case the UTV batteries run down in the field.
While that system may meet the needs of some growers looking to invest in detailed data collection, Karkee said he’s taken a “philosophically different approach” with his low-cost, smartphone-based app.
The app works by scanning the canopy of one tree at a time, collecting a series of images that the computer program combines to estimate the position and size of each recognized apple in three-dimensional space. Uploaded to the cloud, the images are then analyzed by the computers at CPAAS.
That analysis takes just a couple of seconds, but since orchards often lack good connectivity, growers will more likely use the app to collect data all day and then review the results back at the office, Karkee said.
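One plausible way to handle that constraint is an offline-first queue that holds each tree’s scan on the phone until a connection appears. The sketch below assumes a generic upload callable and a simple scan format; it illustrates the pattern, not the actual app’s code.

```python
# Sketch of an offline-first capture queue. The upload() callable
# and the scan format are assumptions, not details of the WSU app.

pending = []  # scans captured in the orchard, awaiting upload

def capture(tree_id, image_paths):
    """Hold a tree's image set locally until connectivity returns."""
    pending.append({"tree": tree_id, "images": list(image_paths)})

def flush(upload):
    """Try to send every queued scan to the cloud for analysis.

    upload: a callable that transmits one scan and raises OSError
    when there is no connection. Unsent scans stay in the queue.
    """
    while pending:
        try:
            upload(pending[0])
        except OSError:
            break  # still offline; keep the remaining scans for later
        pending.pop(0)
```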
Right now, the prototype app needs more testing and an improved user interface before he will feel comfortable sharing it, and he’s looking for industry support to make that happen.
So far, development of the app has been supported by internal funding from WSU, along with seed money from the Washington State Wine Commission, which wants to see if the technology can be used to assess vineyard crop load as well.
“Some growers would like to have a tool that’s handy and useful and less perfect,” he said. “I think it could be a useful tool. Whether it becomes one, we have yet to see.” •
—by Kate Prengaman