In the Dusty Fields of Autonomous Farming, High-Performance 3D Vision Sensors Offer the Clearest View

As we move deeper into the twenty-first century, the age-old agricultural industry is exploring new technologies that can help farmers “make smarter decisions on the field and do more with less,” as McKinsey & Company writes—including autonomous solutions. Autonomous farming promises to increase safety, mitigate labor shortages, boost output, and maximize equipment utilization. Yet the farm is an often inhospitable environment. Beyond the common adverse conditions that today’s equipment must contend with—mud, wind, rain, fog—dust presents a particular challenge for autonomous systems, making it very difficult for them to “see” and guide a vehicle or piece of “intelligent” equipment.

 

Cultivating soil and harvesting crops is a dusty business. And as temperatures rise and rainfall decreases, climate scientists believe we may see an ever-broadening swath of dustier farmland. As NASA scientist Benjamin Cook explains, “If we see continued increases in aridity and drought, as we expect with climate change, we’ll start to see further increases in dust loads in the atmosphere.” To cope with dusty farming environments, autonomous systems must be equipped with sensors that can assemble accurate 3D pictures of their surroundings, even in the face of low visibility.

 

Dust is composed of solid, non-translucent particles that, when blown up from the earth, form a dense cloud that is particularly difficult to see through or navigate. Snow, rain, and fog also reduce visibility, but in all but the most severe cases they still permit navigation by humans or vision-based systems. With dust, however, it takes a particularly high-resolution and robust sensor to capture enough of the photons that pass through the dust cloud to form an accurate 3D model of the objects behind it.

 

One sensing technology in particular has proven to work well in this challenging environment: camera-based 3D stereo vision. High-resolution advanced stereo vision sensors have demonstrated an exceptional ability to “see” through dust in agricultural settings thanks to their precise calibration, high resolution, multiple perspectives, and wide baseline (the distance between the cameras).

 

The latest stereo vision technology delivers superbly precise 3D pictures of a scene in real time. Calibration software that runs on a GPU enables the cameras to be “untethered”—that is, separated and mounted independently. At 7 to 20 frames per second, the two camera images are aligned and compared, yielding depth measurements for 35 million pixels per second. This unparalleled data density is what enables this new generation of stereo vision sensors to “see” through dust: with 5 million pixels per frame, any photons that make it through the dust cloud are captured by the sensors and used to calculate the depth of the structures from which they emanated.
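
For readers who want the underlying geometry: a calibrated stereo pair converts the disparity between matched pixels into depth via the standard pinhole relation Z = f × B / d, where f is the focal length in pixels, B is the baseline, and d is the disparity. The sketch below is illustrative only; the focal length is an assumed value, not a NODAR calibration figure.

    # Standard stereo depth-from-disparity relation: Z = f * B / d.
    # The ~2000 px focal length (a 6 mm lens on ~3 um pixels) is an
    # assumed, illustrative value, not a published NODAR parameter.
    def depth_from_disparity_m(focal_px: float, baseline_m: float,
                               disparity_px: float) -> float:
        """Depth of a point visible in both cameras, from its pixel disparity."""
        return focal_px * baseline_m / disparity_px

    # With the 110 cm baseline used in the tests below, a 20-pixel
    # disparity corresponds to a point roughly 110 m away:
    print(depth_from_disparity_m(2000.0, 1.10, 20.0))  # 110.0 m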

 

The dominant competing technology for ranging is lidar (light detection and ranging). Lidars emit light with lasers; those photons bounce off objects in the scene and return to a sensor within the lidar. The lidar measures the time the photons take to return and calculates the distance to the object from which they bounced. Lidar is accurate, but it relies on each photon both making it through the dust cloud and returning through it again without being blocked by dust particles in either direction. Additionally, compared with advanced stereo vision, which has 35 million points per second to work with, lidars produce only 600,000 to 6 million points per second, depending on the lidar technology. With 6x to 60x fewer points to work with, and a successful round trip required of every photon, lidar is significantly less effective at measuring distance in dusty environments.
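
For comparison, lidar ranging follows the time-of-flight relation d = c × Δt / 2, since each photon must travel out to the object and back. A minimal sketch of that calculation, along with the point-budget ratio cited above:

    # Lidar time-of-flight ranging: distance = c * round_trip_time / 2.
    C = 299_792_458.0  # speed of light, m/s

    def tof_range_m(round_trip_s: float) -> float:
        """Distance to the object that reflected the laser pulse."""
        return C * round_trip_s / 2.0

    print(tof_range_m(200e-9))  # a ~200 ns round trip is roughly 30 m

    # The point-budget gap behind the "6x to 60x" figure above:
    for lidar_pts_per_s in (6e6, 600e3):
        print(35e6 / lidar_pts_per_s)  # ~5.8x and ~58x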

 

Head-to-head: comparing state-of-the-art stereo vision sensors and lidar

NODAR recently conducted performance tests to analyze how our advanced stereo vision platform compares to lidar sensors in various dust conditions. The stereo configuration used two 5.4-megapixel Sony IMX490 cameras with 6-millimeter, 70-degree field-of-view lenses and a 110-centimeter baseline; the lidar configuration used an Ouster OS1-128 digital lidar sensor with 128-channel resolution.
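
The wide 110-centimeter baseline matters because, under the standard first-order stereo error model, depth uncertainty grows with the square of range and shrinks in proportion to focal length and baseline. A rough sketch under assumed values (the pixel focal length and subpixel matching accuracy below are illustrative, not measured figures):

    # First-order stereo depth error model: dZ = Z^2 * e_d / (f * B).
    # The 2000 px focal length and 0.25 px matching error are assumptions.
    def depth_error_m(range_m: float, focal_px: float, baseline_m: float,
                      disparity_err_px: float = 0.25) -> float:
        return range_m ** 2 * disparity_err_px / (focal_px * baseline_m)

    # At 50 m, the 110 cm baseline gives ~0.3 m of expected error;
    # halving the baseline roughly doubles it:
    print(depth_error_m(50.0, 2000.0, 1.10))  # ~0.28 m
    print(depth_error_m(50.0, 2000.0, 0.55))  # ~0.57 m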

 

Concrete dust was blown into the air to simulate mild, moderate, and heavy dust levels in three scenarios: viewing a stationary target, driving in an off-road environment, and stationary tillage. To compare the two sensor technologies, the tests measured the percentage of valid range data points each system returned, indicating how well it could “see” obstacles behind the cloud of dust.
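
One plausible way to compute such a validity metric is to count the returns that report a finite, in-range depth. The sketch below assumes dropouts are marked as NaN and applies an arbitrary 100-meter cutoff; neither detail is taken from the published test protocol.

    # Fraction of valid range returns in a depth map. The NaN-marked
    # dropouts and 100 m cutoff are illustrative assumptions.
    import numpy as np

    def valid_fraction(depth_m: np.ndarray, max_range_m: float = 100.0) -> float:
        valid = np.isfinite(depth_m) & (depth_m > 0.0) & (depth_m <= max_range_m)
        return float(valid.mean())

    # e.g., a frame in which dust blocked ~40% of the returns:
    depths = np.where(np.random.rand(1000) < 0.6, 30.0, np.nan)
    print(valid_fraction(depths))  # ~0.6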

 

The results: cameras surpass lidar 

Stationary target: Both systems returned 100% valid data points when the environment was dust-free. But when dust was blown through an open area to obscure a small cart, the 3D cameras significantly outperformed lidar at every density level—successfully delivering sufficient range measurements through mild, moderate, and heavy clouds of dust when lidar could not:

  • In mild dust, the 3D cameras returned nearly 90% valid data points, while lidar leveled off at 40%
  • In moderate dust, the 3D cameras returned more than 50% valid data points, while lidar dropped below 20%
  • In heavy dust, the 3D cameras returned more than 20% valid data points, compared to lidar at less than 10%

 

Dynamic off-road driving: The two sensor systems were tested on a commercial-grade tractor driving off-road in a high-vibration environment. The camera-based sensor provided more stable measurements than the lidar system, along with longer range and higher data density.

 

Stationary tillage: During the tillage test (tillage is the process of turning over soil to prepare it for planting), the cameras provided better visibility in mild and moderate dust, along with higher-density and longer-range measurements.

 

The triple win

Autonomous farming is sparking a positive outlook across the agricultural industry today. “As more growers realize the triple win that farm automation can represent—greater agricultural productivity and profits, improved farm safety, and advances toward environmental-sustainability goals—excitement about these technologies will spread,” McKinsey predicts. To achieve these results, automated equipment must be able to operate in harsh farming environments, and the most difficult of these settings is dusty conditions. Making the smartest possible technology choices may therefore be more important than ever. Today, only advanced stereo vision sensors can see well enough through dust clouds for automated equipment to operate effectively and safely.

 
