Australia’s Defence Science and Technology Organisation (DSTO) has demonstrated the provision of UAV sensor data to an unmanned ground vehicle (UGV) in real time, enabling it to navigate more effectively in cluttered environments.

The December 2006 demonstration saw three-dimensional laser radar (ladar) and electro-optic imagery fused into 3D information and provided to the UGV. In turn, the UGV fused that information into a navigable picture in time frames ranging from several minutes down to 10-30s.

Dr Anthony Finn, head of DSTO’s automation of the battlespace research programme, says the co-operative approach allowed “creation of a single, homogeneous view of the world... It gives you a very clear view of what the terrain is like and what we are and are not able to drive over.”

The demonstration used a Yamaha RMAX helicopter and an Ontario Drive & Gear Argo eight-wheel-drive all-terrain vehicle modified to operate as a UGV. A millimetre-wave radar was fitted to the UGV as its sole navigation sensor.

In the initial phase, the UAV was flown repeatedly over the trials area to build an airborne picture of the terrain using the ladar and electro-optic sensors.

“That provides us with an image from the air which gives 3D structure,” says Finn.

The UGV was then sent into the same area with instructions to autonomously transit the area using only radar data. That transit saw multiple delays and course corrections as the vehicle struggled to make sense of its environment, says Finn.

UGV navigation in unknown terrain is fundamentally restricted by the range limits of the vehicle’s sensor payload. Finn says the Argo UGV’s radar provided an event horizon around 50-100m (165-330ft) ahead of the vehicle. “Much of what they are looking at is occluded.”
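The article does not describe DSTO’s sensor processing; as a purely illustrative sketch of the geometry involved, a ground sensor’s usable picture is limited to cells within range that also have an unobstructed line of sight, so anything behind the first obstacle along a ray is occluded:

```python
def line_of_sight(grid, a, b, steps=100):
    """Sample the segment a->b on a grid (True = obstacle).
    Blocked if any intermediate sampled cell is an obstacle."""
    for k in range(1, steps):
        t = k / steps
        r = round(a[0] + t * (b[0] - a[0]))
        c = round(a[1] + t * (b[1] - a[1]))
        if (r, c) != b and grid[r][c]:
            return False
    return True

def visible_cells(grid, origin, max_range):
    """Cells within max_range of origin that the sensor can actually see.
    Obstacle cells themselves count as visible: the sensor detects
    their front face, but nothing behind them."""
    out = set()
    for r in range(len(grid)):
        for c in range(len(grid[0])):
            in_range = (r - origin[0]) ** 2 + (c - origin[1]) ** 2 <= max_range ** 2
            if in_range and line_of_sight(grid, origin, (r, c)):
                out.add((r, c))
    return out

# A single obstacle three cells ahead shadows everything behind it,
# even though those cells are well inside the sensor's nominal range.
grid = [[False] * 7 for _ in range(7)]
grid[3][3] = True
vis = visible_cells(grid, (3, 0), max_range=6)
```

The same effect at vehicle scale is what a wide-area aerial survey removes: the UAV sees over the obstacles that blind the ground radar.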

In the final phase of the demonstration the UAV data was provided to the UGV to give it a wide area picture against which to compare its own radar returns. This allowed the UGV to “drive through that 3D environment and obviously plan ahead and predict where we need to go in a more optimal sense, and in addition to that, complete the 3D multi-perspective view of the world.”
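DSTO has not published its planning algorithms. As a minimal sketch of the general idea, under the assumption that both sources can be rasterised into occupancy grids: an a-priori aerial map is merged with local radar returns, and a standard planner (here A*) routes around obstacles the ground sensor alone has not yet seen:

```python
import heapq

def fuse_grids(aerial, radar):
    """Merge a wide-area aerial occupancy map with local radar returns.
    A cell is treated as blocked if either source marks it blocked."""
    return [[a or r for a, r in zip(ar, rr)] for ar, rr in zip(aerial, radar)]

def astar(grid, start, goal):
    """4-connected A* over an occupancy grid (True = blocked).
    Returns the optimal cell path, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    frontier = [(h(start), 0, start, [start])]
    seen = set()
    while frontier:
        _, g, pos, path = heapq.heappop(frontier)
        if pos == goal:
            return path
        if pos in seen:
            continue
        seen.add(pos)
        r, c = pos
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and not grid[nr][nc]:
                step = path + [(nr, nc)]
                heapq.heappush(frontier, (g + 1 + h((nr, nc)), g + 1, (nr, nc), step))
    return None

# The aerial survey reveals a wall the UGV's short-range radar has not
# yet detected, so the planner routes through the gap in advance.
aerial = [[False] * 5 for _ in range(5)]
for c in range(4):
    aerial[2][c] = True  # wall across row 2, gap only at column 4
radar = [[False] * 5 for _ in range(5)]
fused = fuse_grids(aerial, radar)
route = astar(fused, (0, 0), (4, 0))
```

Without the aerial layer the planner would head straight at the wall and discover it only inside the radar’s short event horizon, which is the repeated stop-and-replan behaviour described in the radar-only transit above.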

The ladar was a commercially available unit, miniaturised by DSTO to meet trial requirements.

Airborne sensor data was processed in real time during collection, with the time delays directly related to the processing into a 3D picture and data fusion by the UGV. “When the UAV passes its data to the UGV, the UGV sits there and thinks about it as well in terms of the planning. So once it has got its information, in the order of 5-10 seconds it thinks about it, works out its plan, and then starts going. The UGV does all of its moment-to-moment control, for obvious reasons, in real time.”

Rectification of distortions in the ladar and EO data gathered by the UAV was carried out during the data fusion process by algorithms that compensated for the air vehicle’s forward motion and axis shifts during flight.
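The article does not detail these algorithms. A minimal sketch of the underlying geometry, assuming a simple yaw-pitch-roll pose model: each ladar return is rotated by the aircraft attitude recorded at the instant of measurement and translated by its position, so returns collected along the flight path land in one consistent world frame rather than drifting with the aircraft:

```python
import math

def rotation_matrix(roll, pitch, yaw):
    """Z-Y-X (yaw-pitch-roll) rotation from body frame to world frame."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def to_world(point_body, position, attitude):
    """Rectify one ladar return: rotate it by the aircraft attitude at the
    moment of measurement, then translate by the aircraft position."""
    R = rotation_matrix(*attitude)
    rotated = [sum(R[i][j] * point_body[j] for j in range(3)) for i in range(3)]
    return [rotated[i] + position[i] for i in range(3)]

# A return 10m directly ahead of the sensor, measured after the UAV has
# yawed 90 degrees and moved 5m along the world y-axis: the body-frame
# x-axis now points along world y, so the point lands at (0, 15, 0).
p = to_world([10.0, 0.0, 0.0], [0.0, 5.0, 0.0], (0.0, 0.0, math.pi / 2))
```

A real airborne ladar would interpolate the pose per laser pulse from the navigation solution, but the per-point rotate-and-translate step is the core of the motion compensation.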

The demonstration was carried out at the Australian Centre for Field Robotics trials range at Marulan, half-way between Sydney and Canberra. The trials area comprised a mixture of open terrain and medium density vegetation.

Finn says the demonstration confirms that “the value of unmanned vehicles increases when you fuse or integrate them across the environments; the assets can be placed deeper and more aggressively. That significantly improves your situational awareness picture.”

Source: FlightGlobal.com