GUY NORRIS / LOS ANGELES

NASA's Jet Propulsion Laboratory (JPL) is developing a cellular neural network, based on the capabilities of the human eye, for autonomous terrain recognition as it progresses its testing of insect-like sensors on unmanned air vehicles (UAVs) (Flight International, 9-15 September).

The California-based JPL is developing the neural network with UAV developer MLB. Modelled on the human eye and the optical processing capabilities of the visual cortex, the prototype system is showing "high rate real-time processing at the rate of 30 frames/s or so," says NASA.
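Cellular neural networks of the kind JPL describes update every cell in parallel from a small neighbourhood, which is what makes frame-rate processing plausible. Below is a minimal sketch of one discrete-time step of the classic Chua-Yang formulation; the template values, step size and test frame are illustrative assumptions, not JPL's design.

```python
# One discrete-time (Euler) step of a Chua-Yang cellular neural network.
# Template values and the random test frame are illustrative assumptions.
import numpy as np
from scipy.signal import convolve2d

def cnn_step(x, u, A, B, z, dt=0.1):
    """dx/dt = -x + A*y + B*u + z, with y = 0.5(|x+1| - |x-1|)."""
    y = 0.5 * (np.abs(x + 1) - np.abs(x - 1))      # piecewise-linear output
    dx = (-x + convolve2d(y, A, mode="same")
             + convolve2d(u, B, mode="same") + z)
    return x + dt * dx

# Illustrative 3x3 edge-extraction templates (feedback A, feed-forward B)
A = np.zeros((3, 3)); A[1, 1] = 2.0
B = np.array([[-1., -1., -1.],
              [-1.,  8., -1.],
              [-1., -1., -1.]])
z = -0.5

u = np.random.rand(64, 64) * 2 - 1                 # stand-in camera frame
x = np.zeros_like(u)
for _ in range(50):                                # iterate until settled
    x = cnn_step(x, u, A, B, z)
edges = 0.5 * (np.abs(x + 1) - np.abs(x - 1))      # edge map of the frame
```

Because every cell updates in parallel from a 3 x 3 neighbourhood, analogue or FPGA realisations of this update are the sort of implementation that would make the quoted 30 frames/s plausible on small flight hardware.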

The tests involve "training" the neural networks to recognise features of interest, such as water courses and other distinctive natural features. Further research next year will focus on neurally inspired intelligent flight controls, building on work already under way at NASA Ames.
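The article does not describe JPL's training procedure, but the general idea can be sketched as fitting a classifier to labelled examples of the feature of interest. Everything below (the two toy patch descriptors, the fabricated water-like patches and the logistic-regression fit) is an illustrative assumption.

```python
# Schematic of the "training" idea: fit a classifier to labelled image
# patches. Descriptors and the dark-smooth-equals-water data are made up.
import numpy as np

rng = np.random.default_rng(0)

def patch_features(patch):
    """Two toy descriptors: mean brightness and mean gradient magnitude."""
    gy, gx = np.gradient(patch)
    return np.array([patch.mean(), np.hypot(gx, gy).mean()])

water  = [rng.normal(0.2, 0.05, (16, 16)) for _ in range(40)]  # smooth, dark
ground = [rng.normal(0.6, 0.20, (16, 16)) for _ in range(40)]  # bright, rough
X = np.array([patch_features(p) for p in water + ground])
t = np.array([1.0] * 40 + [0.0] * 40)

w, b = np.zeros(2), 0.0
for _ in range(500):                      # plain logistic-regression fit
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.5 * X.T @ (p - t) / len(t)
    b -= 0.5 * (p - t).mean()

def is_water(patch):
    return patch_features(patch) @ w + b > 0
```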

The research is part of NASA's plan to develop a "bio-inspired" vehicle capable of autonomous flight on Mars by 2009. The Bees UAV should be able to maintain attitude reference and altitude hold automatically, and be capable of terrain following and controlled flight through ravines and narrow passages. Autonomous target recognition and natural scene identification are also seen as necessary.
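That capability list maps onto well-understood control loops. As a minimal sketch, altitude hold can be expressed as a PID controller on measured height; the gains and the 100m setpoint below are assumed values, not BEES flight-control parameters.

```python
# Minimal altitude-hold sketch as a PID loop on measured height. Gains
# and the 100m setpoint are assumptions, not BEES parameters.
class AltitudeHold:
    def __init__(self, kp=0.8, ki=0.1, kd=0.4, setpoint_m=100.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint_m
        self.integral = 0.0
        self.prev_err = None

    def update(self, altitude_m, dt):
        """Return a pitch/throttle correction from the altitude error."""
        err = self.setpoint - altitude_m
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv
```

Terrain following would swap the fixed setpoint for a continuously updated height above ground, which is where the insect-inspired ranging and optic-flow sensors would feed in.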

NASA's tests of 1.5-5kg (3.3-11lb) delta-wing UAVs in the Mojave Desert should lead to a downselection of the most promising devices in the first half of next year.

The optical suites being tested are based on the eyes of honeybees, dragonflies and rabbits, and NASA hopes they will lead to sensors allowing the 1,000km (540nm)-range Martian UAV to resolve targets ranging in size from 50mm (2in) to 10µm.
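A back-of-envelope check shows what the finer end of that resolution claim demands of the optics. The altitude, pixel pitch and two-pixel sampling criterion below are assumptions for illustration; the article specifies only the target sizes.

```python
# Back-of-envelope optics check: the focal length needed to resolve a
# target of a given size. Altitude, pixel pitch and the two-pixel
# criterion are assumptions; the article gives only the target sizes.
def focal_length_m(altitude_m, pixel_pitch_m, target_m, pixels_per_target=2):
    """GSD = altitude * pitch / focal length, so solve for focal length."""
    gsd = target_m / pixels_per_target     # required ground sample distance
    return altitude_m * pixel_pitch_m / gsd

# A 50mm target from an assumed 100m altitude with 10 micron pixels:
print(focal_length_m(100.0, 10e-6, 0.05))  # -> 0.04, i.e. ~40mm optics
```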

The Bees UAV would enter the Martian atmosphere protected by a "backshell" that is released when a supersonic parachute is deployed at an altitude of around 17,700ft (5,400m). A second parachute, deployed at 1,180ft (360m), would slow the Bees for the final descent.
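For a rough feel of the timeline, the sketch below converts the two deployment altitudes and applies assumed sink rates; the article gives only the altitudes, so both descent rates are placeholder guesses.

```python
# Rough timeline for the two-chute sequence. Only the altitudes come
# from the article; both sink rates are placeholder assumptions.
FT_TO_M = 0.3048
first_deploy_m  = 17_700 * FT_TO_M        # ~5,400m: backshell released
second_deploy_m = 1_180 * FT_TO_M         # ~360m: final-descent chute

main_sink_mps  = 60.0                     # assumed under supersonic chute
final_sink_mps = 8.0                      # assumed under second chute
t1 = (first_deploy_m - second_deploy_m) / main_sink_mps
t2 = second_deploy_m / final_sink_mps
print(f"~{t1:.0f}s on the first chute, ~{t2:.0f}s on the second")
```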

Source: Flight International