Site: Laboratoire d'Informatique Fondamentale et d'Intelligence Artificielle (LIFIA)
Institut IMAG
46, Avenue Felix-Viallet
38031 Grenoble Cedex

Date Visited: May 10, 1993

Report Author: M. Lee



WTEC Attendees: M. Lee


Hosts:
James Crowley, Ph.D.; Professor, Institut National Polytechnique de Grenoble
Patrick Reignier; Ph.D. Candidate, Active Vision and Mobile Robotics Group


Laboratoire d'Informatique Fondamentale et d'Intelligence Artificielle (LIFIA) is one of seven research laboratories that make up IMAG; other labs include INRIA, INPG, UJF, CNRS, LGI, and LDS. LIFIA is divided into two divisions: one contains the formalists, and the other is made up of four groups covering robotics, vision, artificial intelligence, and active vision. Professor James Crowley heads the PRIMA group (Mobile Robotics and Active Perception). The program has thirteen Ph.D.s and Ph.D. candidates and a budget of 1.4 million francs plus salaries.

This group is researching the basic technologies that will enable sensor-based control of remote vehicles. The scientists' focus is on using data from a video camera and from a sonar as input to a vehicle control system. They are also involved in 3-D model construction of observed objects. They develop and demonstrate their research on land robots, although the technology is equally applicable to underwater vehicles. The laboratory's scientists are interested in collaborative efforts that will demonstrate their research on underwater vehicles.


The laboratory has a program in navigation and control from local sensors. The WTEC team first saw a description and then a demonstration of LIFIA's system on its simulator. The system consists of a series of sonar sensors mounted around the periphery of a mobile robot, looking out horizontally. By processing the data from these sensors, the distance to a barrier or wall is determined. Points that are contiguous and collinear are linked and defined as part of a wall segment. Wall segments are matched against a world model for fit. If a match is achieved, the vehicle uses its estimated position in the world model as an input to its global navigation system. The vehicle can be commanded relative to its global position as determined from the sonar, or it can be commanded to respond directly to the sensed barrier, as in obstacle avoidance or tracking. The data from these disparate sources are brought together and fused in an extended Kalman filter.
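The processing pipeline described above can be sketched in a few dozen lines. The following is a minimal illustration, not LIFIA's implementation: `extract_wall_segments` groups consecutive sonar hit points into straight segments by testing collinearity against a tolerance, and `kf_wall_update` fuses one measured perpendicular distance to a wall from the world model into the robot's position estimate. For a straight wall the distance measurement is linear in the 2-D position, so the extended Kalman filter update reduces to an ordinary Kalman update here; all names, tolerances, and noise values are illustrative assumptions.

```python
import numpy as np

def extract_wall_segments(points, tol=0.05):
    """Group consecutive sonar hit points into straight wall segments.

    points : (N, 2) array of hit points, ordered around the robot's periphery.
    A new segment starts when a point deviates from the line through the
    current segment's endpoints by more than `tol` (meters, assumed).
    """
    segments, current = [], [points[0]]
    for p in points[1:]:
        if len(current) < 2:
            current.append(p)
            continue
        p0, p1 = current[0], current[-1]
        d = p1 - p0
        n = np.array([-d[1], d[0]]) / np.linalg.norm(d)  # unit normal to segment
        if abs(n @ (p - p0)) <= tol:                     # still collinear
            current.append(p)
        else:                                            # start a new segment
            segments.append(np.array(current))
            current = [p]
    segments.append(np.array(current))
    return segments

def kf_wall_update(x, P, z, wall, R):
    """Kalman update fusing one sonar wall-distance measurement.

    x    : (2,) position estimate [x, y]
    P    : (2, 2) position covariance
    z    : measured perpendicular distance to the matched wall
    wall : (a, b, c), the world-model line a*x + b*y = c with a^2 + b^2 = 1
    R    : measurement noise variance
    """
    a, b, c = wall
    h = c - (a * x[0] + b * x[1])   # predicted distance to the wall
    H = np.array([[-a, -b]])        # Jacobian of h w.r.t. state (constant here)
    S = H @ P @ H.T + R             # innovation covariance (1x1)
    K = P @ H.T / S                 # Kalman gain
    x_new = x + (K * (z - h)).ravel()
    P_new = (np.eye(2) - K @ H) @ P
    return x_new, P_new
```

In a full system the matched segment would come from comparing extracted segments against the world model, and the state would also carry heading; this sketch keeps only the two steps the report names, segment extraction and Kalman fusion.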


The technologies that we saw (including the research vehicle, software, video sensor subsystems, sonar subsystems, navigation, control, and sensor fusion) all seem to be applicable to underwater vehicle systems. Overall, LIFIA's work seems to be competitive with work in the United States and in other countries. The laboratory seems to be funded at a level that allows it a reasonable computer and hardware environment for its research. The laboratory's scientists are converting their software operating environment to VxWorks, which will make them compatible with many research organizations in the United States and will make interchange of software more feasible. They are interested in joint projects with IFREMER and other French underwater vehicle research organizations to demonstrate their technology underwater. Active vision and control, using sensing of local terrain data, are potentially important technologies in the future of remote underwater vehicles.


Causse, O. and J. Crowley. "A Man Machine Interface For a Mobile Robot."

Chehikian, A. and J. Crowley. "Fast Computation of Optimal Semi-Octave Pyramids."

Crowley, J. and P. Reignier. "Asynchronous Control of Rotation and Translation for a Robot Vehicle." 1993.

Crowley, J., P. Bobet, and C. Schmid. "Auto-calibration by Direct Observation of Objects." March 1993.

Crowley, J., P. Stelmaszyk, T. Skordas, and P. Puget. "Measurement and Integration of 3-D Structures by Tracking Edge Lines."

Vision as Process. Esprit Basic Research Action. BR 3038/BR 7108.

Published: June 1994