Imagine: the date is January 3, 2000, three days after the arrival of the new Millennium. It is also three days after the catastrophe that struck mankind and left, on the surface of the earth, ruined cities and a thick cloud of radioactive smoke. Nobody knows what happened. The survivors are mostly located in underground facilities. Many questions remain unanswered: Are there other survivors out there? What has really happened? Because of the radiation and many unknown factors, it is not possible to send a human search-and-rescue team to survey the surface. The ideal tool for the job would be autonomously flying machines smart enough to explore a large area in a hostile environment. Unfortunately, none are available…
Of course, there is still time before the new Millennium. As a good engineer, you anticipate and prepare for the worst. You look back into the past and find the history of the development of autonomously flying machines…
Despite the obvious advantages of autonomous flying vehicles, the complexity involved made the concept, until recently, realizable only in movie settings. Many challenges, such as stable autonomous controllers and vision-guided navigation systems, need to be solved before autonomous flying robots can become a reality. In view of this, Robert Michelson, past president of the Association for Unmanned Vehicle Systems (AUVS), organized the first International Aerial Robotics Competition (IARC) in 1990 to push forward the state of the art in aerial robotics.
The ongoing battle towards perfection
Since 1990, AUVS has been holding the IARC annually. Each year, aerial robots built by participating university teams are required to perform specific tasks, such as autonomous flight, object identification and interaction with ground objects. In the beginning, these missions were thought to be almost unachievable by teams made up of university students. However, time has proven the opposite. Throughout the years, despite the ever-increasing level of difficulty of the mission, participating teams have consistently achieved the target requirements.
In 1997, the competition saw the helicopter-based robot from Carnegie Mellon University successfully achieve autonomous flight, real-time visual navigation, drum identification and label recognition, thereby completing over 90% of the mission. It was then that the organizers felt the time had come to step the competition up to the next level.
Towards the Millennium
Up to 1997, the competition was conducted in a highly structured environment. For example, teams were told to expect certain types of objects to be identified. To attain a higher level of difficulty and to bring the mission closer to reality, the previously task-specific missions have been replaced by goal-specific missions. The selected mission goal is search and rescue.
Instead of trying to complete specific tasks, under the new mission each team will design a group of highly intelligent robots (both aerial and ground) to perform a search-and-rescue operation in a disaster-like environment. The environment will be characterized by a random arrangement of obstacles as well as potentially damaging moving objects. For example, telephone poles may be in the flight path of the air vehicle, and water jets from burst pipes may be shooting randomly into the sky. The robotic teams will be required to navigate through this hostile environment and look for human-like bodies. The human animatronics can be survivors (featuring waving arm motions and screaming sounds) or dead bodies (motionless).
In addition to reporting the locations of these human bodies, the new mission also makes provision for tasks that would assist in real search-and-rescue operations. For instance, tasks such as providing life-support equipment to survivors, extinguishing fires and identifying hazards to be avoided fall into this category. As a result, the possibilities are unlimited.
The battlefield
To bring the overall mission down to a manageable size, the competition will be divided into multiple stages, with the final competition requiring teams to achieve full search-and-rescue performance. In light of the beginning of the next millennium and to celebrate its tenth anniversary, the International Aerial Robotics Competition will hold the final competition, termed the Millennial Event, in 2000. The competition in 1998 was one of the two Qualifiers prior to the Millennial Event.
The 1998 Qualifier took place in August. The competition was held at the U.S. Department of Energy’s Hazardous Material Management and Emergency Response (HAMMER) facility in Richland, Washington. Subsequent competitions will also be held there. HAMMER offers many of the physical hazards that are suitable for the competition. Large-scale fires, obscuring smoke and shooting water jets are among those that can be expected for the final competition.
The Waterloo Aerial Robotics Group (WARG) was among the many competitors that participated in the 1998 Qualifier. Fourteen other teams from around the world competed. Among the teams were entries from schools such as the Massachusetts Institute of Technology (MIT), the Georgia Institute of Technology, the University of California (Berkeley), the University of British Columbia and Simon Fraser University.
WARG was established in 1997 at the University of Waterloo to build an entry for the Millennial Event. In its first competition, the 1998 Qualifier, WARG earned the second-highest score and also received the best Overall Innovation award. The team is expected to compete again in the 1999 Qualifier to be held later this year in June.
Because of the potential sudden appearance of dangerous obstacles, the flight control for the air vehicle will have to be quite dynamic; that is, abrupt changes in the flight path might be required when, say, a jet of fire appears in front of the helicopter. As such, a stable and robust control scheme will be essential to ensure flight safety. Similarly, the sensor system for detecting obstacles is also critical to the survival of the air vehicle. Therefore, our focus is on developing the control and vision systems for the competition.
Our choice of air vehicle is a JR-Propo model helicopter. The helicopter is equipped with a Pentium 200 Single Board Computer (SBC) from Technoland for processing power, a GPS unit from NovAtel for positioning data, a PC/104 capture card from InSync Technologies for image data, a 3DM solid-state Roll-Pitch-Yaw sensor from MicroStrain for orientation data, and a radio modem from FreeWave. The SBC has 64 MB of RAM and a PC/104 expansion bus. A PC/104 flash drive and serial expansion board (from ConnectTech Inc.) will be added to the SBC for extended functionality. The PC/104 capture card will be used together with two Sony cameras to provide stereoscopic image data. The NovAtel GPS, the cameras and the 3DM sensor together collect all the sensory data for the helicopter. The SBC onboard the helicopter will communicate back to a base station computer (via the radio modem) for status updates and system monitoring. The base station computer is a Pentium laptop. Both systems run QNX 4, and communication between them is via QNX message passing.
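To give a flavour of how sensor data reaches the software, the following minimal sketch reads the orientation sensor over a serial port. The device name, baud rate and packet layout are assumptions made purely for illustration; they are not the actual 3DM wire protocol or our flight code.

    /* Minimal sketch: polling an orientation sensor over a serial port under */
    /* QNX 4. The device name, baud rate and 6-byte packet layout are assumed */
    /* for illustration; the real 3DM protocol differs.                       */
    #include <fcntl.h>
    #include <termios.h>
    #include <unistd.h>
    #include <stdio.h>

    int main(void)
    {
        struct termios t;
        unsigned char pkt[6];

        int fd = open("/dev/ser2", O_RDWR);       /* assumed serial port */
        if (fd < 0) { perror("open"); return 1; }

        tcgetattr(fd, &t);
        cfsetispeed(&t, B38400);                  /* assumed baud rate */
        cfsetospeed(&t, B38400);
        t.c_cflag |= (CLOCAL | CREAD);
        tcsetattr(fd, TCSANOW, &t);

        while (read(fd, pkt, sizeof(pkt)) == sizeof(pkt)) {
            /* hypothetical packet: three big-endian 16-bit angles */
            int roll  = (pkt[0] << 8) | pkt[1];
            int pitch = (pkt[2] << 8) | pkt[3];
            int yaw   = (pkt[4] << 8) | pkt[5];
            printf("roll=%d pitch=%d yaw=%d\n", roll, pitch, yaw);
        }
        close(fd);
        return 0;
    }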
Early on, we decided to perform all processing tasks onboard the helicopter in order to avoid delays in the helicopter's response time. Tasks on the helicopter are divided into subsystems. An artificial intelligence (AI) system is responsible for high-level decision making such as navigation. The AI system makes use of data collected by the various sensor systems (such as vision) in compiling its decisions. As well, the AI system steers the course of the helicopter by sending commands to the control system. The vision system processes image data collected by the cameras and provides information such as the potential locations of survivors. The control system is responsible for maintaining in-flight stability of the helicopter. It accepts flight commands in the form of a desired position for the helicopter. Once a new command is received, the control system generates the control signals needed to bring the helicopter to the specified position and outputs those signals to the servos.
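As a rough sketch of what one control cycle involves (the structure layouts, gains and function names below are our own illustrative assumptions rather than the actual flight controller), the control system compares the commanded position against the latest state estimate and adjusts the servo outputs accordingly:

    /* Sketch of a single control cycle: desired position in, servo command   */
    /* out. Struct layouts, gains and the simple PD law are assumptions for   */
    /* illustration; the real controller must handle attitude coupling, trim  */
    /* and actuator limits.                                                    */
    typedef struct { double x, y, z, heading; } position_t;

    typedef struct {
        double collective;   /* main rotor collective pitch */
        double cyclic_lat;   /* lateral cyclic              */
        double cyclic_lon;   /* longitudinal cyclic         */
        double tail;         /* tail rotor pitch            */
    } servo_cmd_t;

    #define KP 0.8           /* proportional gain (assumed) */
    #define KD 0.3           /* derivative gain (assumed)   */

    void control_step(const position_t *desired,
                      const position_t *measured,
                      const position_t *rates,     /* measured velocities */
                      servo_cmd_t      *out)
    {
        out->collective = KP * (desired->z - measured->z) - KD * rates->z;
        out->cyclic_lat = KP * (desired->y - measured->y) - KD * rates->y;
        out->cyclic_lon = KP * (desired->x - measured->x) - KD * rates->x;
        out->tail       = KP * (desired->heading - measured->heading);
    }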
The AI system will also report to the base station computer the findings it has acquired from the field. In addition to monitoring, the base station computer serves as a backup computation engine. For example, images that contain potential objects of interest (such as survivors) may be sent to the base station for further processing to confirm (or reject) the findings.
QNX comes to the rescue!
The foremost requirement for a successful entry is the stability of the software platform: we cannot afford to have the operating system (OS) crash in the middle of a flight. Further, to react quickly to unexpected events and maneuver away from danger, the controller must run in a relatively fast and periodic fashion. This implies that a Real-Time Operating System (RTOS) is required. Real-time operating systems are often characterized by size and speed, with good ones being small and fast. Another, often ignored, characteristic of a good RTOS is a guaranteed worst-case time to service. This last characteristic is the most important in our situation. For example, we must be able to switch to the control process on time to calculate the control signals that keep the helicopter from crashing. By the same token, an RTOS is also necessary for capturing and processing the visual data used in detecting obstacles. Thus, the selection of both the software and the hardware to be used onboard the air vehicle is of equal importance.
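To make the point concrete, here is a simplified sketch of a fixed-rate control task. The 20 ms period, the helper function names and the use of delay() to pace the loop are assumptions for illustration; in practice a periodic timer and the scheduler's latency guarantees do the real work.

    /* Sketch of a fixed-rate control task. The 50 Hz rate and helper names   */
    /* are assumptions; a real implementation would be driven by a periodic   */
    /* timer rather than a simple delay.                                      */
    #include <unistd.h>          /* delay() on QNX 4 sleeps for n ms */

    #define PERIOD_MS 20         /* assumed 50 Hz control rate */

    extern void read_sensors(void);               /* GPS, 3DM, vision (assumed) */
    extern void compute_and_output_controls(void);

    void control_task(void)
    {
        for (;;) {
            read_sensors();
            compute_and_output_controls();

            /* The RTOS must get us back on the CPU before the next cycle is */
            /* due; a missed deadline here can translate into a missed       */
            /* obstacle and a crashed helicopter.                            */
            delay(PERIOD_MS);
        }
    }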
Where does QNX fit into this picture, and what made QNX the best choice? Being an RTOS, QNX is a natural candidate for our project. Beyond that, several key features of QNX made it by far the best choice: its self-hosted environment, its scalability and, most importantly, its inherent distributed nature. Each of these features made QNX attractive, and even critical to the success of our project.
Having a self-hosted environment allows group members not only to work at home on their personal computers but also to prototype solutions in an interactive environment. This interactive environment allows for rapid development, since testing and debugging can occur on desktop PCs. It also allows more people on the project to be productively writing code at one time than would be possible if they had to run and test the code on the real embedded systems.
QNX’s scalability, coupled with the self-hosted environment, gives our project a single operating system that can be used in the field. Ranging from our laptop computers to the embedded Pentium SBCs and, in the future, to smaller 386 PC/104 form factor computers, QNX can run as the OS for all of them. Thanks to QNX’s microkernel design and the flexibility that QSSL provides to developers, we can pare our system down to the bare minimum required to run. This means that we don’t need any physical drives: we can rely completely on flash-based devices and RAM drives for permanent and temporary data storage. In a ruggedized system such as ours, having solid-state components is important. For instance, a standard hard drive would not function correctly on our helicopter because of the strong vibration. At the other end of the spectrum, we are able to scale QNX up into a full-blown general-purpose OS for our ground station laptops.
Lastly, and most importantly, QNX’s inherent distributed nature plays a critical role in our system. By design, QNX provides a seamless distributed system, using the same form of inter-process communication (IPC) that it uses on a single machine to allow processes across many nodes (machines on a network) to communicate. This is a very powerful concept. In our design, each robot in the system is a QNX network node. This allows the AI systems on each individual robot and the AI system on the ground station to communicate as if they were all part of the same computer. An example of how we exploit this functionality is our Configuration Server. This server process runs on our ground station laptop and provides a central source from which all the robots query their configuration and profiles at boot time. This lets us use the hard drive of the laptop as the central storage source. With this setup, simple configuration changes in the field are a matter of editing a file on the base station laptop and restarting the robot – no firmware changes! Besides providing an easy way for processes to communicate across a network, QNX’s distributed nature goes even deeper: all resources on all nodes – files, devices and so on – can be accessed from any node on the network. This is very useful while testing in the field. If we are having a problem with a sensor on a robot, we can simply open the data port on that robot remotely from our base computer and see what the data looks like, as if the data port were actually attached to the base station!
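As a concrete illustration of both points, the sketch below shows a robot locating the Configuration Server by name over the network and sending it a request, and then opening a device that physically lives on another node. The node numbers, registered name, message structures and configuration key are assumptions made for the example, and the QNX 4 calls are shown in simplified form.

    /* Sketch: querying the Configuration Server on the base station (node 1) */
    /* and opening a serial port that lives on another node. The name, node   */
    /* numbers, structs and key are illustrative assumptions.                 */
    #include <sys/kernel.h>      /* Send()            */
    #include <sys/name.h>        /* qnx_name_locate() */
    #include <fcntl.h>
    #include <unistd.h>
    #include <stdio.h>

    struct cfg_request { char key[32];   };
    struct cfg_reply   { char value[64]; };

    int main(void)
    {
        struct cfg_request req = { "vision.camera_baseline" };  /* assumed key */
        struct cfg_reply   rep;

        /* find the server registered (hypothetically) as "warg/config" on node 1 */
        pid_t server = qnx_name_locate(1, "warg/config", sizeof(rep), NULL);
        if (server == -1) { perror("qnx_name_locate"); return 1; }

        /* the same Send/Receive/Reply IPC works whether the server is local or remote */
        if (Send(server, &req, &rep, sizeof(req), sizeof(rep)) == -1) {
            perror("Send");
            return 1;
        }
        printf("config value: %s\n", rep.value);

        /* remote resources look local too: the data port on the robot at node 2 */
        /* can be opened from the base station through its network pathname      */
        int fd = open("//2/dev/ser1", O_RDONLY);
        if (fd != -1) close(fd);
        return 0;
    }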
Looking into the future
As mentioned, the WARG team earned a second-place standing after the first Qualifier in 1998. Our success relied on an innovative division-of-labour design and on our choice of a stable software platform for development. QNX’s flexibility has sped up our design process significantly and increased the efficiency and reliability of our system. We look forward to competing again in the 1999 Qualifier with QNX on our side once more.
International Aerial Robotics Competition Web Site:
Filed under: WARG News