Networked UAV Command, Control and

Communication

Jack Elston∗, Eric Frew†, Brian Argrow‡

University of Colorado, Boulder, CO, 80309, USA

The networked UAV command, control, and communication project has designed, implemented, and tested a network-centric, intelligent flight management system for UAVs. The system utilizes an existing ad-hoc network to demonstrate the ability of UAVs to make mission-level decisions autonomously based upon network metrics and operator-specified parameters. Each UAV features a highly modular hardware architecture which provides a standardized interface to the onboard systems and enables system scaling. A high-level bus protocol is used on this interface to transport messages between the avionics package, network node, supervisory computer, and scientific payload. All decisions are made autonomously, but an operator maintains control of the mission parameters and may modify them through the ad-hoc network using a mobile monitoring station. The station features a GUI that displays feedback on node status and provides a visualization of any science data being collected. The collective system was tested and verified utilizing both a hardware-in-the-loop simulation and an experiment at an outdoor range.

I. Introduction

The ability to collect and transport data across large, mobile networks is central to many humanitarian and scientific efforts. In emergency situations, where power is lost or extra network capacity is needed, deployment of a mobile communications support network is invaluable.1 In wildland fire fighting situations, where both incident commanders and fire fighters need real-time situational data, a deployable mobile network is imperative to ensure firefighter survival and minimal loss of property. A deeper understanding of tornado formation will lead to a direct improvement in advanced warning systems, saving many lives,2 and will benefit from mobile sensor networks that instantly relay data from sensors placed dangerously close to the storm to researchers. Mobile sensor networks that relay time-sensitive information will also aid investigation of the role of the Arctic as a bellwether for global climate change.3

Aerial platforms have been demonstrated to be ideal for maintaining these mobile networks.3 They enable deployment in areas impassable to other vehicles while maintaining the mobility necessary to provide coverage to highly dynamic or widely dispersed networks. A network-centric methodology is needed to provide the UAVs with the ability to autonomously position themselves for ideal connectivity and to coordinate with other UAVs to provide the best effective coverage for the given situation and mission parameters.

Several projects have made progress in effectively demonstrating control and coordination of miniature and small UAVs. One miniature UAV project4 has developed a system that can support several intelligent vehicles, and has demonstrated a method for real-time path planning along with cooperative aerial surveillance.5 This could potentially be expanded so that the UAVs support a mobile network and the applications mentioned above. A small UAV project6 has developed and tested a system that enables a group of vehicles to perform coordinated tasks by assigning the group a mission and allowing the autonomous systems to perform tasking and waypoint planning in real time. The inter-vehicle communications for this project are done through ground nodes, but by moving this functionality to the vehicles, this project could similarly support a mobile network.

∗Graduate Research Assistant, Department of Aerospace Engineering Sciences. Student Member.
†Assistant Professor, Department of Aerospace Engineering Sciences.
‡Associate Professor, Director, Research and Engineering Center for Unmanned Vehicles. Senior Member.


Figure 1. Onboard Architecture.

Rather than first developing aerial platforms and later adapting the network to meet the UAVs' control demands, the Networked UAV C3 methodology is to use an existing communications schema and adapt the UAVs to the network framework. In this manner, the problem of the construction and maintenance of a mobile network can be approached more effectively. Furthermore, this approach will facilitate expansion of the network and allow for future investigation of other network-based algorithms using both fixed and mobile platforms.

(a) Ground Deployed MNR    (b) Vehicle Deployed MNR    (c) UAV Deployed MNR

Figure 2. Network Nodes

The Ad-hoc UAV Ground Network (AUGNet) was developed at the University of Colorado, Boulder to investigate the performance of airborne mobile ad-hoc networks. The current AUGNet system allows for the connection of many mobile nodes into an ad-hoc network utilizing dynamic source routing and the 802.11b wireless protocol.7 Small nodes may be placed randomly throughout a range, and data may then be relayed to any chosen node on the network. Network topology may change, and in the case of deployment of a node on a UAV, the network topology may change at a significant rate. The network operates on IP-based addressing, and the transport layer can support both TCP and UDP transport schemes. It has been tested and benchmarked in its current configuration with static nodes and with both terrestrial and aerial mobile nodes (Figure 2). All of the nodes are constructed from COTS technology, keeping system cost down and enabling easy system upgrades and integration with other devices.

A small aerial platform (∼10 kg) was constructed which can support an onboard network node for approximately one hour.7 The platform contains a Piccolo avionics package from Cloudcap Technologies8 which enables the plane to be autonomously piloted around a given waypoint pattern, and to store several of these waypoint patterns onboard. The system has been tested and verified in several experiments conducted at an outdoor range.

This paper presents the implementation and testing of an advanced Communication, Command, and Control (C3) system for these existing small UAVs built upon the AUGNet mobile ad-hoc network. Integration of devices is provided through an onboard system that allows for intelligent exchange of information between the existing network and aircraft (Figure 1). This synthesis combines network metrics, vehicle status, and mission parameters to create an intelligent node that may perform data-centric tasks while remaining within specified mission parameters. Mission parameters are defined by an operator through a console operated from anywhere within the ad-hoc network. The console utilizes the network to provide real-time status for both the network and each particular node, along with providing overall situational awareness.

Figure 3. Experiment Block Diagram

II. Networked UAV C3

The goals of the Networked UAV C3 project are to provide the necessary command, control, and communications functions to a group of UAVs in order to maintain a purely autonomous flock in support of ground communications through an ad-hoc network. In order to accomplish these goals, a modular architecture has been implemented for the synthesis of the onboard systems needed in mission-level decision making. The architecture, pictured in Figure 3, employs a lightweight interface node that connects to each COTS component and provides an interface through a shared bus. Each node contains some intelligence and is responsible for initialization, data fetching, and any needed data manipulation for a particular device. By enforcing this paradigm, the bus traffic remains high level, enabling significant system scaling with no reprogramming of the existing nodes. System upgrades may also be performed with ease, so long as the high-level data being pushed to the bus remains unchanged. The current setup employs a Soekris single-board computer for 802.11b routing and communications, a Piccolo avionics package from Cloudcap Technologies,8 a flight computer for making abstracted mission-level decisions, and a simulated scientific payload.
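The per-node responsibilities described above (initialization, data fetching, and translation of device data into high-level bus messages) can be illustrated with a short firmware-style loop. The sketch below is purely illustrative: the Naiad firmware is not reproduced in this paper, so every function and type name here is a hypothetical placeholder rather than the actual API.

```c
/* Illustrative sketch only: the actual Naiad firmware is not published here,
 * so all names below are hypothetical placeholders, not the real API. */
#include <stdint.h>
#include <stdio.h>

/* High-level message as it might appear on the shared bus. */
typedef struct {
    uint16_t type;     /* message type, used for addressing/filtering */
    uint8_t  len;      /* payload length in bytes */
    uint8_t  data[8];  /* CAN-sized payload */
} bus_msg_t;

/* Stub device hooks: a real node would talk to its attached COTS component. */
static void device_init(void) { /* configure the attached device */ }

static int device_read(bus_msg_t *out)   /* fetch raw data, convert to a high-level message */
{
    out->type = 0x120;  /* e.g., a sensor report (hypothetical type ID) */
    out->len  = 2;
    out->data[0] = 42;  /* placeholder reading */
    out->data[1] = 0;
    return 1;
}

static void bus_publish(const bus_msg_t *m) { printf("publish type 0x%03X\n", (unsigned)m->type); }
static int  bus_poll(bus_msg_t *in)         { (void)in; return 0; }  /* no command pending */
static void device_handle(const bus_msg_t *in) { (void)in; /* act on a received command */ }

int main(void)
{
    bus_msg_t msg;
    device_init();                       /* the node owns initialization of its device */

    for (int i = 0; i < 3; ++i) {        /* a real node would loop forever */
        if (device_read(&msg))
            bus_publish(&msg);           /* only high-level messages reach the shared bus */
        if (bus_poll(&msg))
            device_handle(&msg);         /* commands arrive pre-filtered by message type */
    }
    return 0;
}
```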

A. Naiad Node

The particular interface nodes used in the current implementation are part of the Naiad system, which was developed for use in other aerospace projects.9 The Naiad node provides distributed computing based upon the Atmel ATmega128 microcontroller. These full-featured microcontrollers feature six channels of PWM output, several general-purpose I/O lines, external interrupts, eight 10-bit A/D converters, and a suite of bus interfaces including UART, I2C, and SPI. By utilizing one of these communication methods, the node can be interfaced to a large number of COTS components with very little need for additional hardware. Each node is interconnected through the fault-tolerant, high-speed Controller Area Network (CAN) serial bus.

The CAN bus is ideal for this application due to its fault tolerance and message-type-based addressing protocol. CAN supports data exchange rates up to 1 Mbps and uses a differential voltage signal to make the bus much less susceptible to noise. Coupling this with the optional fault-tolerant transceiver provides a system that guarantees transmission so long as the destination node is connected to the bus. Furthermore, each message is broadcast to all nodes in the system, freeing the transmitting node from verifying reception by a particular client. To enable this, messages are addressed using a type rather than a destination. Each receiving node may set a hardware filter to allow only the message types that it is interested in to be placed in its queue. By enabling particular subsystems to transmit and receive a set of message types, the system may be scaled without having to add any further software to any of the existing system nodes. Furthermore, redundant systems may be placed on the bus without creating contention.
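Type-based addressing can be made concrete with a small example. A CAN controller's acceptance filter compares the masked identifier of an incoming frame against a masked filter value; the identifiers and the particular filter/mask pair below are assumptions chosen only for illustration, not the message set actually used on the Naiad bus.

```c
/* Sketch of type-based CAN addressing with hypothetical type IDs; the
 * acceptance test mirrors how a CAN controller's mask/filter pair works. */
#include <stdint.h>
#include <stdio.h>

/* Hypothetical message-type identifiers (11-bit standard CAN IDs). */
enum {
    MSG_GPS_POSITION  = 0x110,
    MSG_SENSOR_REPORT = 0x120,
    MSG_WAYPOINT_CMD  = 0x210,
};

/* A node accepts a frame when the masked ID matches its masked filter. */
static int filter_accepts(uint16_t id, uint16_t filter, uint16_t mask)
{
    return (id & mask) == (filter & mask);
}

int main(void)
{
    /* Example: a node that listens to all 0x1xx telemetry types. */
    const uint16_t filter = 0x100, mask = 0x700;
    const uint16_t incoming[] = { MSG_GPS_POSITION, MSG_SENSOR_REPORT, MSG_WAYPOINT_CMD };

    for (unsigned i = 0; i < sizeof incoming / sizeof incoming[0]; ++i)
        printf("ID 0x%03X %s\n", (unsigned)incoming[i],
               filter_accepts(incoming[i], filter, mask) ? "accepted" : "ignored");
    return 0;
}
```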

Figure 4. Onboard Naiad Interface Nodes and Thalassa Sensor Payload

1. Thalassa Node

The addition of a simulated scientific node allows the system to demonstrate the downlink of real-time experimental data that can be used to reconstruct mission parameters or provide limits to the onboard flight computer. The particular node used in the UAV C3 system, named Thalassa, is an expanded Naiad node with integrated temperature, pressure, and humidity sensors. The Naiad core of the Thalassa provides communications across the CAN bus and sensor data acquisition, along with the device interface firmware. This resulted in a significant savings in both development time and cost. The Thalassa board serves as an interface for all three sensors.
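As an example of the kind of high-level message the Thalassa node might place on the bus, the following sketch packs one temperature/pressure/humidity sample into a single 8-byte CAN payload. The message type, field ordering, and fixed-point scaling are assumptions made for illustration; the paper does not specify the actual message format.

```c
/* Hypothetical packing of one sensor sample into an 8-byte CAN payload;
 * the type ID, field layout, and scale factors are illustrative assumptions. */
#include <stdint.h>
#include <string.h>
#include <stdio.h>

typedef struct {
    uint16_t type;
    uint8_t  len;
    uint8_t  data[8];
} bus_msg_t;

static bus_msg_t pack_sensor_report(float temp_f, float press_hpa, float rh_pct)
{
    bus_msg_t m = { .type = 0x120, .len = 6, .data = {0} };
    int16_t  t = (int16_t)(temp_f * 10.0f);     /* 0.1 degF resolution (assumed) */
    uint16_t p = (uint16_t)(press_hpa * 10.0f); /* 0.1 hPa resolution (assumed) */
    uint16_t h = (uint16_t)(rh_pct * 2.0f);     /* 0.5 %RH resolution (assumed) */
    memcpy(&m.data[0], &t, sizeof t);
    memcpy(&m.data[2], &p, sizeof p);
    memcpy(&m.data[4], &h, sizeof h);
    return m;
}

int main(void)
{
    bus_msg_t m = pack_sensor_report(72.4f, 1013.2f, 34.5f);
    printf("sensor report: type 0x%03X, %u payload bytes\n", (unsigned)m.type, (unsigned)m.len);
    return 0;
}
```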

Naiad and Thalassa nodes are shown with a battery pack in Figure 4. Pictured is the mounting configuration for the flight experiment.

2. Low-level Drivers and Device Interface

The interface to the Piccolo autopilot is provided through libraries from Cloudcap Technologies that have been modified to run within the processing and memory constraints of a Naiad node. Some functionality, such as message logging and queuing, was removed; since the system is assumed to operate with pseudo real-time scheduling, the caching of messages is not important. Currently, communications to the Piccolo unit are limited to requesting a waypoint change, requesting a turn rate, uploading a set of waypoints, reading GPS data, checking battery and communication states, and reading the current destination waypoint. The communication node does, however, provide some error handling support and will verify that uploaded waypoint sets are properly stored, and that any waypoint change requests are received and acted upon by the Piccolo unit.
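The verify-and-retry behavior described above might look like the following. The Piccolo communications library is proprietary, so piccolo_send_waypoint and piccolo_read_waypoint are hypothetical stand-ins (stubbed here so the sketch runs); the point is only the pattern of writing a waypoint set and reading it back for confirmation.

```c
/* Sketch of the write-then-verify pattern for waypoint uploads; the piccolo_*
 * calls are hypothetical placeholders, not the real Cloudcap library API. */
#include <stdint.h>
#include <string.h>
#include <stdio.h>

typedef struct { float lat, lon, alt; } waypoint_t;

/* Stubbed autopilot interface: a real node would call the vendor library. */
static waypoint_t stored[16];
static int piccolo_send_waypoint(uint8_t idx, const waypoint_t *wp) { stored[idx] = *wp; return 0; }
static int piccolo_read_waypoint(uint8_t idx, waypoint_t *wp)       { *wp = stored[idx]; return 0; }

/* Upload a waypoint set, read each entry back, and retry any mismatch. */
static int upload_and_verify(const waypoint_t *plan, uint8_t count, int max_retries)
{
    for (uint8_t i = 0; i < count; ++i) {
        int ok = 0;
        for (int attempt = 0; attempt <= max_retries && !ok; ++attempt) {
            waypoint_t readback;
            if (piccolo_send_waypoint(i, &plan[i]) == 0 &&
                piccolo_read_waypoint(i, &readback) == 0 &&
                memcmp(&readback, &plan[i], sizeof readback) == 0)
                ok = 1;
        }
        if (!ok)
            return -1;   /* report failure so the flight computer can react */
    }
    return 0;
}

int main(void)
{
    const waypoint_t plan[] = { {40.01f, -105.25f, 1800.0f}, {40.02f, -105.24f, 1800.0f} };
    printf("upload %s\n", upload_and_verify(plan, 2, 3) == 0 ? "verified" : "failed");
    return 0;
}
```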

The interface to the Soekris 802.11b payload node was implemented using the same protocol as is used over the TCP/UDP connection from the user interface to the Soekris board. A ping packet was transmitted at a set rate from the Soekris node in order to verify communications and to provide statistics about the serial link. Software on the Naiad was able to decode the packets, verify checksums, and either transmit the data over the CAN bus or respond appropriately over the serial link.
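The framing used on the Soekris-Naiad serial link is not detailed in the paper, so the layout below (a sync byte, type, length, payload, and trailing 16-bit additive checksum) is a hypothetical example used only to show the decode-and-verify step described above.

```c
/* Hypothetical serial framing: sync byte, type, length, payload, 16-bit sum.
 * The real Soekris-Naiad protocol is not specified here; this only shows the
 * checksum-verification step described in the text. */
#include <stdint.h>
#include <stdio.h>

static uint16_t additive_checksum(const uint8_t *buf, int len)
{
    uint16_t sum = 0;
    for (int i = 0; i < len; ++i)
        sum = (uint16_t)(sum + buf[i]);
    return sum;
}

/* Returns 1 if the framed packet's trailing checksum matches its contents. */
static int packet_valid(const uint8_t *pkt, int len)
{
    if (len < 4 || pkt[0] != 0xA5)                  /* sync byte (assumed) */
        return 0;
    uint16_t expected = (uint16_t)(pkt[len - 2] | (pkt[len - 1] << 8));
    return additive_checksum(pkt, len - 2) == expected;
}

int main(void)
{
    /* sync, type=ping, len=2, payload, checksum (little-endian) */
    uint8_t ping[] = { 0xA5, 0x01, 0x02, 0x12, 0x34, 0x00, 0x00 };
    uint16_t sum = additive_checksum(ping, 5);
    ping[5] = (uint8_t)(sum & 0xFF);
    ping[6] = (uint8_t)(sum >> 8);

    /* A valid ping would be forwarded to the CAN bus or answered over serial. */
    printf("ping packet %s\n", packet_valid(ping, sizeof ping) ? "valid" : "rejected");
    return 0;
}
```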

B. Remote Monitoring Station

A user interface to network node status and control, utilizing the ad-hoc network, was created. Given that each node communicates with its neighbors periodically in order to maintain a routing table, building the GUI on top of the network allows the system to provide users with statistics on a per-node basis. Communications with a particular node for data requests and control are done through network sockets. Both UDP and TCP protocols are supported, and the system performs its own packetizing and error handling in a layer above the TCP or UDP packetizing. All of these communications are logged and can later be used to verify experiments and extract interesting relationships between the data.
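A minimal sketch of how the monitoring station might request status from a particular node over a UDP socket, with a receive timeout so an unreachable node does not stall the interface, is shown below. The port number, request byte, node address, and reply format are assumptions; the actual packet layer used by the monitoring station is not given in the paper.

```c
/* Minimal sketch (POSIX sockets) of a status request to a node over UDP with a
 * receive timeout; port, request byte, address, and reply format are assumed. */
#include <arpa/inet.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void)
{
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) { perror("socket"); return 1; }

    /* Give up on a reply after 2 seconds so a down node does not hang the GUI. */
    struct timeval tv = { .tv_sec = 2, .tv_usec = 0 };
    setsockopt(sock, SOL_SOCKET, SO_RCVTIMEO, &tv, sizeof tv);

    struct sockaddr_in node;
    memset(&node, 0, sizeof node);
    node.sin_family = AF_INET;
    node.sin_port   = htons(5000);                      /* assumed node port */
    inet_pton(AF_INET, "192.168.1.20", &node.sin_addr); /* assumed node address */

    uint8_t request = 0x01;                             /* hypothetical "send status" */
    sendto(sock, &request, 1, 0, (struct sockaddr *)&node, sizeof node);

    uint8_t reply[256];
    ssize_t n = recvfrom(sock, reply, sizeof reply, 0, NULL, NULL);
    if (n < 0)
        printf("node did not answer within the timeout\n");
    else
        printf("received %zd status bytes\n", n);

    close(sock);
    return 0;
}
```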

The GUI is divided into several windows, each representing a given context of the system. An overview of the network status is presented to the user in the main window. From this, the operator can quickly deduce the number of nodes connected to the network, their current locations, node types (fixed, mobile, UAV), and some simple node context information. Node context information currently consists of the destination waypoint for UAV-type nodes, and node ID and altitude for all nodes. Furthermore, the flight plan for the UAV node is displayed on top of a geo-referenced TIFF satellite image of the range.

A secondary window exists for each node in the system and provides the operator with a command, control, and communications interface particular to the node type. In the case of the UAV node, this interface is not intended to replace the 900 MHz link and operator interface for the Piccolo unit.8 Instead, it provides control over only those parameters affecting mission-level objectives. For the current experiment, the starting waypoints for the default flight pattern and the pattern to be flown for the experiment can be specified. Furthermore, limits are placed on sensor measurements that will warrant a termination of the experiment and a return to the default flight pattern. Also available in the secondary window is an interface to data being collected by the node. This includes various system health and status variables, round-trip ping times, and any data being collected by sensors connected to the node.
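The mission-level parameters exposed through this window can be thought of as a small command message sent from the monitoring station to the flight computer. The structure below is a hypothetical illustration; the paper does not define an actual wire format beyond the pattern selection, temperature limit, and communications timeout described above, and the example values are the ones used in this experiment.

```c
/* Hypothetical mission-parameter command from the monitoring station to the
 * flight computer; the field set mirrors the limits described in the text,
 * but the exact message format is an assumption. */
#include <stdint.h>
#include <stdio.h>

typedef struct {
    uint8_t  default_pattern_start;     /* first waypoint of the "return to base" plan */
    uint8_t  experiment_pattern_start;  /* first waypoint of the experiment plan */
    int16_t  temp_limit_f;              /* end experiment below this temperature [degF] */
    uint16_t comms_timeout_s;           /* end experiment after this long without a ping [s] */
} mission_params_t;

int main(void)
{
    /* Values matching the experiment described in this paper. */
    mission_params_t p = { .default_pattern_start = 2,
                           .experiment_pattern_start = 10,
                           .temp_limit_f = 40,
                           .comms_timeout_s = 40 };
    printf("temp limit %d degF, comms timeout %u s\n",
           (int)p.temp_limit_f, (unsigned)p.comms_timeout_s);
    return 0;
}
```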

C. Hardware in the Loop Simulation

Initial testing of the system was done through a hardware-in-the-loop (HIL) simulation performed in the lab. By employing the HIL simulation developed by Cloudcap8 and placing wireless nodes in the lab, a scenario could be constructed that fully tested the ability of the aircraft to perform its autonomous tasks. Any problems with the onboard system, or with communication between the aircraft and the remote monitoring station over the ad-hoc network, were resolved before field deployment.

III. Experimental Results

The system has been fully verified utilizing ground and UAV nodes at an outdoor test range. An experiment was executed that demonstrated the UAV's ability to make mission-level decisions based upon communications status and sensor measurements.

A. Experimental Setup

The experimental setup is given in Figure 3, which provides a detailed component view of the many systems involved in the experiment. The aircraft setup, as previously discussed, is composed of a Soekris SBC for 802.11b packet routing and communications to and from the plane, a Piccolo avionics package, a flight computer, a scientific payload, and interface nodes to tie the systems together. A ground station to support Piccolo operations was based at the airfield and was composed of the Piccolo ground station, a pilot console for manual piloting, and a laptop connected to the ground station serial interface to allow an operator to command changes to the Piccolo system. This ground station was used for takeoff and landing (since these are performed manually); once the UAV was placed under autonomous control, it was maintained only to provide a failsafe in case a problem was encountered during the experiment. A separate laptop was used as the remote monitoring station to provide an operator with status and control for the various nodes on the ad-hoc network. All of the network packets, along with a periodic status packet sent by each node, were transmitted to a gateway which allowed for transport to an off-site database to be used in analysis.


Table 1. Experimental Test Plan

1. The UAV is manually piloted for takeoff, and correct behavior of the Piccolo unit is verified.
2. Ares is commanded into autonomous mode and flies flight plan 1, which is preloaded.
3. A "start experiment" command is sent from the remote monitoring station to the UAV over the ad-hoc network.
4. The UAV transitions into flight plan 2, where it sends a sensor report every second consisting of temperature, pressure, and humidity data.
5. The UAV maintains flight plan 2 until one of the following conditions is met:
   a. The temperature probe records a "simulated" temperature below 40 degrees Fahrenheit (potential icing).
   b. The communication link between the RMS and UAV has been down for more than 40 sec.
6. Ares transitions back into flight plan 1.

Figure 5. Experiment Test Range and Flight Patterns


Two network nodes, in addition to the node on the plane, were located in the field to provide routing between the various components on the network.

The experimental procedure presents minimal complexity and fully demonstrates the capabilities of the remote monitoring station and the onboard systems in the aircraft. A brief procedural outline is given in Table 1.

A visual representation of the experimental plan is shown in Figure 5. From this diagram it is evident that flight plan 1 is a "return to base" pattern, while flight plan 2 represents a much larger track that would be flown to gather experimental data.

The given mission parameters for mandating a "return to base" were chosen to demonstrate the ability of the plane to conduct a mission based upon any combination of network metrics, scientific data, or aircraft status. Ambient temperature was chosen as a constraint because it represents both an experimental constraint (sensor measurements might not be desirable below a certain temperature) and a platform constraint (problems with wing icing occur below a certain temperature). Communication with the network monitoring station was also chosen because it allows the plane to react to a metric derived from the communications network.

Each of these parameters was simulated: it was not desirable to actually lose communications with the monitoring station (mainly because experimental data were being recorded through it), and the temperature in a large outdoor environment cannot be predicted or changed. "Communications" with the monitoring station was defined as the reception of a ping packet through the ad-hoc network. The operator at the station could stop and start the ping packet transmission at will and thus induce the plane to return to base should the ping packet fail to be received over a set period of time. Temperature was recorded from the sensor on the Thalassa node, but once the experiment was started, the Thalassa subtracted one degree from the reading for each second that passed before transmitting the value on the CAN bus. In this manner a temperature drop could be simulated that would eventually reach the temperature limit for the experiment specified by the operator.
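The onboard decision logic amounts to checking the two trigger conditions each second against the operator-specified limits, with the simulated one-degree-per-second temperature decrement applied to the reported reading. The sketch below captures that logic; the surrounding structure and names are hypothetical, while the limits (40 degF and 40 s) are the values used in this experiment.

```c
/* Sketch of the mission-level trigger logic described above: return to base
 * when the (simulated) temperature falls below the limit or when no ping has
 * been received for the timeout period. Names are hypothetical placeholders. */
#include <stdio.h>

#define TEMP_LIMIT_F     40.0   /* operator-specified temperature limit [degF] */
#define COMMS_TIMEOUT_S  40.0   /* operator-specified ping timeout [s] */

int main(void)
{
    double actual_temp_f = 72.0;   /* reading from the Thalassa sensor */
    double last_ping_s   = 0.0;    /* time of the most recent ping from the station */
    int    experimenting = 1;      /* flight plan 2 is active */

    for (double t = 1.0; experimenting && t < 120.0; t += 1.0) {
        /* Simulated drop: one degree subtracted per second since experiment start. */
        double reported_temp_f = actual_temp_f - t;

        int temp_trigger  = reported_temp_f < TEMP_LIMIT_F;
        int comms_trigger = (t - last_ping_s) > COMMS_TIMEOUT_S;

        if (temp_trigger || comms_trigger) {
            printf("t=%.0f s: trigger (%s), commanding return to flight plan 1\n",
                   t, temp_trigger ? "temperature" : "comms timeout");
            experimenting = 0;     /* a real node would request the waypoint change here */
        }
    }
    return 0;
}
```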

B. Results

Figure 6. Experimental Results: a. Destination waypoint number, b. Recorded onboard temperature, c. Recorded round-trip ping times from the plane to the monitoring station.

Results were obtained from a 50 minute flight of the UAV, which was autonomously piloted for 30 of those minutes. Most of the other 20 minutes were used to verify correct Piccolo operations and communications between all of the nodes in the network. During the experiment all of the messages between the network monitoring station and the UAV were recorded, and the network metrics were backhauled to the off-site database and stored for later analysis. The scientific data measured by the Thalassa node were downlinked to the network monitor at a frequency of 2 Hz.

Figure 6 shows the primary results of the experiment. All of the graphs depict a value versus time in minutes since the plane began communicating with the network monitoring station. The top graph shows the destination waypoint for the UAV. The waypoint plans (as can be seen in Figure 5) consist of waypoints 2-7 for flight plan 1 and waypoints 10-15 for flight plan 2. Any transition between these two flight plans is made evident by a change in the range of the destination waypoints. The second graph depicts the temperature recorded (and simulated) by the Thalassa node during the experiment. Interesting points include the linear transition from the current temperature toward 32 degrees Fahrenheit as the Thalassa subtracts from the actual temperature following a start-of-experiment command, and the transition back to the actual temperature following a second start-of-experiment command. The bottom graph shows ping times in milliseconds for the round-trip transmission of a packet from the monitoring station to the plane and back.

By analyzing these graphs, it can be shown that the plane was able to autonomously make decisions based upon parameters set by the operator over the 802.11b ad-hoc link. For both experiments, the temperature limit was set to 40 degrees, as marked by the horizontal dashed line in the second graph. The intersection of the reported temperature and the limit is identified by the left-most dashed red line, which is carried through the other two graphs. In the second experiment, a communications timeout between the plane and the monitoring station was set to 40 seconds. The absence of the ping packets transmitted between the plane and monitoring station (simulating a communications dropout) for 40 seconds is marked by the right-most vertical dashed line, which is carried through the other two graphs. At both places where an experiment is being performed (the plane is tracking waypoints from the second pattern) and a limit is reached, an automatic transition to the "return to base" waypoint pattern can be seen. Furthermore, by looking at the second and third graphs, it can be shown that each type of trigger was acted upon, since only one trigger is reached per experiment.

Figure 7. Humidity Microclimate Interpolated from Flight Data

Following postprocessing of the data, a further interesting result was encountered. The scientific data recorded from the Thalassa node included relative humidity. By graphing the humidity versus the GPS position of the plane, it can be seen that a spatially consistent change of about 2% is present over the test range. Interpolation between the data points using the Kriging technique reveals a humidity contour as shown in Figure 7.
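Kriging fits a variogram model to the samples and solves a linear system per grid point, which is beyond the scope of a short sketch; the simpler inverse-distance-weighting example below, offered as a stand-in rather than the method actually used, illustrates the same idea of turning scattered (position, humidity) samples into a gridded field. The sample points and grid here are synthetic.

```c
/* Inverse-distance weighting over synthetic samples, shown as a simpler
 * stand-in for the Kriging interpolation used on the actual flight data. */
#include <math.h>
#include <stdio.h>

typedef struct { double x, y, rh; } sample_t;  /* local position [m], humidity [%] */

static double idw(const sample_t *s, int n, double x, double y)
{
    double num = 0.0, den = 0.0;
    for (int i = 0; i < n; ++i) {
        double d2 = (x - s[i].x) * (x - s[i].x) + (y - s[i].y) * (y - s[i].y);
        if (d2 < 1e-9)
            return s[i].rh;              /* query coincides with a sample */
        double w = 1.0 / d2;             /* inverse-square-distance weight */
        num += w * s[i].rh;
        den += w;
    }
    return num / den;
}

int main(void)
{
    /* Synthetic samples spanning ~2% humidity variation over the range. */
    const sample_t s[] = { {0, 0, 33.0}, {400, 0, 34.0}, {0, 400, 34.5}, {400, 400, 35.0} };
    const int n = (int)(sizeof s / sizeof s[0]);

    for (double y = 0; y <= 400; y += 200) {        /* coarse 3x3 grid */
        for (double x = 0; x <= 400; x += 200)
            printf("%5.2f ", idw(s, n, x, y));
        printf("\n");
    }
    return 0;
}
```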

IV. Conclusion

An onboard flight management system was constructed to provide a link between two existing systems: an ad-hoc 802.11b network and a UAV avionics package. This synthesis provides for an intelligent UAV platform which can make high-level mission decisions based upon network metrics and specified operating conditions. The modular nature of the onboard system allows for significant system scaling to be performed without requiring a large overhead. Furthermore, it enables the addition of several scientific payloads for taking onboard measurements that may be relayed in near real time to a network monitoring station. The system was fully verified through both a hardware-in-the-loop simulation and a simple experiment conducted at an outdoor range.


References

1. Meissner, A., Luckenbach, T., Risse, T., Kirste, T., and Kirchner, H., "Design Challenges for an Integrated Disaster Management Communication and Information System," Proc. IEEE Workshop on Disaster Recovery Networks (DIREN '02), New York City, New York, June 2002.

2. "VORTEX-2," http://www.vortex2.org, 2005.

3. Argrow, B., Lawrence, D., and Rasmussen, E., "UAV Systems for Sensor Dispersal, Telemetry, and Visualization in Hazardous Environments," Proc. of the 43rd AIAA Aerospace Sciences Meeting and Exhibit, Reno, Nevada, Jan. 2005.

4. Beard, R., Kingston, D., Quigley, M., Snyder, D., Christiansen, R., Johnson, W., McLain, T., and Goodrich, M. A., "Autonomous Vehicle Technologies for Small Fixed-Wing UAVs," Journal of Aerospace Computing, Information, and Communication, Vol. 2, 2005, pp. 92-108.

5. Beard, R., McLain, T., Nelson, D., and Kingston, D., "Decentralized Cooperative Aerial Surveillance using Fixed-Wing Miniature UAVs," IEEE Proceedings: Special Issue on Multi-Robot Systems (to appear), 2006.

6. How, J., King, E., and Kuwata, Y., "Flight Demonstrations of Cooperative Control for UAV Teams," Proc. AIAA 3rd Unmanned Unlimited Technical Conference, Workshop and Exhibit, Chicago, Illinois, Sept. 2004.

7. Brown, T. X., Doshi, S., Jadhav, S., and Himmelstein, J., "Test Bed for a Wireless Network on Small UAVs," 2004.

8. "The Cloudcap Website," http://cloudcaptech.com, 2005.

9. Elston, J., Argrow, B., and Frew, E., "A Distributed Avionics Package for Small UAVs," Infotech@Aerospace Technical Conference, AIAA, Arlington, VA, 2005.
