
Development of an Extendable Arm and Software Architecture for Autonomous and Tele-operated Control for Mobile Platforms

Yung-Sen Lia, Shawn Huntb, Cosmin Popovicia, Steven Waltera, Gary Witusb, R. Darin Ellisa,

Gregory Aunera, Alex Caoa, Abhilash Pandyaa

aDepartment of Electrical and Computer Engineering, Wayne State University, 5050 Anthony Wayne Dr., Detroit, MI USA 48202;

bTuring Associates, Inc., 1392 Honey Run Dr., Ann Arbor, MI USA 48103

ABSTRACT

There is a strong demand for efficient explosive-detecting devices and deployment methods in the field. In this study we present a prototype mast that uses a telescoping pulley system, mounted on top of an unmanned ground vehicle and controlled wirelessly. The mast and payload reach up to eight feet from the platform, with a gripper that can pick up objects. The operator of the current mobile platform uses a remote-control device to move the arm and the robot itself from a safe distance away. The mast is equipped with a pulley system that can also be used to extend a camera or explosive-detection sensor under a vehicle, and it is outfitted with sensors. A simple master-slave strategy will not be sufficient as the navigation and sensory inputs become more complex. In this paper we provide a tested software/hardware framework that allows a mobile platform and the extended arm to offload operator tasks to autonomous behaviors while maintaining tele-operation, thereby implementing semi-autonomous behaviors. This architecture involves a server which sends commands to, and receives sensor inputs from, the mobile platform via a wireless modem. The server can take requests from multiple client processes which have prioritized access to on-board sensor readings and can command the steering. The clients would include the tele-operation soldier unit and any number of autonomous behaviors linked to particular sensor information or triggered by the operator. For instance, certain tasks can be controlled by low-latency clients with sensory information to prevent collisions, place sensor pods precisely, return to preplanned positions, home the unit's location, or even perform image enhancement or object recognition on streamed video.

Keywords: omnidirectional drive, mobile robots, teleoperation, mast, wireless communications, ZigBee

1. INTRODUCTION

This paper presents interim results of a research project in the design and control of a mobile robot for vehicle inspection. The task context is vehicle inspection at traffic control checkpoints such as border crossings and security checkpoints. Vehicles entering at checkpoints range in size from compact cars to tanker and tractor-trailer trucks. The task elements are to approach the vehicle, to talk to the driver and passengers, to inspect the interior compartments to the extent visible from the exterior, to inspect the vehicle exterior (possibly using an “electronic nose” to detect contraband or other hazards), to view the license plate, and to withdraw from the vehicle. The robot system will operate under remote control (teleoperation) by a human inspector. The objective of the robotic system is to enable the operator to perform a comprehensive external inspection while remaining at a safe standoff distance.

The target platform was the Omni-Directional Inspection System (ODIS) [1]. Section 2.0 describes the electromechanical system design, including the ODIS base and the mast sub-system. Section 3.0 describes the communications and hardware that we tested. Section 4.0 describes the simulation of ODIS that has been started. Section 5.0 describes the TCP/IP socket interface that allows the simulation of ODIS to execute the same commands as the physical ODIS, or allows any computer on the same network to operate ODIS. Section 6.0 describes development and testing status and plans.

Unmanned Systems Technology X, edited by Grant R. Gerhart, Douglas W. Gage, Charles M. Shoemaker, Proc. of SPIE Vol. 6962, 69621U (2008) · 0277-786X/08/$18 · doi: 10.1117/12.778059



2. MAST DESIGN AND OPERATION

2.1 ODIS

ODIS is an omnidirectional platform capable of translating in any direction and rotating simultaneously. The basic ODIS platform carried a video camera with tilt actuation, and was originally designed for underbody inspection. ODIS’s omnidirectional drive is implemented by a three-wheel drive system, in which all wheels are capable of independent pivot and rotation. ODIS has low ground clearance, and was designed for relatively smooth, flat and level surfaces.
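The omnidirectional drive described above can be illustrated with a small kinematic sketch. The geometry below (three wheels spaced 120 degrees apart at an assumed mounting radius) is illustrative only, not ODIS’s actual dimensions: each independently steered wheel is pointed along, and driven at the magnitude of, the velocity its contact point must have to realize a commanded body velocity (vx, vy) and rotation rate ω.

```python
import math

def wheel_commands(vx, vy, omega, mount_radius=0.2):
    """Steering angle (rad) and drive speed (m/s) for three wheels
    placed 120 degrees apart at mount_radius from the body center
    (illustrative geometry, not ODIS's actual dimensions)."""
    commands = []
    for i in range(3):
        theta = 2 * math.pi * i / 3              # wheel mounting angle
        x = mount_radius * math.cos(theta)
        y = mount_radius * math.sin(theta)
        # velocity of the wheel contact point = body velocity + omega x r
        wvx = vx - omega * y
        wvy = vy + omega * x
        commands.append((math.atan2(wvy, wvx), math.hypot(wvx, wvy)))
    return commands
```

For pure translation all three wheels steer to the same angle and run at the same speed; for pure rotation each wheel points tangentially and runs at ω times the mounting radius, which is the simultaneous translate-and-rotate capability the platform exploits.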

For full-vehicle external inspection, a mast or arm subsystem to elevate a camera or other payload to a height of approximately eight feet is needed. In addition to the tilt-actuated sensor pod with the camera, “electronic nose” and ultrasonic range finder, we also chose to integrate a fine-motion manipulator arm and gripper at the top of the mast. In this project, we were constrained not to alter the code running on ODIS’s internal processor. This meant that we had to bypass ODIS’s communications and processing in order to integrate the additional degrees of freedom and sensor channels. The original ODIS OCU allowed only control of the three degrees of freedom of the base motion, and only supported analog video display.

2.2 Mast Design and Operation

We created a prototype mast that uses a telescoping pulley system. The mast is capable of folding down, as seen in Figure 1. It is driven by a 12 V window-lift motor with a worm gear, which powers movement of the pulley system. This gives the mast the capability of holding a payload steady at any height from about 1 foot to 8 feet. The pulley system can also potentially be used to extend a camera or sensors under a vehicle. At the top of the mast we integrated a fine-motion manipulator arm and gripper that can pick up objects. The arm includes five Hitec HS-422 servo motors: one for the base, two for the shoulder, and one each for the elbow and wrist; an HS-81 servo drives the gripper. Figure 1 also shows the mast half extended and fully extended, eight feet into the air.

We fabricated a thicker top plate for ODIS for use with the mast because the existing plate is thin and pliable. We also fabricated a steel box to protect and cover the motor, microcontroller and wiring. Our mast prototype is independent of ODIS and can be used on any mobile robot platform. We plan to have an RS-232 connection as the input and the mast will operate based on a messaging protocol we are writing.
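As a sketch of what such an RS-232 messaging protocol might look like (the frame layout and opcodes below are hypothetical, not the protocol the authors are writing), each command could be framed with a start byte, an opcode, a payload length, and a simple additive checksum so a host can drive the mast over any serial link:

```python
import struct

START = 0x7E  # hypothetical frame delimiter

def pack_frame(opcode, payload=b""):
    """Frame: start byte, opcode, payload length, payload, 1-byte checksum."""
    body = struct.pack("BBB", START, opcode, len(payload)) + payload
    checksum = sum(body) & 0xFF
    return body + bytes([checksum])

def unpack_frame(frame):
    """Return (opcode, payload); raise ValueError on a corrupt frame."""
    if len(frame) < 4 or frame[0] != START:
        raise ValueError("bad frame")
    if sum(frame[:-1]) & 0xFF != frame[-1]:
        raise ValueError("bad checksum")
    opcode, length = frame[1], frame[2]
    payload = frame[3:3 + length]
    if len(payload) != length:
        raise ValueError("truncated payload")
    return opcode, payload

# hypothetical opcodes for illustration only
MAST_EXTEND, MAST_RETRACT = 0x01, 0x02
```

The checksum lets the microcontroller silently drop frames corrupted on the serial line rather than act on a garbled mast command.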

Fig. 1. ODIS with Prototype Mast Retracted and Extended.


2.3 Control System

Our initial plan for controlling the mast was to use the FRC control system [2] from Innovation First, Inc. (IFI). The control system includes an operator interface and a robot controller; the operator interface takes inputs from the human operator and passes them to the robot controller. It requires 900 MHz radios for wireless communication. We thought we could use the 900 MHz data radio that IFI sells to control the mast, but in testing, it and the FreeWave radio that ODIS uses for motion commands interfered with each other. We ended up testing the ZigBee Pro [3] modules from Digi and are having good results: the ZigBee radios appear to be impervious to the EM interference from the video transmitter/receiver and the FreeWave radios.

Figure 2 shows the board from Embedded Artists [4] that we are using, which carries an NXP LPC2106 microcontroller. The board provides two UARTs. The first UART is connected to the ZigBee module. The second UART is connected to an Intersil RS-232 transmitter/receiver, which is in turn connected to the Sony PTZ camera. The Intersil IC converts the TTL-level logic from the microcontroller up to RS-232 levels.

Our circuit operates at 5V. The maximum voltage of the LPC2106 is 6V. The Intersil IC also operates at 5V. The ZigBee requires between 2.8V and 3.3V to operate. The microcontroller has a pin that outputs 3.3V so we are using that to power the ZigBee module.

Fig. 2. The circuit connecting the ZigBee module to the microcontroller, and the USB ZigBee board used on the OCU end.

3. COMMUNICATION

3.1 Belkin Ultra-Wideband

The first communications technology that we looked at was Ultra-Wideband (UWB). UWB operates in the range of 3.1 GHz up to 10.6 GHz [5] and is capable of data rates up to 480 Mbps, but only at short distances. We purchased Belkin’s Wireless Hub [6] to see how it performed and tested it in Manufacturing Engineering’s high-bay area. We set the computer up on the passenger side of a car that was already in the high-bay. The UWB hub was placed on a chair three meters away from the passenger-side door, and the dongle was attached to the computer. The data rate we experienced was close to 480 Mbps at close distances, but we were not able to get more than five meters away before experiencing a significant degradation in signal strength. If the signal strength went below ten percent, the digital video feed was lost, and it usually took a reboot of the computer to regain connectivity. Figure 3 shows the setup used to test the UWB.


Fig. 3. Setup Used to Test Ultra-Wideband (UWB hub placed three meters from the vehicle’s passenger-side door).

3.2 Belkin 802.11n

We next moved on to a version of the upcoming 802.11n standard. Although it has not yet been ratified by the IEEE, this emerging standard significantly increases the physical transfer rate by using Multiple-Input Multiple-Output (MIMO) technology. We purchased Belkin’s N1 Wireless Router [7] along with a PCMCIA card [8]. We are using the Belkin 802.11n router on top of ODIS, connected to a device that converts Ethernet to USB. We can also generate PWM signals by using another USB device; both devices are described in detail later. The remote computer is connected to the 802.11n network, and all devices appear as if they are directly connected. This configuration gives us the extensibility to add more sensors and actuators, and the network speed is fast enough to transmit a digital video feed with acceptable latency.

3.3 Ethernet Extender

The 802.11n network performed well at distances up to 150 m. This is not a long enough distance for use in the field, so we tested an Ethernet extender in conjunction with the 802.11n network. We purchased the Ethernet HyperXtender GT from Netsys [9]; these devices are designed to operate over CAT5 or telephone wire. We connected the devices using a 1000-foot spool of CAT5e cable and were able to transmit a video feed successfully. This allows us to create a standoff distance of around five hundred meters when used with the 802.11n network.

Figure 4 depicts the new communications chain, along with other peripherals which will be explained later.


Fig. 4. The New Communications Chain with 802.11n and Ethernet Extenders: the stationary OCU tablet connects through a pair of Ethernet extenders (350 m of CAT5e, 100 Mbps) to the base 802.11n router, which links wirelessly (150 m at 200 Mbps) to the mobile 802.11n router and its USB and PWM peripherals (via a USB-to-PWM converter).

3.4 Sony PTZ Camera

The existing camera on ODIS provided no way for us to capture the image data and use it in algorithms. We decided to use a Sony EVI-D70 [10], which has 18x optical zoom, is capable of panning and tilting, and outputs image data over NTSC. It also has excellent low-light sensitivity, with a minimum illumination of less than 1 lux. Figure 5 shows a picture of the Sony camera.

Fig. 5. Sony EVI-D70 PTZ Camera

The Sony camera runs Sony’s proprietary VISCA protocol, a packet-based protocol for handling internal camera control and pan/tilt functions. We ported VISCA control code to run on the LPC2106 microcontroller and defined a messaging protocol to handle the camera commands.
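For illustration, a VISCA command packet wraps its payload in a header byte (0x80 OR'd with the camera address) and a 0xFF terminator. The sketch below builds a pan/tilt drive packet; the byte values follow the commonly published VISCA command set and should be verified against Sony’s documentation before use.

```python
# Sketch of building VISCA command packets like those the ported
# microcontroller code would send to the EVI-D70. Byte layout follows
# the commonly published VISCA command set; verify against Sony's docs.

CAMERA_ADDRESS = 1          # first camera on the VISCA daisy chain
TERMINATOR = 0xFF

def visca_packet(*payload):
    """Wrap a command payload in the VISCA header and terminator bytes."""
    header = 0x80 | CAMERA_ADDRESS
    return bytes([header, *payload, TERMINATOR])

def pan_tilt_drive(pan_speed, tilt_speed, pan_dir, tilt_dir):
    """Pan/tilt drive command.
    pan_dir: 0x01 left, 0x02 right, 0x03 stop;
    tilt_dir: 0x01 up, 0x02 down, 0x03 stop."""
    return visca_packet(0x01, 0x06, 0x01, pan_speed, tilt_speed,
                        pan_dir, tilt_dir)
```

Because every packet ends in 0xFF and starts with a known header, the microcontroller’s UART handler can re-synchronize on packet boundaries without byte counting, which keeps the port to the LPC2106 small.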

Figure 6 shows the basic GUI that we are using. Not all of the VISCA command set has been ported over to run on the microcontroller. This basic GUI has options to control the camera’s pan, tilt, and zoom, and it has buttons to control mast operations.


Fig. 6. The GUI to Control the Sony Camera.

4. SIMULATION

We created a simulation of ODIS within Webots, a simulation and prototyping software package from Cyberbotics, Ltd., to assist in teleoperation control. The model is a physically based representation of ODIS with mass, inertia, and friction inputs. Having a virtual environment for development and testing is advantageous for simulating various inspection tasks. Figure 7 shows the mast on top of ODIS in simulation. We are currently working to make this model as accurate as possible.

Fig. 7. Simulation of our mast on top of ODIS

The 802.11n wireless communications were expanded to include a TCP/IP socket interface, which is described in the next section. This allowed us to tap into the ODIS communication stream and execute in the simulation the same commands that the real ODIS is executing. This feature allows commands to be sent to both the actual and virtual ODIS. In this way, changeable virtual walls and objects, along with virtual sensors, can be used to control and constrain the behavior of the actual ODIS for simulation and training applications.
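A minimal sketch of this command mirroring, assuming commands are newline-terminated ASCII strings (an assumption; the actual ODIS command format is not described here), is a fan-out that forwards each command to every connected sink, i.e., both the physical robot’s link and the simulation’s socket:

```python
def fan_out(command, sinks):
    """Send one command string to every connected sink (e.g. the bridge
    to the physical ODIS and the Webots simulation's socket). Returns
    the sinks that failed so the caller can drop dead connections."""
    data = (command.strip() + "\n").encode("ascii")
    failed = []
    for sink in sinks:
        try:
            sink.sendall(data)
        except OSError:
            failed.append(sink)
    return failed
```

Because the real and virtual ODIS receive byte-identical command streams, the simulation stays in lockstep with the physical robot, which is what makes the virtual-walls training idea above workable.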


The simulation assists in teleoperation control in several ways. It can show what the system is doing from an arbitrary viewpoint, or from multiple viewpoints. Without the simulation, the user would only see the video from the camera(s) on board ODIS; unless the camera(s) had a portion of the robot in view, the user would have no direct feedback on the position and orientation of the robot. The simulation also computes the frame of reference indicating the direction of motion of the different parts of the vehicle in response to motion input commands. An aggressive use of the simulation is as an interface modality in which the operator grabs and drags a point or points on the simulated vehicle to achieve the desired position. Using the frame of reference and the inverse motion-command model, the system computes and issues the motion commands needed to make the real vehicle comply with the forced motion. Another potential use is to record images from various perspectives as the camera moves. Yet another potential use of the simulation is as a virtual world in which to plan and test maneuvers prior to executing them on the robot. A limitation of the simulation is that it does not contain the external environment (e.g., cars and trucks queued for inspection, curbs, barricades, etc.). These objects cannot be included in the core simulation since what they are and where they are change dynamically. The system can capture visual images and corresponding range maps and, with this data, can approximate surface and texture maps that could be inserted into the simulation.

5. TCP/IP SOCKETS IMPLEMENTATION

TCP/IP sockets are a method for nodes on a network to send and receive data. The 802.11n router that rides on top of ODIS provides the network infrastructure; it also provides the Ethernet output that the AnywhereUSB (the Ethernet-to-USB converter mentioned earlier) accepts as input. In this implementation, the server code was designed to run on the same machine that the FreeWave radio is attached to. There can be multiple clients connected to this network. Figure 8 depicts the client/server network.
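The prioritized-client idea from the architecture can be sketched as a small arbiter on the server: each client (teleoperation unit, collision-avoidance behavior, and so on) submits steering commands tagged with a priority, and the server forwards only the highest-priority pending command to ODIS each cycle. The priority levels and API below are illustrative, not the authors’ actual scheme.

```python
import heapq
import itertools

class CommandArbiter:
    """Server-side arbiter for prioritized client access to steering.
    Lower number = higher priority (e.g. 0 = collision avoidance,
    2 = teleoperation); ties go to the earliest submission."""

    def __init__(self):
        self._heap = []
        self._order = itertools.count()  # tie-break: first come, first served

    def submit(self, priority, client, command):
        # each client process pushes its pending steering command
        heapq.heappush(self._heap,
                       (priority, next(self._order), client, command))

    def next_command(self):
        # the server pops the single highest-priority command per cycle
        if not self._heap:
            return None
        _, _, client, command = heapq.heappop(self._heap)
        return client, command
```

This is how a low-latency collision-avoidance client could preempt the teleoperation soldier unit’s motion command without the operator having to yield control explicitly.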

Fig. 8. Client/Server Diagram Supporting N-number of Nodes

6. STATUS AND PLANS

This research has expanded the current capabilities of ODIS: we now have the capability to integrate new sensors and actuators. We tested the emerging ultra-wideband technology and found that it is not currently capable of providing fast communications at long distances. The 802.11n implementation from Belkin, along with the Ethernet extender, delivers the speed and distance we need to create a safe standoff distance for use in the field. The current mast is only a prototype built on a limited budget, and we are currently looking at alternatives for building the mast.


This paper has reported on the progress we have made in enabling the mast to have communications separate from ODIS while still maintaining a safe standoff distance. Future work includes enhancing the GUI and finishing the port of the VISCA control code to the microcontroller.

REFERENCES

[1] K. Moore, N. Flann, S. Rich, M. Frandsen, Y. Chung, J. Martin, M. Davidson, R. Maxfield, and C. Wood, “Implementation of an Omni-directional Robotic Inspection System (ODIS)”, Proceedings of SPIE Conference on Robotic and Semi-Robotic Ground Vehicle Technology, vol. 4634, Orlando, FL, May 2001.

[2] FRC Control System, http://www.ifirobotics.com/frc-robot-control-system-overview.shtml

[3] ZigBee Pro, http://www.newmicros.com/cgibin/store/order.cgi?form=prod_detail&part=Zigbee-Kit&id=fJ7liEwXK07OXfF84QKI3R8Po0jbk60m/

[4] Embedded Artists, http://www.embeddedartists.com/

[5] Intel, “Ultra Wideband (UWB) Technology, Technology & Research at Intel”, http://www.intel.com/technology/comms/uwb/index.htm

[6] Belkin, “Belkin: Cable-Free USB Hub”, http://catalog.belkin.com/IWCatProductPage.process?Product_Id=356042/

[7] Belkin, “Belkin: N1 Wireless Router”, http://catalog.belkin.com/IWCatProductPage.process?Product_Id=273526/

[8] Belkin, “Belkin: N1 Wireless Notebook Card”, http://catalog.belkin.com/IWCatProductPage.process?Product_Id=273544/

[9] Netsys, “Ethernet Extenders: NV-600EKIT – Kit – Ethernet HyperXtender GT – Netsys-Direct”, http://www.netsys-direct.com/proddetail.php?prod=NV-600EKIT&cat=7/

[10] Sony, “Broadcast and Business Solutions Company – EVID70”, http://bssc.sel.sony.com/BroadcastandBusiness/DisplayModel?m=0&p=2&sp=22&id=72199&navid=wl_800_series_wireless_microphones
