
SimEOC: A Virtual Emergency Operations Center (vEOC)

Simulator for Training and Research

A Dissertation

Submitted to the Graduate School

of the University of Notre Dame

in Partial Fulfillment of the Requirements

for the Degree of

Doctor of Philosophy

by

Cynthia M. Nikolai

Gregory Madey, Director

Graduate Program in Computer Science and Engineering

Notre Dame, Indiana

December 2014

© Copyright by

Cynthia M. Nikolai

All Rights Reserved

SimEOC: A Virtual Emergency Operations Center (vEOC)

Simulator for Training and Research

Abstract

by

Cynthia M. Nikolai

Training is an integral part of disaster preparedness. Practice in dealing with crises improves our ability to manage emergency situations. As an emergency escalates, more and more agencies get involved. These agencies require training to learn how to manage the crisis and to work together across jurisdictional boundaries. Consequently, training requires participation from many individuals, consumes a great deal of money, and cannot be conducted often. Moreover, in the current crisis management environment, most training is conducted through tabletop and paper-based scenario exercises. In this dissertation, we describe a socio-technical training simulator and research tool for upper level emergency managers. This tool is important because it enables emergency managers to train for crises more efficiently and effectively in a virtual environment. It also serves as a research tool for scientists to study emergency management decision-making and organizational learning.

DEDICATION

To my parents, Alberta and Joseph Nikolai


CONTENTS

FIGURES

ACKNOWLEDGMENTS

CHAPTER 1: BACKGROUND
   1.1 Overview
   1.2 Application Goals
   1.3 Ensayo
   1.4 Application Features
   1.5 Application URL
   1.6 Collaborators
   1.7 Enabling Research Questions
   1.8 Virtual Teamwork
   1.9 Grants
   1.10 Scope of this Work
   1.11 Summary

CHAPTER 2: INTRODUCTION
   2.1 Overview
   2.2 Relevance
   2.3 Emergency Management Governance
   2.4 Incident Command System (ICS)
   2.5 National Incident Management System (NIMS)
   2.6 Emergency Support Functions (ESFs)
   2.7 Terminology
      2.7.1 Common Operating Picture
      2.7.2 Situational Awareness
      2.7.3 Point of Distribution (POD)
      2.7.4 Standard Operating Procedures (SOP)
      2.7.5 Incident Action Plan (IAP)
      2.7.6 Incident, Disaster, Emergency, Crisis
      2.7.7 Crisis Information Management System (CIMS)
      2.7.8 Crisis Management
         2.7.8.1 Small Scale Versus Large Scale Crises
         2.7.8.2 First Responders versus Emergency Managers
      2.7.9 Hot Wash
      2.7.10 SimCell
      2.7.11 Inject
      2.7.12 Script
      2.7.13 Emergency Operations Center (EOC)
   2.8 Types of Training
      2.8.1 Individuals, Groups, and Organizations
      2.8.2 Discussion-based
      2.8.3 Operations-based
      2.8.4 Seminars
      2.8.5 Train-the-Trainer
      2.8.6 Workshops
      2.8.7 Tabletop Exercises
      2.8.8 Games
      2.8.9 Drills
      2.8.10 Functional Exercises
      2.8.11 Full-scale Exercises
   2.9 Exercise Cycle
      2.9.1 Exercise Staff
      2.9.2 Exercise Director
      2.9.3 Controllers
      2.9.4 Senior Controller
      2.9.5 Simulators
      2.9.6 Evaluators
   2.10 The Miami-Dade EOC
      2.10.1 Incident Command
      2.10.2 Levels of Activation
         2.10.2.1 Level 3: Monitoring and Assessment
         2.10.2.2 Level 2: Partial Activation
         2.10.2.3 Level 1: Full-Scale Activation
   2.11 Summary

CHAPTER 3: CRISIS INFORMATION MANAGEMENT SOFTWARE
   3.1 Overview
   3.2 Commercial Products
      3.2.1 Incident Commander
      3.2.2 Advanced Disaster Management Simulator (ADMS)
      3.2.3 WebEOC
      3.2.4 Rapid Response Virtual Emergency Operations Center
      3.2.5 E-Team
      3.2.6 Emergency Management Staff Trainer (EMST)
      3.2.7 Civil Emergency Reaction and Responder Training System (CERRTS)
      3.2.8 Virtual Staff Trainer
   3.3 Research Products
      3.3.1 IISIS
      3.3.2 DC-Train
   3.4 Summary

CHAPTER 4: DESIGN
   4.1 Overview
   4.2 Spiral Design
   4.3 Design Requisites
      4.3.1 User-Centered Application Design
      4.3.2 Keep It Simple
      4.3.3 Windows-based Layout
      4.3.4 Train Like We Fight
   4.4 More Than Just Aesthetics
   4.5 Expert Validation
   4.6 Design Documents
   4.7 Development Environment
      4.7.1 Technologies Employed
      4.7.2 Jetty Server
      4.7.3 Database
      4.7.4 Virtual Machines
      4.7.5 Secure Socket Layer (SSL)
      4.7.6 Redmine Server
   4.8 Console
   4.9 User Views
      4.9.1 Trainee
      4.9.2 Exercise Developer
      4.9.3 Exercise Controller
      4.9.4 Researcher
      4.9.5 Evaluator
      4.9.6 Administrator
   4.10 Summary

CHAPTER 5: THE APPLICATION
   5.1 Overview
   5.2 About the Software
   5.3 Tables
   5.4 Software Architecture
      5.4.1 Trainee
      5.4.2 Exercise Developer/Controller
      5.4.3 Evaluator
      5.4.4 Researcher
      5.4.5 Administrator
   5.5 vEOC Consoles
      5.5.1 Player/Trainee
         5.5.1.1 Main Panel
         5.5.1.2 Exercise Panel
         5.5.1.3 Communication Tools
         5.5.1.4 Dashboards
         5.5.1.5 Learning Tutor
      5.5.2 Exercise Developer/Exercise Controller
         5.5.2.1 Exercise Developer
         5.5.2.2 Script Developer
         5.5.2.3 Report Module
         5.5.2.4 Exercise Controller
      5.5.3 Researcher
         5.5.3.1 Researcher Tools
         5.5.3.2 Exercise Reports
         5.5.3.3 References
      5.5.4 Evaluator
         5.5.4.1 Evaluation Tools
         5.5.4.2 References
      5.5.5 Administrator
         5.5.5.1 User Profiles
         5.5.5.2 System Roles
         5.5.5.3 Virtualization
   5.6 System Capabilities
   5.7 User Manual
   5.8 Summary

CHAPTER 6: TESTING
   6.1 Overview
   6.2 Manual Testing
   6.3 Automated Testing
   6.4 Demos and Evaluation
   6.5 Master Test Plan
   6.6 Summary

CHAPTER 7: GAMES AND SIMULATIONS FOR EMERGENCY OPERATIONS CENTERS: CHALLENGES AND OPPORTUNITIES
   7.1 Overview
   7.2 Introduction
   7.3 Challenges
   7.4 Opportunities
   7.5 Summary

CHAPTER 8: LEVERAGING WEBEOC IN SUPPORT OF THE HAITIAN RELIEF EFFORT: INSIGHTS AND LESSONS LEARNED
   8.1 Overview
   8.2 Introduction
      8.2.1 Haitian Relief Effort: Sequence of Events
      8.2.2 Tracking Resources In The Relief Efforts
      8.2.3 Insights And Lessons Learned From The Haiti Relief Efforts
      8.2.4 Recommendations For Further Improvement
   8.3 Summary

CHAPTER 9: LEVERAGING WEBEOC AND GIS IN SUPPORT OF EMERGENCY RESPONSE
   9.1 Overview
   9.2 Introduction
      9.2.1 Leveraging WebEOC in Support of the Probowl and the Superbowl
      9.2.2 Insights for Further Integration
   9.3 Summary

CHAPTER 10: DESIGN PRINCIPLES FOR MODERN CRISIS INFORMATION MANAGEMENT SYSTEMS: FROM CLOSED LOCAL SYSTEMS TO THE WEB AND BEYOND
   10.1 Overview
   10.2 Introduction
   10.3 Background
      10.3.1 Evolution of CIMS: From Closed Local Systems to the Web and Beyond
      10.3.2 Summary of Current CIMS Design Principles
   10.4 Additional Design Principles for Modern CIMS
   10.5 Implications and Future Directions of CIMS
   10.6 Summary

CHAPTER 11: A CALL FOR DATA EXCHANGE STANDARDS FOR OPERATIONAL CRISIS INFORMATION MANAGEMENT SYSTEMS AND EMERGENCY MANAGEMENT EXERCISE SIMULATORS
   11.1 Overview
   11.2 Introduction
   11.3 Background
   11.4 Simulation Data Exchange Standards
      11.4.1 Background
      11.4.2 Towards an XML Schema for Operational CIMS
      11.4.3 Proposed Schema for Exercises and Training Simulations
   11.5 Limitations
   11.6 Implications
   11.7 Summary

CHAPTER 12: CONCLUSION
   12.1 Overview
   12.2 Contributions to the Scientific Community
   12.3 Publications
   12.4 Limitations
   12.5 Future Work
   12.6 Summary

APPENDIX A: MIAMI-DADE EMERGENCY OPERATIONS CENTER FIELD RESEARCH REPORT
   A.1 Overview
   A.2 Introduction
   A.3 Background
   A.4 Computer-Based Solutions to Emergency Management Training
   A.5 Crisis Information Management and Training System
   A.6 Ensayo
   A.7 Field Research in Miami-Dade
   A.8 Research Methodology
   A.9 Lessons Learned and Insights Gained
   A.10 Results
   A.11 Summary

APPENDIX B: CIMS XML SPECIFICATION

APPENDIX C: SIMULATION XML SPECIFICATION

APPENDIX D: PROCESSES AND FLOWCHARTS
   D.1 Liaison Process Flow
   D.2 Logistics Resource Request Process Flow
   D.3 Logistics Resource Request Process Flow (SimEOC Implementation)
   D.4 Exercise Developer Process Flow
   D.5 Exercise Developer Process Flow (SimEOC Implementation)
   D.6 Exercise Controller Process Flow
   D.7 Exercise Controller Process Flow (SimEOC Implementation)
   D.8 Exercise Design Lifecycle
   D.9 Exercise Design Lifecycle (SimEOC Implementation)

APPENDIX E: CONCEPT MAPS
   E.1 Emergency Manager Concept Map
   E.2 Emergency Manager Concept Map (SimEOC Implementation)
   E.3 Exercise Developer Concept Map
   E.4 Exercise Developer Concept Map (SimEOC Implementation)
   E.5 Exercise Controller Concept Map
   E.6 Exercise Controller Concept Map (SimEOC Implementation)
   E.7 Exercise Evaluator Concept Map
   E.8 Exercise Evaluator Concept Map (SimEOC Implementation)
   E.9 Planning Concept Map
   E.10 Planning Concept Map (SimEOC Implementation)

APPENDIX F: SYSTEM CAPABILITIES

APPENDIX G: TABLE SCHEMA

APPENDIX H: DIRECTORY LISTING

APPENDIX I: MASTER TEST PLAN

APPENDIX J: FUNCTIONAL TESTING CHECKLIST

APPENDIX K: vEOC USABILITY TEST

APPENDIX L: vEOC USER MANUAL

BIBLIOGRAPHY


FIGURES

2.1 Flooding during Hurricane Katrina

2.2 Incident Command System [41]. The command staff, the general staff, and the agency liaisons assist the incident commander during an emergency.

2.3 A Sample Inject

2.4 Types of Training

2.5 The Miami-Dade Emergency Operations Center Incident Command System

4.1 The Spiral Model of Software Development

4.2 Example Screenshots from the WebEOC Console

5.1 The Trainee Architecture

5.2 The Exercise Developer Architecture

5.3 The Evaluator Architecture

5.4 The Researcher Architecture

5.5 The Administrator Architecture

5.6 The Trainee Console

5.7 The Exercise Developer Console (left side of figure). The Script Developer Interface (right side of figure).

5.8 The Researcher Console (left side of figure). An Exercise Log (right side of figure).

5.9 The Evaluator Console (left side of figure). An Exercise Evaluation Guide Analysis Sheet (right side of figure).

5.10 The Administrator Console (left side of figure). User Administration Options (right side of figure).

8.1 Incident Command System [41]. The command staff and general staff assist the Incident Commander during an emergency.

8.2 Haiti Relief Effort Timeline. This is the sequence of events that occurred as the EOC began the Haiti relief operations.

8.3 The Transportation Board. This board was used to track airplanes and passengers coming into Miami International Airport and Homestead Air Reserve Base. This is the list view.

8.4 The Donation Board. This board was used to track non-county resources. This is the list view.

9.1 Incident Command System [41]. The command staff and general staff assist the Incident Commander during an emergency.

9.2 Stadium Incident Board. This board is used to track incidents that occur within Land Shark Stadium. This is the list view. Here we see significant details of a fire incident.

9.3 Florida Interoperable Picture Processing for Emergency Response (FLIPPER) screenshot. This GIS software links with the backend WebEOC database.

9.4 Florida Interoperable Picture Processing for Emergency Response (FLIPPER) Social Networking Screenshot. FLIPPER links with social networking sites like Flickr and Twitter. In this figure, a simple search for Haiti calls up a score of pictures others have taken and posted on Flickr.

10.1 A GIS aware application used by the Miami-Dade EOC. Clicking on a particular location marker on the map brings up information and pictures related to the area.

A.1 Department of Emergency Management organizational structure. This is the day-to-day structure of emergency management at the Emergency Operations Center.

A.2 Incident Command System [41]. The command staff, the general staff, and the agency liaisons assist the incident commander during an emergency.

A.3 A WebEOC status board.

A.4 An example simulation in WebEOC. The user has to configure hundreds of injects for the exercise. In addition, if the structure of the boards have been changed since the original gathering of the data, then the injects may not reflect this change [52].

A.5 An emergency manager concept graph.

A.6 An exercise developer concept graph.

A.7 The logistics resource request process.

A.8 Improved vEOC architecture. The architecture has 13 main modules.

A.9 Old User Interface. A tab-based approach.

A.10 Improved user interface - a windows-based approach.

A.11 New user interface script developer console. This is where individuals develop exercise scripts and injects to send to the trainees.

D.1 Liaison Flowgraph

D.2 Logistics Resource Request Process

D.3 Logistics Resource Request Process (SimEOC Implementation)

D.4 Exercise Developer Flowgraph

D.5 Exercise Developer Flowgraph (SimEOC Implementation)

D.6 Exercise Controller Flowgraph

D.7 Exercise Controller Flowgraph (SimEOC Implementation)

D.8 Exercise Design Lifecycle

D.9 Exercise Design Lifecycle (SimEOC Implementation)

E.1 Emergency Manager Concept Map

E.2 Emergency Manager Concept Map (SimEOC Implementation)

E.3 Exercise Developer Concept Map

E.4 Exercise Developer Concept Map (SimEOC Implementation)

E.5 Exercise Controller Concept Map

E.6 Exercise Controller Concept Map (SimEOC Implementation)

E.7 Exercise Evaluator Concept Map

E.8 Exercise Evaluator Concept Map (SimEOC Implementation)

E.9 Planner Concept Map

E.10 Planner Concept Map (SimEOC Implementation)


ACKNOWLEDGMENTS

Throughout the course of this dissertation, many individuals have contributed and provided support. First, I would like to thank my advisor, Dr. Greg Madey, for his support and encouragement. I am grateful to the Miami-Dade EOC for sharing their time and insights with us throughout this project, especially David Perez, Frank Reddish, Troy Johnson, Curtis Sommerhoff, Roslyn Viterbo, Soheila Ajabshir, Craig Hall, and the Logistics Section. I would like to thank our collaborators at Emory University, Florida International University, and St. Thomas University, specifically Dr. Irma Becerra-Fernandez, Dr. Michael Prietula, and Dr. Weidong Xia. Special thanks goes to the many undergraduate and graduate students who contributed to this project as well, including Nate Thomas, Regina Ranstrom, Sarah Aycock, Matt Mooney, Matt van Antwerp, Nate Regola, Hung Truong, Rahul Bhandari, Daina Spense, Robert Leon, Arvind Gudi, Pepe Rocha, Qiuzhi (Rose) Chang, Denni Florian, John Glynn, Soundarya Soundararajan, Mouna Yerra, Pratik Bhosale, and Arjun Anilkumar. Special thanks also goes to the Notre Dame Center for Research Computing for their assistance in the development and debugging of SimEOC, especially Anna Alber, David Janosik, Benoit Raybaund, David Campbell, and Dr. Timothy Wright. I also would like to thank Notre Dame ESTEEM students Amy Flanagan and Daniel Kestell. Additionally, I am grateful to Dr. Michael Sain for his contributions on my proposal committee. Finally, I would like to thank the University of Notre Dame Zahm Research Travel Fund, the National Science Foundation (Award Numbers CNS-0855164 and CNS-1405912), and the U.S. Department of Education (GAANN Fellowship Award Number P200A090044) for their support of this research as well.


CHAPTER 1

BACKGROUND

1.1 Overview

In this chapter, we give a brief overview of SimEOC. We discuss application goals

and features, collaborators, and enabling research questions. We also discuss virtual

teamwork, grants, and the scope of this work.

1.2 Application Goals

The goal of this dissertation and this application was to build a virtual Emergency Operations Center for (1) training emergency personnel and (2) research into emergency management decision making.

1.3 Ensayo

In the remainder of this dissertation, you may see the name Ensayo instead of

SimEOC. Ensayo is the early prototype of this project. SimEOC began as a grant

with a project name of Ensayo. Ensayo, in Spanish, literally means rehearsal.

1.4 Application Features

There are several key features of SimEOC that make this work stand out. First,

there are very few computer-based simulators for upper level emergency managers.

This is one of the first simulators for Emergency Operations Centers available for

training and research. Other key features include:


• Distributed: One limitation of current training is that players physically have to come to the EOC in order to participate. We wanted to improve this model of training, so in this work we built a distributed training simulator. This allows authorized individuals to access the simulator from any computer, at any location in the world.

• Web-based: According to crisis information management system design principles, modern training systems should be similar to systems with which emergency managers are familiar and which they use regularly [123, 159]. They also should be easy to learn on demand [123]. In this work, we modeled our system after WebEOC [50], a leading web-based commercial crisis information management system with which many emergency managers are familiar. People also are familiar with the web, so a web-based application is easy to learn and provides a smooth transition for emergency personnel. Finally, it allows worldwide access: authorized individuals can access this system from any computer anywhere in the world.

• Inject-Driven Scenarios: SimEOC contains the ability to build virtual scenarios and send electronic injects, or inputs, to the trainees based on a training script. This capability is managed by a simulation engine in the exercise developer/controller console.

• Dashboards: We wanted a way to give the trainees immediate feedback on their decisions. Dashboards are indicators such as lives lost, lives saved, and total cost of resources. These can be turned on or off as the user/trainee wishes. (A minimal sketch of the dashboard and research-log idea follows this list.)

• Learning Tutor: This application includes a chat bot that trainees can ask questions of. At present, the chat bot is not functional.

• Research Logs: SimEOC records all events that occur and all actions that a user takes in response to an inject. These can be viewed by exercise developers and researchers to analyze decision-making and organizational learning in the EOC.

• Individual, Group, and Organizational Training: Training can be accomplished on an individual basis, or it can be accomplished for a group or organization. In SimEOC, we can accomplish both. Training on an individual basis can consist of training a single liaison. Training on a group level can consist of an entire group, such as the public safety group or the infrastructure group. Finally, training can consist of training the entire organization/EOC as well. This is one of the first emergency management simulators with this capability.
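As a concrete illustration of the dashboard and research-log features described above, the following minimal sketch shows one way the pieces could fit together. It is illustrative only and is not the SimEOC source: all class, method, and indicator names are hypothetical. The sketch assumes that each trainee action taken in response to an inject is appended to a log for later analysis, and that dashboard indicators (such as lives saved or total cost of resources) are simply named values that can be toggled on or off and updated as the exercise unfolds.

    import java.util.ArrayList;
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    // Hypothetical sketch of SimEOC-style research logs and dashboards; all names are illustrative.
    public class TraineeFeedbackSketch {

        // One research-log entry: which inject a trainee responded to and what action was taken.
        static class LogEntry {
            final String injectId;
            final String traineeRole;
            final String actionTaken;
            final long timestampMillis;

            LogEntry(String injectId, String traineeRole, String actionTaken, long timestampMillis) {
                this.injectId = injectId;
                this.traineeRole = traineeRole;
                this.actionTaken = actionTaken;
                this.timestampMillis = timestampMillis;
            }
        }

        // Dashboard indicators (e.g., lives saved, total cost of resources) that can be shown or hidden.
        static class Dashboard {
            private final Map<String, Double> indicators = new LinkedHashMap<>();
            private final Map<String, Boolean> visible = new LinkedHashMap<>();

            void update(String name, double delta) {
                indicators.merge(name, delta, Double::sum); // accumulate the indicator value
                visible.putIfAbsent(name, true);            // indicators are visible by default
            }

            void setVisible(String name, boolean on) {
                visible.put(name, on);                      // the user/trainee can toggle indicators on or off
            }

            void print() {
                indicators.forEach((name, value) -> {
                    if (Boolean.TRUE.equals(visible.get(name))) {
                        System.out.println(name + ": " + value);
                    }
                });
            }
        }

        public static void main(String[] args) {
            List<LogEntry> researchLog = new ArrayList<>();
            Dashboard dashboard = new Dashboard();

            // A trainee responds to an inject; the action is logged and the indicators are updated.
            researchLog.add(new LogEntry("INJ-001", "Logistics Liaison",
                    "Requested 500 cots for shelter operations", System.currentTimeMillis()));
            dashboard.update("Total cost of resources", 12500.0);
            dashboard.update("Lives saved", 40.0);

            dashboard.print();
            System.out.println("Logged actions: " + researchLog.size());
        }
    }

In SimEOC itself, the exercise developer and researcher consoles, rather than print statements, would read this kind of accumulated log and indicator state when analyzing decision-making after an exercise.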


1.5 Application URL

We have successfully deployed SimEOC. The public application is available at http://www.SimEOC.org. We also have a project website, which gives information about SimEOC, including background information, design documents, and publications. The project website is available at http://www.nd.edu/~veoc.

1.6 Collaborators

This work is a collaborative project among several universities. We have been collaborating with Florida International University and Emory University in the development of SimEOC. Additional individuals have contributed to this work as well. These include the Notre Dame Center for Research Computing and several undergraduate students immersed in a research experience program. Several graduate research assistants also contributed to the development of SimEOC.

1.7 Enabling Research Questions

SimEOC serves as both a training tool and a research tool. Some enabling research

questions this work can address include:

1. How do individuals establish and maintain trust in other team members in collaborative virtual teams? [4, 19, 103, 183, 147, 152]

2. What are the impacts of leadership in virtual teams? [7, 13, 20, 78, 188, 79]

3. How do individuals establish and maintain trust in the technology of the vEOC in collaborative virtual teams? [17]

4. What are the broader design implications of building virtual emergency systems?

5. How can we improve emergency management when multiple incidents are presented to the trainees at the same time?

6. What can we learn from training selective teams of the incident command system hierarchy?

7. Can we validate some of the theoretical design principles of dynamic emergency response information systems (e.g., continuous monitoring, control, and assurance [160], and the DERMIS design principles [159])?

8. How do individuals make critical decisions, and how can we improve cognitive decision-making in emergency situations?

9. How does virtualization and partial distribution of teams affect leadership roles and communication? [188]

Here, a virtual team is defined as “a team of interdependent members working on a common task who use electronic media as a primary means of communication; at least some of whom are dispersed in geographic and/or temporal dimensions” [132].

1.8 Virtual Teamwork

During the development, we worked with many of our collaborators virtually. In addition to daily emails, we had weekly teleconferences, which we set up via Skype. One person in the group prepared an agenda, and another wrote up what was discussed in the meeting.

1.9 Grants

This research is supported by several grants. We thank the University of Notre Dame Zahm Research Travel Fund, the National Science Foundation (Award Numbers CNS-0855164 and CNS-1405912), and the U.S. Department of Education (GAANN Fellowship Award Number P200A090044) for their support of this work.

1.10 Scope of this Work

This project was very large in nature and had the potential to grow beyond the scope of a three-year dissertation. There are several limitations to the current instantiation. First, it has not been used by practitioners or researchers in the field; this is mainly due to time limitations. Second, time constraints prevented us from implementing the research features further. Finally, SimEOC has a large breadth of features, but the depth of some of the features needs to be greatly enhanced.

1.11 Summary

In this chapter we discussed SimEOC and the features that set this work apart. We also discussed the universities and the Emergency Operations Center collaborating with us on the project, as well as the grants that have supported this work.


CHAPTER 2

INTRODUCTION

2.1 Overview

In this chapter, we provide an introduction to emergency management. We begin with a discussion of the relevance of managing disasters. Next, we define common emergency management terms. This is followed by a short discussion on types of training. After that, we discuss operations inside the Miami-Dade Emergency Operations Center.

2.2 Relevance

Managing crises is a complex endeavor. What makes managing a crisis so complex is the fact that we cannot plan for everything [21]. Crises are often unexpected, or they stem from normal situations that behave in unexpected ways or ways that are beyond the scope of the planned resources to address them [83]. Although crises often are rare events, they can occur at any time, and the consequences can be enormous. At the height of the H1N1 influenza outbreak between 2009 and 2010, 61 million people became infected with this virus. In addition, H1N1 caused approximately 274,000 hospitalizations and 12,500 deaths [22]. In 2004, the Indian Ocean earthquake and tsunami affected approximately 5 million people in Indonesia, Sri Lanka, India, and the surrounding areas. Over 280,000 people died, and more than 1 million people were displaced [184]. In the US, Hurricane Katrina was one of the most expensive and devastating natural disasters in American history [134]. Over half a million people were affected by the hurricane, and the US energy infrastructure was severely damaged [134]. These crises clearly show the need for disaster preparedness.

Figure 2.1. Flooding during Hurricane Katrina

2.3 Emergency Management Governance

In the US, emergencies are managed in a decentralized, distributed fashion. That is, emergencies are managed at the local level until they grow beyond the scope of local resources. When emergencies grow beyond the scope of local resources, local officials turn to the states for additional aid. When the emergencies grow beyond the scope of state resources, states turn to the federal government. The main agencies involved in the local management of crises are the local and state Departments of Emergency Management. The main agency involved in the federal response to a crisis is the Federal Emergency Management Agency (FEMA). FEMA falls under the Department of Homeland Security.

2.4 Incident Command System (ICS)

The Incident Command System is “a systematic tool used for the command, control, and coordination of emergency response. ICS allows agencies to work together using common terminology and operating procedures controlling personnel, facilities, equipment, and communications at a single incident scene. It facilitates a consistent response to any incident by employing a common organizational structure that can be expanded and contracted in a logical manner based on the level of required response.” [170]

The Incident Command System typically incorporates five major functional areas: Command, Operations, Planning, Logistics, and Finance/Administration (see Figure 2.2). All of the functional areas may or may not be used based on the incident needs.

Figure 2.2. Incident Command System [41]. The command staff, the general staff, and the agency liaisons assist the incident commander during an emergency.


An incident commander is assigned to manage each incident. The incident commander has the overall responsibility for the incident. The incident commander may be one person or it may be a team of people. The incident commander sets the incident objectives, strategies, and priorities. Depending on the needs of the incident, the incident commander may designate a Command Staff and a General Staff. The Command Staff is comprised of a public information officer, a safety officer, and a liaison officer. The section chiefs and branch directors make up the General Staff. The public information officer serves as the conduit for information to internal and external stakeholders, including the media and the public. The safety officer monitors safety conditions and develops measures for ensuring the safety of all incident personnel. The liaison officer serves as the primary contact for other agencies assisting at an incident. The operations section establishes tactics and directs all operational resources to aid in meeting the incident objectives. Operations can be further divided into branches. Planning supports the incident by tracking resources, collecting and analyzing information, and maintaining documentation. Logistics arranges for resources and needed services to support achievement of the incident objectives. Finance and Administration monitors costs related to the incident [41, 166, 74]. Agency Liaisons are individual representatives from various support organizations. They provide a bridge between the organization and the emergency managers. They usually sit at the EOC. The American Red Cross, the Salvation Army, and Law Enforcement are example agency liaisons [57].

2.5 National Incident Management System (NIMS)

The National Incident Management System (NIMS) identifies a set of concepts and principles that enable emergency managers to manage incidents. “NIMS provides a systematic, proactive approach to guide departments and agencies at all levels of government, nongovernmental organizations, and the private sector to work seamlessly to prevent, protect against, respond to, recover from, and mitigate the effects of incidents, regardless of cause, size, location, or complexity, in order to reduce the loss of life and property and harm to the environment. NIMS is based on an appropriate balance of flexibility and standardization” [43, 166, 54].

2.6 Emergency Support Functions (ESFs)

Emergency Support Functions (ESFs) are “grouping[s] of governmental and certain private sector capabilities into an organizational structure to provide support, resources, program implementation, and services that are most likely needed to save lives, protect property and the environment, restore essential services and critical infrastructure, and help victims and communities return to normal following domestic incidents.” [163]

• ESF1 - Transportation

• ESF2 - Communications

• ESF3 - Public Works and Engineering

• ESF4 - Firefighting

• ESF5 - Emergency Management

• ESF6 - Mass Care, Housing, and Human Services

• ESF7 - Resources Support

• ESF8 - Public Health and Medical Services

• ESF9 - Urban Search and Rescue

• ESF10 - Oil and Hazardous Materials Response

• ESF11 - Agriculture and Natural Resources

• ESF12 - Energy

• ESF13 - Public Safety and Security

• ESF14 - Long-term Community Recovery and Mitigation

• ESF15 - External Affairs


2.7 Terminology

2.7.1 Common Operating Picture

The common operating picture is a broad view of what is happening or what has happened in the disaster. Ideally all appropriate agencies who are working on the disaster see the common operating picture [166, 164]. “A common operating picture is established and maintained by gathering, collating, synthesizing, and disseminating incident information to all appropriate parties. Achieving a common operating picture allows on-scene and off-scene personnel - such as those at the Incident Command Post, Emergency Operations Center, or within a Multiagency Coordination Group - to have the same information about the incident, including the availability and location of resources and the status of assistance requests.” [44]

2.7.2 Situational Awareness

Situational awareness is a state of awareness about your environment. Situational awareness means knowing what is going on inside your immediate environment. It also means knowing about incidental occurrences outside of your immediate environment that could affect you. For example, for an emergency manager who is dealing with a local earthquake, situational awareness means being aware that there is a nuclear power plant in the area and that the power plant may have a meltdown if not contained. A general, widely applicable definition describes situational awareness as “the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning and the application of their status in the near future.” [66, 47]


2.7.3 Point of Distribution (POD)

A point of distribution is a staging area from which items can be distributed to

the public. Points of distribution are usually pre-planned. However, they can be

set up on demand as well. For example, if there were an influenza pandemic, local

schools may be assigned as points of distribution to give vaccination shots to the

public [57].

2.7.4 Standard Operating Procedures (SOP)

Standard Operating Procedures are pre-planned procedures for dealing with situations encountered. They specify a standard way to deal with a situation in accordance with the written policies of the organization [16, 44].

2.7.5 Incident Action Plan (IAP)

An Incident Action Plan is a plan, created by the planning section of the ICS,

which specifies a strategy the emergency managers are going to use to manage the

incident or crisis. An Incident Action Plan can be oral or written [73].

2.7.6 Incident, Disaster, Emergency, Crisis

An incident, disaster, emergency, or crisis is “any event that threatens to, or

actually does, inflict damage to property or people.” Emergencies can be small or

large, and we often call large emergencies disasters. Disasters can include hurricanes

and floods, explosions and toxic chemical releases, major transportation accidents,

and national security events [39].

2.7.7 Crisis Information Management System (CIMS)

A crisis information management system is a “crisis response system [that] support[s] communications, data gathering and analysis, and decision-making.” [81]


2.7.8 Crisis Management

Crisis management is defined as “a systematic attempt by organizational members

to identify and detect possible crises, take actions and measures to prevent them,

contain their effects or disruption, and finally recover.” [128, 133, 150]

2.7.8.1 Small Scale Versus Large Scale Crises

Crises can be small or large. A small scale or routine crisis is one which may

be predictable, and for which there are training exercises. Emergency managers

usually are familiar with the situation and there are standard operating procedures

and policies in place for dealing with them. There also usually are adequate resources

available to deal with these. An example of a routine crisis is a house fire. A non-

routine or large-scale crisis, on the other hand, is one which is beyond the scope

of our resources. In large-scale crises, there is a significant probability of extreme

danger. There also usually is political and media involvement. They generally have

highly unpredictable outcomes. They are rare and beyond our normal experiences.

An example of a large scale crisis is a tsunami or a pandemic [83].

2.7.8.2 First Responders versus Emergency Managers

SimEOC is a training tool for emergency managers rather than for first responders. First responders are those who are on the scene of an incident and take command of the immediate threat. These typically include firefighters, emergency medical services, and police officers. They can be volunteer or full-time staff. Emergency managers are full-time staff who are removed from the immediate incident and who operate at the managerial level of the incident to coordinate the response. Their role is not to contain the immediate incident (e.g., put out the fire, clean up the spill), but rather to coordinate resources for the first responders and to manage public relations. In effect, they are coordinators: “the emergency manager is responsible for coordinating the plans of the various components of the emergency management system - fire and police, emergency medical services, public works, volunteers, and other groups contributing to the community’s management of emergencies.” [39]

2.7.9 Hot Wash

A hot wash is “a facilitated discussion held immediately following an exercise

among exercise players from each functional area. It is designed to capture feedback

about any issues, concerns, or proposed improvements players may have about the

exercise. The hot wash is an opportunity for players to voice their opinions on

the exercise and their own performance. This facilitated meeting allows players to

participate in a self-assessment of the exercise play and provides a general assessment

of how the jurisdiction performed in the exercise. The hot wash should last no more

than 30 minutes.” [167]

2.7.10 SimCell

The Simulation Cell (SimCell) is the coordination center for an exercise. Controllers and other authorized personnel in the SimCell control the pace of the exercise, issue injects, and simulate outside communication and responses for the players. [57]

2.7.11 Inject

An inject is an input, usually in the form or a status update, from one agency to

another during an exercise. An example inject is:


Sender: Simulation
Message Type: Info Request
Message Text: City Mayor and commissioners want to confirm when tropical storm force winds will arrive and which shelters will be open.
Expected Action: Access IAP or contact Municipal Director for information.

Figure 2.3. A Sample Inject

2.7.12 Script

A script is a series of injects that are given to the players in the form of an exercise.
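To make the inject and script definitions concrete, the sketch below shows one plausible data-structure view of them; it is not the actual SimEOC implementation, and the class and field names are hypothetical. It assumes an inject carries roughly the fields shown in Figure 2.3 (receiving agency, message type, message text, and expected action) plus a scheduled time offset, and that a script is simply the time-ordered list of such injects for one exercise.

    import java.util.ArrayList;
    import java.util.Comparator;
    import java.util.List;

    // Hypothetical sketch of inject and script structures; names are illustrative only.
    public class ExerciseScriptSketch {

        // An inject: a scripted input (often a status update or request) sent to a player during an exercise.
        static class Inject {
            final int minutesAfterStart;   // when the SimCell should release this inject
            final String receivingAgency;  // e.g., a municipal EOC or an agency liaison
            final String messageType;      // e.g., "Info Request"
            final String messageText;
            final String expectedAction;

            Inject(int minutesAfterStart, String receivingAgency, String messageType,
                   String messageText, String expectedAction) {
                this.minutesAfterStart = minutesAfterStart;
                this.receivingAgency = receivingAgency;
                this.messageType = messageType;
                this.messageText = messageText;
                this.expectedAction = expectedAction;
            }
        }

        // A script: the series of injects that, delivered in order, make up one exercise.
        static class Script {
            final String exerciseName;
            final List<Inject> injects = new ArrayList<>();

            Script(String exerciseName) { this.exerciseName = exerciseName; }

            void add(Inject inject) {
                injects.add(inject);
                injects.sort(Comparator.comparingInt(i -> i.minutesAfterStart));
            }
        }

        public static void main(String[] args) {
            Script script = new Script("Hurricane Functional Exercise");
            script.add(new Inject(15, "Municipal EOC", "Info Request",
                    "Confirm when tropical storm force winds will arrive and which shelters will be open.",
                    "Access the IAP or contact the municipal director for the information."));
            System.out.println(script.exerciseName + " contains " + script.injects.size() + " inject(s).");
        }
    }

A controller in the SimCell could then release each inject when its time offset is reached, which corresponds to the inject-driven scenario capability described in Section 1.4.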

2.7.13 Emergency Operations Center (EOC)

An Emergency Operations Center is a secure location where upper-level emergency officials gather to prepare for, manage, and coordinate the response to an incident (e.g. tsunami, earthquake, hurricane, pandemic).

2.8 Types of Training

2.8.1 Individuals, Groups, and Organizations

Training can be accomplished on an individual basis or training can be accomplished for a group or organization. Training on an individual basis can consist of training a single liaison. Training on a group level can consist of an entire group such as the public safety group or the infrastructure group. Finally, training can consist of training the entire EOC as well.

Figure 2.4. Types of Training

2.8.2 Discussion-based

Training can be discussion-based or operations-based (see Figure 2.4). Discussion-based training provides a forum for discussing and developing plans, policies, agreements, and procedures. It typically focuses on strategic, policy-oriented issues, and it does not involve the deployment of resources. Seminars, workshops, tabletop exercises, and games are all discussion-based exercises [42].

2.8.3 Operations-based

Operations-based exercises are more complicated than discussion-based exercises. They usually involve the deployment of resources and personnel, and they require execution of plans, policies, agreements, and procedures. They also are used to clarify roles and responsibilities and to improve individual and team performances. Drills, functional exercises, and full-scale exercises are operations-based exercises [42].

2.8.4 Seminars

A seminar is “an informal discussion-based exercise led by a presenter or facilitator, used to teach or to orient participants to new or existing plans, policies, or procedures.” They are also used to construct a common framework of understanding and to assess inter-agency or inter-jurisdictional operations. They usually are informal and lecture based [42].

2.8.5 Train-the-Trainer

Train-the-trainer workshops are exercises in which emergency management personnel train each other, usually on new system capabilities and new procedures. Train-the-trainer exercises can also consist of a review of existing procedures. They are usually informal and lecture based [57].

2.8.6 Workshops

A workshop is “a formal discussion-based exercise led by a facilitator or presenter,

used to build or achieve a product.” The goal of workshops is to develop new ideas,

processes, or procedures and to develop a written product as a group in coordinated

activities. Workshops involve more participant discussion than lecture based seminars

and train the trainer exercises [42].

2.8.7 Tabletop Exercises

A tabletop exercise (TTX) “involves senior staff, elected or appointed officials, or other key personnel in an informal group discussion centered on a hypothetical scenario.” Tabletop exercises are used to identify strengths and shortfalls and to enhance understanding of new concepts [42].

2.8.8 Games

A game is “a simulation of operations using rules, data, and procedures designed

to depict an actual or assumed real-life situation.” Games explore the processes and

consequences of decision-making and conduct ‘what-if’ analyses of existing plans.

They also may be used to test potential strategies. Note that games do not involve

the use of actual resources [42].

2.8.9 Drills

A drill is “a supervised activity that tests a specific operation or function of a single

agency.” Drills are used to gain training on new equipment, test new procedures, and

practice and maintain skills. Drills should be realistic, but also maintain an isolated

environment [42].

2.8.10 Functional Exercises

A functional exercise (FE) is “a single or multi-agency activity designed to evaluate capabilities and multiple functions using simulated response.” Functional exercises evaluate management of Emergency Operations Centers, command posts, and headquarters, and assess the adequacy of response plans and resources [42].

2.8.11 Full-scale Exercises

A full-scale exercise (FSE) is “a high-stress multi-agency, multi-jurisdictional activity involving actual deployment of resources in a coordinated response, as if a real incident had occurred.” Full-scale exercises are used to assess plans and procedures under crisis conditions. They also are used to evaluate coordinated response under high-stress, crisis conditions. Full-scale exercises involve the mobilization of units, personnel, and equipment [42].

2.9 Exercise Cycle

Exercises typically last about four hours. Personnel arrive around 8:00 a.m. to participate in the exercise, which is finished by about noon. Directly after the exercise, the evaluators and exercise designers have a hot wash. Following an exercise, the organization creates an after action report. The after action report specifies how the organization performed in the exercise, what they did well, and where there is room for improvement. Feedback from the hot wash usually goes into the after action report.

2.9.1 Exercise Staff

Exercise Staff consist of an exercise director, a senior controller, controllers, simulators, and evaluators.

2.9.2 Exercise Director

The Exercise Director has the overall responsibility for planning, coordinating,

and overseeing all exercise functions. He/she manages the exercise activities and

maintains a close dialogue with the Senior Controller regarding the status of play

and the achievement of the exercise design objectives. [97]

2.9.3 Controllers

“Controllers set up and operate the exercise site and plan and manage exercise

play. Controllers direct the pace of exercise play and routinely include members from

the exercise planning team. Controllers also work with the Simulation Cell (SIM-

CELL) to control the flow of the exercise and explain or clarify issues arising during

19

the exercise. Controllers have limited decision-making authority in their respective

areas. Any changes that impact the scenario or affect other areas of play must be

coordinated through the Senior Controller. Controllers record events and ensure doc-

umentation is submitted for review and inclusion in the After Action Report (AAR).

All controllers are accountable to the Senior Controller.” [97]

2.9.4 Senior Controller

“The Senior Controller is responsible for the overall organization of [the exercise],

and will take direction from the Exercise Director. The Senior Controller moni-

tors exercise progress and coordinates decisions regarding deviations or significant

changes to the scenario caused by unexpected developments during play. The Senior

Controller monitors actions by individual controllers and ensures they implement all

designated and modified actions at the appropriate time. The Senior Controller de-

briefs the controllers and evaluators after the exercise and oversees the setup and

takedown of the exercise.” [97]

2.9.5 Simulators

“Simulators are control staff personnel who role-play as nonparticipating organiza-

tions or individuals. They most often operate out of the SIMCELL, but may occasion-

ally have face-to-face contact with players. Simulators function semi-independently

under the supervision of SIMCELL controllers, enacting roles (e.g., as media reporters

or next of kin) in accordance with instructions provided in the Master Scenario Events

List (MSEL).” [97]

2.9.6 Evaluators

“Evaluators work as a team with controllers. Evaluators are SMEs who record

events that take place during the exercise and submit documentation for review and


inclusion in the After Action Report (AAR). Evaluators should not have any direct

interaction with the players.” [97]

2.10 The Miami-Dade EOC

2.10.1 Incident Command

In the development of SimEOC, we modeled the command structure after the

Miami-Dade County Emergency Operations Center. In day-to-day operations, emer-

gency management staff are involved in preparedness and mitigation strategies for

future crises [39]. When a disaster strikes, the emergency management staff drop their

day-to-day roles and take on the role assigned to them by the Incident Commander.

This role usually involves leading a section or branch of the incident command system

or ICS [85]. There are five main sections in accordance with ICS. The main sections

are incident command, operations, planning, logistics, and finance/administration.

(See Figure 2.5).

Operations is organized into four branches: Public Safety, Human Services, In-

frastructure, and Municipal. Planning consists of Geographic Information Systems

(GIS), the 311 Public Information Call Center, and three units to aid in incident

planning and documentation. Finally, Logistics is divided into EOC Support and

Disaster Resources.

2.10.2 Levels of Activation

EOCs usually have 3 or 4 levels of activation. At the Miami-Dade County EOC,

there are three levels of activation. The severity of the incident determines the level of

activation. During day-to-day operations where no specific situation is occurring, the

EOC is not activated. There is, however, a 24 hour on-call duty officer. This officer

is advised of potentially significant events such as those reported by the Miami-Dade
Alarm Office, State Warning Point, concerned citizens, and other agencies [93].

Figure 2.5. The Miami-Dade Emergency Operations Center Incident Command System

2.10.2.1 Level 3: Monitoring and Assessment

“Level III is typically a monitoring and assessment phase where a specific threat,

unusual event, or situation, is actively monitored by the EOC.” The EOC keeps a

watch on the situation. Staff enter information about the situation into the crisis

information management system and continue to observe the situation until it is

resolved or warrants additional action [93].

2.10.2.2 Level 2: Partial Activation

“Level II partial activation is typically a limited agency activation. Staff and

ESF lead agencies with a role in the incident response are activated and report to the

EOC.” Peripheral agencies are put on standby. The nature of this event is minor at

this point; however, it has the possibility of becoming more serious. The EOC may

begin 24-hour operations [93].

2.10.2.3 Level 1: Full-Scale Activation

In level I activation, the EOC instantiates the Incident Command System, and

all staff and supporting representatives are activated. Resolving the incident may

take days or even weeks. The incident is serious, and state and federal agencies are

notified in the event that their help is needed [93].

2.11 Summary

In this chapter we introduced emergency management and the management of

disasters through Emergency Operations Centers and the Incident Command System.

We defined emergency management terms including types of training. Finally, we

examined operations inside the Miami-Dade EOC.


CHAPTER 3

CRISIS INFORMATION MANAGEMENT SOFTWARE

3.1 Overview

In this chapter, we give a brief overview of several leading commercial and research

programs that are similar to SimEOC.

3.2 Commercial Products

3.2.1 Incident Commander

Incident Commander is a commercial simulation training tool for first responders.

It allows personnel to train individually or in groups, and it also allows them to train

from any PC or laptop anytime and in any place. It is capable of generating 3D

worlds from GIS data, and it supports the creation of multiplayer scenarios. Incident

Commander boasts a realistic, immersive, and engaging training environment that

hones critical decision-making skills [72].

3.2.2 Advanced Disaster Management Simulator (ADMS)

ADMS is a simulator training system for emergency managers, public safety

officials, incident commanders, crew leaders, first responders, Offices of Emergency

Management (OEM) and Emergency Operation Centers (EOC). ADMS can simu-

late a variety of large-scale incidents, both natural and man-made.

ADMS provides an immersive simulation experience for the trainees. “Training with

ADMS is realistic and the trainees experience the consequences of their decisions as


they would in real-life situations. Interactive decision making is demanded by the

trainees, which ultimately affects the development, resolution or escalation of the

event. ADMS scenarios are open-ended, with no pre-determined endings, just like

the real life situations [trainees] will face.” [1]

3.2.3 WebEOC

WebEOC is a commercial web-based crisis information management system [50].

Similar to the vEOC, WebEOC targets upper level emergency managers and provides

a system for preparing for, managing, and responding to crises [50]. It also provides

access both locally and remotely from any web browser [50]. It is Incident Com-

mand System compliant, and it includes multiple collaborative tools such as chat and

messaging features [50].

3.2.4 Rapid Response Virtual Emergency Operations Center

Rapid Response Virtual Emergency Operations Center (R2VEOC) [143] is a com-

mercial virtual EOC that is part of the St. Louis, Missouri Area Regional Response

System (STARRS) [153]. Its main focus is on interagency regional collaboration

in the Missouri area. Similar to our goals, it incorporates a plug-in for the incident

management software (E-team [107]) in use in the physical EOC. It also includes a

user training module.

3.2.5 E-Team

E-Team [105] is a comprehensive, commercial off-the-shelf (COTS) incident man-

agement system for large organizations that need to manage planned and unplanned

events and incidents. E-Team is web-based and is scalable to thousands of users [106].
E-Team supports a powerful capability for regional interoperability, allowing jurisdic-
tions to securely share critical incident information with any other E-Team-enabled
system during an area-wide response [106].

3.2.6 Emergency Management Staff Trainer (EMST)

The emergency management staff trainer is a single or multiplayer simulation-

based exercise system for upper-level emergency managers. The application does

not need to be installed; the user opens a browser and signs into EMST. Using
scenarios which can be tailored to roles for one person, multiple persons, or even
multiple agencies, the players proceed through an exercise. EMST contains a variety

of scenarios ranging from small-scale natural disasters to large-scale technological

disasters. [46]

3.2.7 Civil Emergency Reaction and Responder Training System (CERRTS)

The Civil Emergency Reaction and Responder Training System (CERRTS) is “a

computer driven, emergency response and crisis rehearsal training system that pro-

vides realistic Homeland Security environments to train key members of Emergency

Operations Centers (EOC) and First Responders” [25]. CERRTS is similar in pur-

pose and scope to SimEOC, and it also allows anyone to train from any PC, anytime,

anywhere in the world [137].

3.2.8 Virtual Staff Trainer

The Virtual Staff Trainer is a tool that allows clients to create custom scenarios

that incorporate standard operating procedures. It includes specialized player roles,

mapping, and data support for the client’s county, city, or state [174].


3.3 Research Products

3.3.1 IISIS

IISIS, an Interactive, Intelligent, Spatial, Information System, is a research project

out of the University of Pittsburgh. IISIS is a socio-technical system to aid emergency

managers in the Pittsburgh area. IISIS uses a distributed knowledge base and GIS

and remote sensing imagery to provide a real-time graphical representation of changes
in an area in crisis.

3.3.2 DC-Train

DC-Train is an immersive, multi-media training environment for damage control

aboard naval vessels. DC-Train is similar in scope to what we have accomplished with

SimEOC; however, our tutoring system is not as extensive.

3.4 Summary

In this chapter we described projects with features that overlapped those included

in SimEOC.


CHAPTER 4

DESIGN

4.1 Overview

In this chapter we discuss design methodologies for developing SimEOC. We also

discuss design documents and the development environment. Finally we discuss the

SimEOC console and user views.

4.2 Spiral Design

SimEOC has been developed using the spiral development model. The spiral

model is an iterative model in which development proceeds through incremental re-

leases of the system. The lifecycle usually begins with a prototype and proceeds in

iterations as outlined in Figure 4.1 below. The development of SimEOC proceeded in

five primary iterations. The first iteration lasted from Fall 2008-Spring 2009. In this

iteration, we identified the initial requirements and documented EOC procedures.

We also started planning how to implement an initial prototype and determining the

best technologies to use to meet the requirements. Next we designed and deployed

a prototype, which consisted of an ability to log into a distributed, web-based client

and to select and run a simple script. It also included a simple crisis information

management system, which consisted of standard status boards and status updates.

The next iteration lasted from Summer 2009-Spring 2010. In this iteration, we made

significant improvements to the simple crisis information management system and

to the scripting capabilities. In this iteration, we conducted field research with the


prototype and solicited feedback from EOC practitioners and research collaborators

[3, 65, 84, 85, 89, 129, 175]. The third iteration lasted from Summer 2010-Spring

2011. In this iteration, we added more components to the crisis information man-

agement system and to the vEOC. Specifically, we added a map/GIS capability as

well as a simple learning tutor and a few simple dashboards. We also made further
refinements to the current system, including more advanced logging and a

component which enables individuals to create evaluation objectives and target ca-

pabilities for the exercise. The fourth iteration lasted from Summer 2011 to Spring
2014. In this iteration, we concentrated on testing and validation of the system. We

have conducted extensive functional and usability testing of the system. We also

have conducted multiple distributed evaluations and system validations. Finally, the

fifth iteration has been in Fall 2014. In this iteration, we refined the Exercise De-

veloper/Controller console, and we created an Exercise Evaluator Console and an

Administrator Console. We also conducted a validation test with the Miami-Dade

EOC, Notre Dame, and Emory University in October 2014.

4.3 Design Requisites

SimEOC was developed with several key design requisites.

• User-centered application design

• Keep it simple

• Windows-based layout

• Train like we fight

4.3.1 User-Centered Application Design

SimEOC was developed using user-centered application design. In user-centered

application design, the system is designed around the needs and preferences of the

users rather than the preferences of the developers. We accomplished this goal by

Figure 4.1. The Spiral Model of Software Development

©2014 Barry Boehm. Used with permission. Adapted with vEOC development dates.

using on-going collaboration, validation, and input from the Miami-Dade EOC and

our collaborators at FIU and Emory University. During the requirements elicitation,

we visited the EOC several times to view exercises, observe EOC procedures, and

get feedback on system requirements [95, 96, 59]. In 2009-2010, I embedded myself

in the Miami-Dade County EOC, where I observed and documented procedures and

solicited feedback on an early prototype design. In the summer of 2010, we conducted

a usability test with the Miami-Dade EOC. In Fall 2010, we conducted testing and

added features to the prototype. In Spring-Summer 2011, we conducted development

with researchers at Florida International University. In Summer 2014, we solicited

feedback from FIU on the researcher console. Finally, in October 2014, we traveled to
Florida for a usability test with the Miami-Dade EOC and Emory University.

4.3.2 Keep It Simple

One of the leading requests by the Miami-Dade EOC was to “keep it simple.”

[85] In the middle of a disaster, personnel are dealing with the crisis. Few organiza-

tions have the time or resources to train new personnel; their foremost concern is on

stabilizing the crisis, not on training individuals [57, 151, 159, 182]. In fact, during a

series of exercises in San Diego in 2008 (Golden Phoenix), the request from the mil-

itary liaison regarding technology was “Four buttons max, please. The feeling was

that if it required an engineer to operate, it was a distraction and potential danger

during a crisis.” [182]

4.3.3 Windows-based Layout

During a crisis, there is a great deal of information that emergency managers need
only some of the time. Tab-based interfaces can waste space and become overwhelming to the user.

An alternative to a tab-based layout is a windows-based layout. In window-based

layouts, each menu item creates a new pop-up window. This allows individuals to


have more information readily available and to switch easily between various windows

of information.
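To make this concrete, the sketch below shows how such pop-up windows might be created with the Dojo toolkit that SimEOC already uses on the client; it is illustrative only, assuming Dojo 1.x's dojox.layout.FloatingPane widget, and the helper name, menu ids, and board URLs are hypothetical rather than SimEOC's actual code.

dojo.require("dojox.layout.FloatingPane");

// Hypothetical helper: each menu item opens its own movable, non-modal window,
// so several boards can stay visible at once.
function openBoardWindow(title, contentUrl) {
  var node = dojo.create("div", null, dojo.body());   // anchor node for the pane
  var pane = new dojox.layout.FloatingPane({
    title: title,
    href: contentUrl,                                  // load the board's markup from the server
    resizable: true,
    dockable: false,
    style: "width:420px;height:300px;position:absolute;top:80px;left:80px;"
  }, node);
  pane.startup();
  pane.show();
  return pane;
}

// Example wiring: two menu items, each opening its own window.
dojo.ready(function () {
  dojo.connect(dojo.byId("menu-status-board"), "onclick", function () {
    openBoardWindow("Status Board", "boards/status.php");
  });
  dojo.connect(dojo.byId("menu-resource-requests"), "onclick", function () {
    openBoardWindow("Resource Requests", "boards/resources.php");
  });
});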

4.3.4 Train Like We Fight

According to a leading design principle for crisis information management sys-

tems, an emergency system that is not used on a regular basis before an emergency

will never be of use in an actual emergency [159]. This principle is particularly

applicable for training. Having training systems that are similar to the actual sys-

tem is critical to the success of emergency managers. At the Miami-Dade EOC, the

crisis information management system in use is WebEOC. An additional benefit of

the window-based layout of the interface is that it is similar to WebEOC. Since the

Miami-Dade EOC is very familiar with this kind of interface, SimEOC facilitates a

smooth transition between training and an emergency.

4.4 More Than Just Aesthetics

There are many considerations that go into good interface design. In fact, an

interface can be decomposed into four main components. As summarized by Clark [26],

these four components are content, functionality, aesthetics, and usability. Content

includes the features that are in the user interface; aesthetics indicates how pleasing

the user interface is to the eye; functionality includes what the user interface is capable

of doing; and usability is the user-friendliness of the interface. We developed each of

these areas in the design of SimEOC.

4.5 Expert Validation

In order to validate the system and obtain an expert subject matter knowledge

base, we collaborated with one of the foremost Emergency Operations Centers in the

country: the Miami-Dade EOC.

4.6 Design Documents

Appendices D and E map out the process flows and mental models of various parts

of the EOC that were used as a basis for designing SimEOC.

4.7 Development Environment

4.7.1 Technologies Employed

SimEOC uses a variety of web-based technologies. On the client side, technologies

include XHTML, CSS, Dynamic HTML, AJAX, Reverse AJAX, and JavaScript. On

the server side, technologies employed are PHP, MySQL, DOJO and the Jetty server.

4.7.2 Jetty Server

SimEOC is built with the Jetty Server version 6.1.14. The Jetty Server is a web

server, and it was chosen because of the DOJO toolkit/Reverse AJAX capabilities

incorporated in the server [82].
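As an illustration of how Reverse AJAX lets the server push data to the browser, the sketch below assumes the Bayeux/cometd support bundled with Jetty 6 together with the dojox.cometd client; the channel names and exercise identifier are hypothetical, and the snippet is not taken from the SimEOC codebase.

dojo.require("dojox.cometd");

// Open the Bayeux connection to the endpoint served by Jetty (path is illustrative).
dojox.cometd.init("/cometd");

// Subscribe so that new injects pushed by the server appear without the client polling.
dojox.cometd.subscribe("/exercise/42/injects", function (message) {
  console.log("Inject received:", message.data);
  // ...update the status boards and communication tools here...
});

// Traffic also flows the other way, e.g., a trainee's response to an inject.
dojox.cometd.publish("/exercise/42/responses", {
  injectId: 7,
  action: "Requested two additional ambulances"
});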

4.7.3 Database

Behind the web client interface is a database that stores all of the system infor-

mation. The database is a MySQL implementation.

4.7.4 Virtual Machines

One of the most unique aspects of SimEOC is that all development and deploy-

ment has been accomplished via virtual machines (VMs) using the Kernel-based

virtual machine (KVM) [87] and Ubuntu [162]. Virtual machines are simulators of

other computers that run on top of physical computers. The virtual machine


usually is a complete image of the simulated computer, including the operating sys-

tem and its associated applications. The simulated virtual machine is called the guest

machine, and the computer on which it runs is called the host machine [88, 176].

There are several pros and cons to using virtual machines. For example, virtual ma-

chines that run on the host operating system can be an order of magnitude slower

than standalone machines, and they create a single point of failure [88, 5]. On the

other hand, since the virtual machine isolates various servers from each other and

from the underlying hardware, this facilitates a faster time to recover from hardware

failures. It also aids in portability and transparency among various hardware storage

locations of the system. Finally, virtual machines allow for easy installation and setup

of SimEOC; rather than needing to configure databases and servers, the user simply
can launch a preconfigured virtual machine image with minimal setup.

We have a dedicated VM set up for development and for deployment.

4.7.5 Secure Socket Layer (SSL)

All transmissions to SimEOC go through the Secure Socket Layer (SSL). The server
listens on port 443, the standard HTTPS port.

4.7.6 Redmine Server

Development has been coordinated through the Redmine Server at Notre Dame.

The Redmine Server is a web-based project management system that assists in software development. It contains
wikis, bug-tracking tools, and user forums.

4.8 Console

The vEOC is based on the WebEOC console. WebEOC is the crisis information

management system that Miami-Dade County uses at the EOC. It is composed of

window-based browsing and dynamically configurable boards. See Figure 4.2.

Figure 4.2. Example Screenshots from the WebEOC Console

4.9 User Views

There are 6 different user views in the vEOC. That is, there are 6 main roles that

a user can assume: the trainee, the exercise developer, the exercise controller, the

researcher, the evaluator, and the administrator.

4.9.1 Trainee

The trainee prepares for emergency situations and practices decision making by

interacting with the vEOC. Trainee positions include liaisons, branch directors, sec-

tion chiefs, incident command, general staff, command staff, elected officials, logistics,

and planning.

4.9.2 Exercise Developer

The exercise developer creates scripts and develops exercises to train emergency

personnel.

4.9.3 Exercise Controller

The exercise controller moderates the exercise/training sessions. The exercise

controller is equivalent to the controller in scenario-based functional exercises,
with the exception that injects are automatically presented to the trainee. The
exercise controller also has the ability to begin, pause, and terminate the exercise.

The exercise controller can speed up or slow down the exercise as well.

4.9.4 Researcher

Researchers are individuals interested in studying various aspects of decision mak-

ing and emergency response. They typically are not EOC personnel.


4.9.5 Evaluator

The evaluator watches the trainees as they train during an exercise. Then he/she

evaluates the trainee on his/her performance throughout the exercise.

4.9.6 Administrator

The administrator maintains the vEOC software. The administrator also sets up

and moderates user profiles.

4.10 Summary

In this chapter, we discussed our scientific methodologies for developing SimEOC.

We also discussed the technologies involved in the development environment. Finally,

we concluded with a short discussion of the roles that users can assume when inter-

acting with SimEOC.


CHAPTER 5

THE APPLICATION

5.1 Overview

In this chapter we discuss the SimEOC application. We begin with a general

discussion about the software and the database tables behind it. Next, we discuss the

SimEOC architecture. We conclude with a discussion of vEOC consoles and system

capabilities.

5.2 About the Software

SimEOC consists of 5 consoles (See Section 5.5), 26 modules (See Section 5.4)

and 273 files (See Appendix H).

5.3 Tables

The backend of SimEOC is a database of tables. SimEOC has approximately 51

tables. See Appendix G for a complete table schema.

5.4 Software Architecture

The software architecture consists of five main console modules corresponding to six
user roles: the trainee, the exercise developer, the exercise controller, the evaluator,
the researcher, and the administrator.


5.4.1 Trainee

The trainee console consists of 12 different modules: Tutorial, Exercise Back-

ground, Boards, Mapping, Reports, Logistics, Planning, Learning Tutor, Dash-
boards, Communication Tools, Scripting, and Exercise Logging. These are outlined in

Figure 5.1 below.

Figure 5.1. The Trainee Architecture

5.4.2 Exercise Developer/Controller

The exercise developer/controller console consists of 6 different modules: Exercise

Development Tools, Database Control Tools, Script Developer, Exercise Control,
Reports and Evaluation, and Communication Tools. These are outlined in Figure 5.2.

Figure 5.2. The Exercise Developer Architecture

5.4.3 Evaluator

The evaluator console consists of 3 different modules: Exercise Observation Tools,

References, and Communication Tools. These are outlined in Figure 5.3.

Figure 5.3. The Evaluator Architecture

5.4.4 Researcher

The researcher console consists of 3 different modules: Evaluation Metrics, Player

Reports, and References. These are outlined in Figure 5.4.

Figure 5.4. The Researcher Architecture

5.4.5 Administrator

The administrator console is still in development. The administrator console

consists of 2 different modules: User Profiles and System Configuration. These are

outlined in Figure 5.5.

Figure 5.5. The Administrator Architecture

5.5 vEOC Consoles

The vEOC consoles consist of five components: the Player/Trainee, the Exercise

Developer/Exercise Controller, the Researcher, the Evaluator, and the Administra-

tor.

5.5.1 Player/Trainee

The main components of the Player/Trainee include two consoles that pop up on

the left and right sides of the screen when the user logs in to SimEOC: the main

panel and the exercise panel. See Figure 5.6.

5.5.1.1 Main Panel

The main panel is the place where trainees interact with the virtual EOC. It con-

tains an area for exercise background, a basic crisis information management system,

external links, and references. The exercise background includes contextual infor-

mation necessary for trainees to review before an exercise: player handbooks, initial

status, and the EOC layout.

Figure 5.6. The Trainee Console

The crisis information management system contains ba-

sic status boards and updates, resource requests, mission/tasks, and simple reports

concerning the common operating picture. References include incident action plans,

a map of the current situation, and a reference of position responsibilities. In the

future, it also will be the main location in which the trainee can interact with various

third-party tools such as WebEOC, Hurrevac, SALT, and SLOSH.

5.5.1.2 Exercise Panel

The exercise panel is the place where players interact with the exercise. The

main components of the exercise manager are communication tools, dashboards, and

a learning tutor.

5.5.1.3 Communication Tools

The communication tools enable the trainee to receive and react to injects. In

this section, we simulate various communication tools, including the telephone, the

cellphone, face-to-face communication, PDAs, and radios. These are the main tools

with which emergency managers communicate [97, 57]. We record all communication

so that trainers and researchers can go back and analyze it at the conclusion of the

exercise.

5.5.1.4 Dashboards

The dashboards provide real-time feedback to the trainee concerning the effect

of his/her decisions on various aspects of the crisis and the resources involved. For

example, one of the categories is cost to county. If the trainee makes a decision that

affects the acquisition of resources for that county, then the dashboard will update

to reflect the total cost of those resources.
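The sketch below illustrates the idea of such a dashboard update; the object, field names, and element id are hypothetical and are not SimEOC's actual implementation.

// Illustrative only: a running "cost to county" total, updated as the trainee's
// resource decisions arrive (for example, over the server push channel).
var costDashboard = {
  costToCounty: 0,
  node: null,
  init: function (nodeId) {
    this.node = document.getElementById(nodeId);
    this.render();
  },
  // Called whenever a decision event involves a resource acquisition.
  applyDecision: function (decision) {
    if (decision.unitCost) {
      this.costToCounty += decision.quantity * decision.unitCost;
    }
    this.render();
  },
  render: function () {
    this.node.innerHTML = "Cost to county: $" + this.costToCounty.toLocaleString();
  }
};

// Example: the trainee approves 20 cots at $45 each; the total rises by $900.
costDashboard.init("dashboard-cost");
costDashboard.applyDecision({ resource: "cots", quantity: 20, unitCost: 45 });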


5.5.1.5 Learning Tutor

One goal of the learning tutor is to provide real-time feedback to the trainee on

various aspects of his/her decisions. Another goal of the learning tutor is to analyze

the decisions of the trainee and to compare the decision to the “correct decision” based

upon standard operating procedures (SOPs) and expert subject matter opinion. The

learning tutor is in the early stages of development and currently, it offers assistance

in the form of a question and answer forum. As the development progresses, it also

will be a place where the trainee can receive feedback and advice on his/her decision

as well as hints and guidance about what action to take next.

5.5.2 Exercise Developer/Exercise Controller

The exercise developer/exercise controller has two main capabilities: developing

the exercise and controlling/simulating the exercise. These two functions are split

logically into the exercise developer and the exercise controller. They both are ac-

cessed through the main exercise developer console. See Figure 5.7.

5.5.2.1 Exercise Developer

The exercise developer contains all of the tools for exercise managers to create an

exercise. This console enables the developer to develop exercises through the com-

plete exercise lifecycle. Developers begin by creating exercise objectives and target

capabilities. They continue with the creation of scripts for the exercise. Finally, they

establish starting status and resources and create player and evaluator handbooks

[167, 57, 168]. In the future, it also will include a place for exercise evaluation guides

and hot washes. Finally, it includes a report module, where players can receive a

combination of feedback on the exercise from the learning tutor and from subject

matter experts.

Figure 5.7. The Exercise Developer Console (left side of figure). The Script Developer Interface (right side of figure).

5.5.2.2 Script Developer

This is the place where the exercise developer creates and edits training scripts.

Developers can add events to a script from scratch or from a database of previously

used injects.
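The sketch below illustrates the kind of record such a script might reduce to; the field names are hypothetical and do not reflect SimEOC's actual schema.

// Illustrative only: a script is a named collection of timed injects.
var exerciseScript = {
  name: "Category 3 Hurricane - Landfall Exercise",
  injects: [
    {
      id: 1,
      offsetMinutes: 5,                        // delivered 5 minutes after exercise start
      channel: "telephone",                    // telephone, radio, face-to-face, ...
      sender: "State Warning Point",
      recipients: ["Operations Section Chief"],
      message: "Storm surge flooding reported along the causeway.",
      expectedActions: ["Open flood status board", "Notify Infrastructure branch"]
    },
    {
      id: 2,
      offsetMinutes: 20,
      channel: "radio",
      sender: "Fire Rescue liaison",
      recipients: ["Logistics Section Chief"],
      message: "Shelter at Northside High requests 200 additional cots.",
      expectedActions: ["Create resource request"]
    }
  ]
};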

5.5.2.3 Report Module

The report module is the central location where exercise developers and instruc-

tors come to generate reports concerning the exercise and the trainees' reactions to

the various elements of the exercise. Some metrics in use include percentage of injects

missed and average time required to complete actions in response to an inject. This

module also enables one to save, create, and search through reports.

5.5.2.4 Exercise Controller

The exercise controller console is the main place where the exercise controller moderates

and controls the flow of the exercise. In paper-based scenario exercises, this is equiva-

lent to the SimCell. The control itself is realized through the simulation module. The

simulation module automatically processes and sends injects from the script stored

on the server to the clients who are participating in the exercise. It also manages

the response to the inject from the player. Additionally, it is the main location from

which exercise status inputs are created, pushed to, and received by the players.

There are 6 main controls that the exercise controller can use to control and

moderate the flow of the exercise: start, stop, resume, pause, next block, and fast-time.

Start begins the exercise. Stop terminates the exercise. Pause and resume enable one

to temporarily suspend and continue the exercise. This is helpful for instructors who

want to provide immediate feedback to the trainee, especially if they see an action

particularly out of line with current procedures and expertise. Next block advances

the next inject early. This is useful to challenge the trainee during long durations of


inactivity during a crisis (if, for example, the trainee finishes all expected actions

faster than required and expected). Fast time speeds up the simulation exercise time.

Again, this is helpful during long durations of inactivity during a crisis or when the

trainee completes his/her required actions ahead of schedule.
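The sketch below illustrates how these controls could be realized around a simple simulation clock; the names, the one-minute tick, and the structure are hypothetical rather than SimEOC's actual implementation, and it assumes a script object like the one sketched earlier.

// Illustrative only: a clock that delivers injects when their offsets come due.
function ExerciseClock(script, deliver) {
  this.script = script;        // e.g., the exerciseScript structure sketched above
  this.deliver = deliver;      // callback that pushes an inject to the players
  this.elapsedMin = 0;
  this.speed = 1;              // fast-time raises this multiplier
  this.timer = null;
  this.nextIndex = 0;
}

ExerciseClock.prototype.tick = function () {
  this.elapsedMin += this.speed;
  while (this.nextIndex < this.script.injects.length &&
         this.script.injects[this.nextIndex].offsetMinutes <= this.elapsedMin) {
    this.deliver(this.script.injects[this.nextIndex++]);
  }
};

// start also serves as resume: it restarts the one-minute tick if it is not running.
ExerciseClock.prototype.start = function () {
  var self = this;
  if (!this.timer) { this.timer = setInterval(function () { self.tick(); }, 60000); }
};
ExerciseClock.prototype.pause = function () { clearInterval(this.timer); this.timer = null; };
ExerciseClock.prototype.stop = function () { this.pause(); this.nextIndex = this.script.injects.length; };
ExerciseClock.prototype.fastTime = function (factor) { this.speed = factor; };

// next block: deliver the next inject immediately and jump the clock to its offset.
ExerciseClock.prototype.nextBlock = function () {
  if (this.nextIndex < this.script.injects.length) {
    var inject = this.script.injects[this.nextIndex++];
    this.elapsedMin = inject.offsetMinutes;
    this.deliver(inject);
  }
};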

5.5.3 Researcher

The Researcher console is the main place where researchers interact with the

vEOC. The Researcher console is comprised of Researcher Tools, Exercise Reports,

and References. See Figure 5.8.

5.5.3.1 Researcher Tools

This is the place where researchers can analyze various features of interest. Areas

of research include why, how, and in what manner trainees react to various inputs

and how their actions affect the crisis. We also are interested in how the organization

as a whole responds to various crises. In particular, SimEOC is especially set up to

study cognitive decision-making and knowledge management. To aid in collecting

information throughout the exercise, we have a dedicated area in the script editor

for the creation of research metrics. There are five default metrics that researchers

can manipulate when creating an exercise. These metrics are task non-routineness,

task novelty, component complexity, task difficulty, and task significance [139, 187].

Users can create custom metrics as well.
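As an illustration, a metrics record attached to an inject, and a simple report over the exercise logs, might look like the sketch below; the field names and the 1-5 scale are hypothetical, not SimEOC's actual schema.

// Illustrative only: the five default metrics (plus one custom metric) for one inject.
var injectMetrics = {
  injectId: 7,
  taskNonRoutineness: 4,
  taskNovelty: 3,
  componentComplexity: 5,
  taskDifficulty: 4,
  taskSignificance: 5,
  custom: { interagencyCoordinationLoad: 3 }
};

// Example analysis: average response time for injects at or above a given difficulty.
function averageResponseTime(logs, metricsByInject, minDifficulty) {
  var times = logs
    .filter(function (entry) {
      var m = metricsByInject[entry.injectId];
      return m && m.taskDifficulty >= minDifficulty;
    })
    .map(function (entry) { return entry.responseMinutes; });
  if (times.length === 0) { return null; }
  return times.reduce(function (a, b) { return a + b; }, 0) / times.length;
}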

5.5.3.2 Exercise Reports

This is the place where researchers create, store, and retrieve reports for trainees

and for the organization. It is similar to the exercise developer reports module in

terms of its functionality and design.

Figure 5.8. The Researcher Console (left side of figure). An Exercise Log (right side of figure).

5.5.3.3 References

References contain items that researchers may want to refer to in order to put the

exercise data in context. For example, references include the disaster map that was

used in the exercise as well as the exercise script.

5.5.4 Evaluator

The evaluator console includes tools for evaluating the trainee and items to ref-

erence to put their observations in context. See Figure 5.9.

5.5.4.1 Evaluation Tools

Evaluation tools are exercise evaluation guides in an electronic form. These are

generated directly from the target capabilities identified for an exercise.
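The sketch below illustrates how an electronic guide could be generated from a list of target capabilities; the capability and activity entries are illustrative examples, not the actual exercise evaluation guides used with SimEOC.

// Illustrative only: turn target capabilities into a checklist the evaluator fills in.
function buildEvaluationGuide(targetCapabilities) {
  return targetCapabilities.map(function (capability) {
    return {
      capability: capability.name,
      activities: capability.activities.map(function (activity) {
        return { activity: activity, observed: null, notes: "" };
      })
    };
  });
}

var guide = buildEvaluationGuide([
  { name: "Emergency Operations Center Management",
    activities: ["Activate the EOC", "Gather and provide information",
                 "Identify and address issues"] },
  { name: "Citizen Evacuation and Shelter-in-Place",
    activities: ["Direct evacuation operations", "Activate evacuation routes"] }
]);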

5.5.4.2 References

References include items that the evaluator can reference to aid in conducting

the performance assessment of the trainees. Example references include the target

capabilities for the exercise, the player responsibilities for each position, and the

disaster map for the exercise.

5.5.5 Administrator

The administrator console includes tools for managing users and configuring the

system for an organization. See Figure 5.10.

5.5.5.1 User Profiles

Here the administrator can create new users and edit profiles. The administrator

can also specify user roles within the system (e.g. administrator, researcher, exercise
developer, and regular user).

Figure 5.9. The Evaluator Console (left side of figure). An Exercise Evaluation Guide Analysis Sheet (right side of figure).

Figure 5.10. The Administrator Console (left side of figure). User Administration Options (right side of figure).

5.5.5.2 System Roles

This is where the administrator can configure the system for various structural

organizations of an EOC. That is, the administrator can specify what groups are

present and which individuals fall under which groups.

5.5.5.3 Virtualization

This feature is still in development. The virtualization module will be the place where exercise
managers will have the ability to create, edit, and delete virtual agents. That is, they
will have the ability to replace individuals who are unable to participate in the
exercise with virtual agents that stand in for their human counterparts.

5.6 System Capabilities

SimEOC includes approximately 75 different system capabilities. See Appendix F

for a detailed list.

5.7 User Manual

See Appendix L for a vEOC User Manual.

5.8 Summary

In this chapter, we examined the software and architecture behind SimEOC. In

particular, we discussed the 5 main consoles that comprise the system. We concluded

with system capabilities.


CHAPTER 6

TESTING

6.1 Overview

During the course of development, we performed a combination of manual and

automated testing. In this chapter, we discuss the testing, demos and evaluation.

6.2 Manual Testing

Throughout the development, we performed standard verification and validation

testing. This included tests of usability, functionality, and design requirements. In

addition to standard testing, we performed two semesters of extended testing during
Fall 2010 and Spring 2011.

6.3 Automated Testing

In the advanced stages of development, we conducted a small set of automated

tests to aid in functional testing. The automated tool used was Selenium. Selenium

is a Firefox add-on that does simple record-and-playback of interactions with the

browser [145].
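For illustration, the equivalent of one such recorded functional test is sketched below using the separate Selenium WebDriver bindings for JavaScript (a different Selenium interface from the record-and-playback add-on described above); the URL and element names are hypothetical placeholders.

const { Builder, By, until } = require('selenium-webdriver');

(async function loginSmokeTest() {
  const driver = await new Builder().forBrowser('firefox').build();
  try {
    // Hypothetical deployment URL and form field names.
    await driver.get('https://simeoc.example.edu/');
    await driver.findElement(By.name('username')).sendKeys('trainee1');
    await driver.findElement(By.name('password')).sendKeys('password');
    await driver.findElement(By.id('login-button')).click();
    // The test passes if the trainee console appears within five seconds.
    await driver.wait(until.elementLocated(By.id('main-panel')), 5000);
    console.log('Login smoke test passed');
  } finally {
    await driver.quit();
  }
})();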

6.4 Demos and Evaluation

During the course of development, we completed multiple usability tests (See Ap-

pendix K), functional tests (See Appendix J), and demonstrations. At the start of


the project in 2010, we worked with practitioners from the Miami-Dade EOC and

researchers from Emory University and Florida International University on usability

design. Halfway through the development, in July and Oct 2010, we performed a

complete set of usability and evaluation tests with the Miami-Dade EOC. In the ad-

vanced stages of development, in June and September 2011, we performed another set

of usability and evaluation tests with researchers and vEOC developers. In Febru-

ary 2014, we performed the largest exercise on the system to date. This exercise

consisted of 12 players, two of whom were remote. The goal of the exercise was to

moderately stress the system, to identify subtle bugs in the system, to confirm the
operation of the system to date, and to validate the design of SimEOC. The outcome
of the exercise was a clear success. Finally, in October 2014, we did a field

validation with Miami-Dade, Notre Dame, and Emory University at the Miami-Dade

EOC.

6.5 Master Test Plan

In the Fall of 2010, we did a semester of extensive testing on the website. This

testing included various functionality, aesthetics, design, and bug testing. Appendix I

contains a master test plan used in the evaluation.

6.6 Summary

In this chapter, we discussed testing, demos and evaluation of SimEOC through-

out the project.


CHAPTER 7

GAMES AND SIMULATIONS FOR EMERGENCY OPERATIONS CENTERS:

CHALLENGES AND OPPORTUNITIES

7.1 Overview

Practice in dealing with crises in exercise situations is critical to dealing well with

crises in emergency situations. However, training can be costly and time-consuming.

Utilizing games and simulations for crisis management can help. In this chapter, we

discuss challenges and opportunities in designing games and computer-based simula-

tors for Emergency Operations Centers.

7.2 Introduction

Behind the scenes of a normal day, emergency managers are constantly preparing

for the next emergency. Practice in dealing with crises in exercise situations is crit-

ical to dealing well with crises in emergency situations. Crews not only have to learn

their duties, they also have to learn how to use the crisis information management

systems in use at the EOC, and they have to learn how to coordinate across jurisdic-

tional boundaries. Yet training can be a major challenge for emergency managers.

Emergency Operations Centers have limited resources. Gathering everyone at the

EOC to conduct an exercise takes a lot of time and money. In fact, due to the cost

and personnel constraints, many organizations are limited to training on one or two

exercises a year. [77, 57, 123]

One way to achieve better training is through the use of games and simulations.


In the emergency management domain, a game is “a simulation of operations using

rules, data, and procedures designed to depict an actual or assumed real-life situa-

tion.” [42]. A game, in our context, is an electronic tool designed for the purpose

of practicing emergency operations. A simulation is the enactment of an exercise

in which certain operational components and resources are electronically simulated

rather than actually used during the exercise. Additionally, the simulation may

include artificial agents that simulate certain individuals. We call this type of simu-

lation computer-based simulation, as opposed to role-playing simulation.

Games and computer-based simulations are still growing in their roles in Emer-

gency Operations Centers [172]. There are few games and simulators on the mar-

ket for EOC managers. Examples of simulators that target Emergency Opera-
tions Center managers are ADMS-EOC, EMST, Virtual Staff Trainer, and SimEOC

[1, 46, 121, 149]. A search produced no known games for EOC managers. One reason

there are few games and simulators on the market may be because of the challenges

in building a training simulator for an EOC. There are many challenges. There are

also many opportunities. In the remainder of this chapter, we discuss some of the

challenges and opportunities with respect to Emergency Operations Center games

and simulations.

7.3 Challenges

How to simulate multiple EOC configurations?

EOCs differ in how they are set up organizationally. Each EOC is unique: it differs

in the number of individuals that are part of it and the structure of the physical

setup, which can be very important to an EOC [57, 148]. While there is no standard-

ized layout to the way an EOC is configured, there typically are four main ways to

structure an EOC. EOCs can be structured by Major Management Activities, by the

Incident Command System, by Emergency Support Functions, and as a MAC Group


[45]. How do we build a simulator with the flexibility to accommodate the various

organizational structures of an EOC?

How to simulate the person next to you?

Verbal communication and physical proximity in EOCs is essential. The ability to

lean over and talk to/coordinate with the person next to you is critical to success-

ful knowledge transfer and operations [57, 148]. EOCs are structured physically to

facilitate such communication and coordination. How do we simulate the physical

proximity and personal relationships of emergency managers with whom they have

to work closely?

How to simulate a Crisis Information Management System (CIMS)?

Why do we want to simulate a CIMS? Emergency managers do not want to train on a

system that is different than the one they will use during an emergency [57, 148, 159].

The amount of information that is available for emergency managers has grown

tremendously over the last decade. EOCs need a way to manage, store, organize,

filter, and retrieve that information. They do this through the use of a Crisis Infor-

mation Management System. Since most EOCs use CIMS these days, and since we

do not want to train on a system that is different from the one that we use every

day, we either need to use that CIMS in our training, build a CIMS that is generic

enough across all CIMS, or emulate the CIMS in use by the EOC. All of these cases

are challenging. Using operational CIMS means that we have to build the simulator

into the CIMS. One challenge for operational simulators is how to keep the training

section of the CIMS from accidentally mixing information with the real CIMS. To build
a generic CIMS is another major challenge. There is still much open research on what
a CIMS is [35, 80, 123, 159]. This makes it difficult to nail down what components

constitute a CIMS, let alone a standard simulation of a CIMS. Finally, there are


many different types of CIMS in use, and CIMS are very complex. In order to train

emergency managers across the country (and the world), we may have to build an

emulator for each CIMS that is in use.

How to capture the complexity of a person in a game or simulation?

People are very complex beings with a lot of knowledge and experience. How can

we capture that knowledge in a computer? Even more challenging is the fact that

EOCs can be composed of a large number of different organizations and liaisons. For

example, the Miami-Dade EOC consists of over 47 organizational liaisons and 35 mu-

nicipalities, in addition to the county’s emergency management staff and additional

coordinating elected officials and politicians.

How to simulate real-life fidelity?

Another challenge is how to simulate the conditions that occur during an emergency.

How can we create a simulation that taps the critical ecological conditions that

prepare organizational participants for their roles in times of crisis?

How to simulate rare and unpredictable conditions?

How do we go about creating a complex crisis? Crises such as earthquakes and

tsunamis can be rare, beyond the scope of our resources, and often have unexpected

consequences.

How to identify what needs to be taught?

Subject matter experts are important in determining teaching objectives. However, there are only a limited number of subject matter experts available, and each one is limited by his or her experience in different crises. The experiences of the experts that are programmed into the game or simulation will directly affect the learning quality of the game or simulation [151].

How to assess how well the trainee has mastered the material?

Crises can be complex and rare, and there are very few cases of real experience from

which people can draw on to use as a baseline. It is difficult to determine the value

of the training program so that we are not training for a crisis inaccurately.

How to establish and maintain trust in virtual teams?

When another person is simulated, how does one establish and maintain trust in

virtual teams or partially virtual teams? [4, 19, 103]

How to establish trust with a simulated person?

Establishing trust and relationships with a real person during a crisis is one of the

goals of exercises. How do trainees establish a trusting relationship with the people

they work with in a crisis when the exercises are conducted with a simulated person

or team?

How to establish multi-jurisdictional boundaries and roles with a simu-

lated person?

Oftentimes, during an exercise, emergency managers clarify roles and jurisdictional

territory and improve interorganizational coordination [42]. When jurisdictional is-

sues arise with a simulated person, how do we carry that into the real world?

What standards should be used for interoperability between simulators?

Very few simulators for Emergency Operations Centers even exist. How do we de-

velop interoperability between simulators [77, 122]?


7.4 Opportunities

Despite the challenges of creating Emergency Operations Center games and sim-

ulators, there are many opportunities and advantages. These are outlined below.

Games and simulations allow personnel to train more frequently than they

otherwise would be able to in live and face-to-face paper-based exercises.

Simulations, especially distributed simulations, allow individuals to train more often

at less cost to the emergency managers.

With games and simulations, we can replicate conditions in a greater variety of scenarios.

With games and simulations, trainers are not constrained by the resources involved in creating a simulation, both in terms of time and in terms of scenarios. This gives

trainees a wider breadth of knowledge and experience on which to draw in an actual

emergency.

Games and simulations allow instructors to challenge more people than

manual or paper-based simulations.

Games and simulations allow instructors to send more injects and to track more re-

sponses from more people than paper-based simulations allow.

Games and simulations allow organizations to record the exercises for later review or repetition.

With computer-based games and simulations, we can view the state of the exercise

at any point in time. Games and simulations also offer the ability to easily repeat

the exercise, which offers many opportunities for research as well [172].

Games and simulations are less costly than exercises.


Gathering individuals at the EOC for an exercise is costly and time-consuming. Dis-

tributed simulations allow individuals to train without being physically present at

the EOC.

Games and simulations offer substantial adaptive capability to adjust to

heterogeneity in learner skill levels, knowledge, and abilities.

With games and simulations, one can vary the level of challenge for each of the

trainees. For example, some individuals can be at a beginner level and other individ-

uals can be at an advanced level [86, 146].

Games and simulations allow organizations to train individuals, selective

portions of the organizational hierarchy, and the entire organization.

Games allow organizations to focus on a single individual or a small group of indi-

viduals. Simulations, on the other hand, allow an organization to focus on groups

within the organization or the organization as a whole. For example, simulations can

train selective portions of the organizational hierarchy like a public safety group or

an infrastructure group.

Games and simulations allow emergency managers to train new personnel

without being in the middle of a disaster.

New personnel should not find themselves in a situation where they are receiving

training in the middle of a crisis. In the middle of a crisis, the foremost concern is

on managing the disaster, not on training individuals. Computer-based games and

simulations allow individuals to train before a disaster strikes.

Games and simulations allow trainees to reap the benefit of a collective

knowledge base of expertise.


In creating the game or simulation, a wide knowledge base of subject matter experts

is assembled, which allows trainees to benefit from a larger knowledge base, rather

than that of one or two individual exercise controllers [151].

Games and simulations allow emergency managers to engage in a variety

of pedagogical forms.

Different forms of instruction range from simple memorization (e.g., learning ICS

terminology, local use of acronyms) to extensive event simulations requiring complex

decision making, communication, and coordination within a dynamic group setting

(e.g., distributing scarce resources under conditions of high uncertainty). These can

be easily programmed into the game or simulation.

Games and simulations allow immediate feedback to trainees and organi-

zations.

Feedback can be delayed in non-computer-based exercises while the evaluators gather

their paperwork and write their conclusions. With computer-based games and simu-

lations, feedback can be immediate [151].

Games and simulations allow emergency managers to posit and assess im-

pacts of policy in a safe environment before a disaster occurs.

Games and simulations do not have real-world consequences, apart from how accurately they prepare trainees for an actual crisis. This provides a safe environment for the trainees to engage in learning

and for the organization to assess the impacts of policies and procedures without the

consequences of a real situation [69].

Games and simulations can be fun for the trainees (and the instructors).

Games and simulations can offer a little fun to the trainees and instructors [172].


7.5 Summary

In this chapter, we have outlined some of the challenges of creating computer-

based games and simulations for Emergency Operations Center managers. We also

discussed the opportunities and advantages involved in games and simulations.


CHAPTER 8

LEVERAGING WEBEOC IN SUPPORT OF THE HAITIAN RELIEF EFFORT:

INSIGHTS AND LESSONS LEARNED1

8.1 Overview

The magnitude seven earthquake that rocked Haiti was a devastating disaster for the small country [171]. Haiti was not alone in this crisis, however. When the

earthquake struck, thousands of US citizens responded by donating money, resources,

people, and time to aid in the disaster relief. To respond to the incident and to cre-

ate a secure information-sharing environment, the Florida Miami-Dade County and

State Emergency Operations Centers (EOC) were activated. The main information

system in use at the Miami-Dade EOC is WebEOC, a web-based crisis information

management system that aids in secure coordination and collaboration among EOC

staff, liaisons, and emergency managers. As a result of the earthquake response ef-

forts using this system, we have identified seven main insights and lessons learned

with respect to crisis information management software. In this chapter, we discuss

Miami-Dade's role in the Haitian relief efforts and how this led to these insights and

lessons learned.

1Material from this chapter has appeared in the following publication: Nikolai, C., Johnson, T., Becerra-Fernandez, I., and Madey, G. (2010). Leveraging WebEOC in support of the Haitian relief effort: insights and recommendations. The 7th International Community on Information Systems for Crisis Response and Management (ISCRAM) Conference, Seattle, WA.


Figure 8.1. Incident Command System [41]. The command staff and general staff assist the Incident Commander during an emergency.

8.2 Introduction

An Emergency Operations Center is a secure location where emergency officials

gather to prepare for, manage, and coordinate the response to an incident. In day-

to-day operations, emergency management staff are involved in preparedness and

mitigation strategies for future crises. However, when a disaster strikes, the staff

drop their day-to-day roles and take on the role assigned to them by the Incident

Commander. This role usually involves leading a section or branch of the incident

command system or ICS [85]. There are five main branches in accordance with ICS. The main divisions are incident command, the operations section, the planning section, the logistics section, and the finance/administration section. Leading the general staff and assuming

responsibility for the incident is the Incident Commander. The Incident Commander

also has additional support staff, called the command staff. The command staff includes a safety officer, a public information officer, and a liaison officer

[74]. See Figure 8.1.


8.2.1 Haitian Relief Effort: Sequence of Events

On Tuesday, January 12, 2010, the Miami-Dade EOC staff were going about their

daily operations - conducting meetings, taking steps for H1N1 mitigation, and prepar-

ing for the Probowl and Superbowl. Then a magnitude seven earthquake struck the

small country of Haiti [171]. Just as quickly, the Miami-Dade EOC and the state of

Florida sprang into action. Both the Miami-Dade EOC and the State EOC went to

level 3 (a heightened state of monitoring) with the county duty officer and the state

watch office monitoring the situation. Urban Search And Rescue, Florida Task Force

1, and Florida Task Force 2 were notified of the situation. Also the U.S. Department

of State began response preparations, and Urban Search and Rescue teams from Vir-

ginia and California were mobilized.

The next day, as more information became available about the incident, the

Miami-Dade County Mayor, Miami-Dade County Commissioner, Florida State Representatives and a Congressman from the State of Florida, the Miami-Dade County Chief

of Fire, Miami-Dade County Chief of Police, the director of the American Red Cross,

the director of Catholic Charities and various other elected officials gathered at the

EOC to discuss the impact to the local community. As a result of the meeting, the

EOC activated to level 2 in order to support donation preparedness and documenta-

tion of surplus county resources to be deployed to Haiti. Level 2 is a partial activation

in which only the Lead Agencies who are needed in the incident response are required

to report to the EOC [93].

As the incident unfolded on Thursday, EOC staff took on the additional mission

to support the State Department to repatriate US citizens and permanent residents

of the United States. The EOC continued planning for the deployment of donations and

county resources to Haiti. In addition, the EOC established and deployed a county

relief effort website [94]. Comcast and AT&T also began working with the county

to facilitate a multiple-line phone bank for toll-free calls to Haiti for individuals try-


 

Figure 8.2. Haiti Relief Effort Timeline. This is the sequence of events that occurred as the EOC began the Haiti relief operations.

ing to reach family members [94]. Miami International Airport and Homestead Air

Reserve Base began receiving U.S. citizens from Haiti for repatriation.

On Friday, the EOC went to 24-hour operations as it continued cataloguing both

county and non-county deployable and donated resources. It also continued work-

ing with the Department of State and the Miami-Dade Aviation Department on the

repatriation plan. The EOC next began coordinating with appropriate Federal and

State agencies in preparation for a potential mass migration event resulting from the

earthquake. On Sunday evening, as the incident stabilized, the EOC was able to resume daytime-only operations with a small contingent of staff providing 24-hour support

for Homestead Air Reserve Base to increase repatriation effort coordination [94]. See

Figure 8.2 for a timeline of these events.

8.2.2 Tracking Resources In The Relief Efforts

The main crisis information management software in use at the Miami-Dade EOC

is WebEOC. Miami-Dade, along with several other counties in the region, began us-

ing WebEOC in early 2009. WebEOC, a product released by ESi Acquisition Inc.,


is a web-based crisis information management system that enables agencies and staff

within the EOC and among EOCs to share real-time information in a secure manner

[36]. One of the main features of WebEOC is that anyone with proper administrative

rights can create a board, which is the main vehicle for sharing information. Boards

are interfaces with various inputs and views that enable individuals with appropriate

access rights to input, modify, and view data that is stored in a backend database.

Each board has various levels of access rights for each registered user [49].
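To make the board concept concrete, the following minimal sketch (in Python) models a board as a named set of fields backed by a list of records, with a per-user access level checked on every read and write. The class, field, and permission names here are illustrative assumptions and do not reflect WebEOC's actual implementation or API:

from __future__ import annotations
from dataclasses import dataclass, field
from enum import Enum


class Access(Enum):
    # Hypothetical access levels; real CIMS products define their own.
    NONE = 0
    READ = 1
    WRITE = 2


@dataclass
class Board:
    """A board: named fields whose records would live in a backend table."""
    name: str
    fields: list[str]
    records: list[dict] = field(default_factory=list)
    acl: dict[str, Access] = field(default_factory=dict)  # user -> access level

    def post(self, user: str, record: dict) -> None:
        # Only users granted WRITE may add records to this board.
        if self.acl.get(user, Access.NONE) is not Access.WRITE:
            raise PermissionError(f"{user} cannot write to {self.name}")
        self.records.append({k: record.get(k) for k in self.fields})

    def view(self, user: str) -> list[dict]:
        # READ or WRITE users may view the board's records.
        if self.acl.get(user, Access.NONE) is Access.NONE:
            raise PermissionError(f"{user} cannot view {self.name}")
        return list(self.records)


# Example: a significant-events board readable by a liaison, writable by EOC staff.
events = Board("significant_events", ["time", "description", "entered_by"])
events.acl.update({"eoc_planning": Access.WRITE, "county_liaison": Access.READ})
events.post("eoc_planning", {"time": "14:05", "description": "Shelter opened", "entered_by": "ESF-6"})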

Since the EOC acquired the software in April 2009, it has gone through various

exercises and has been tweaking its implementation [59, 96]. However, this was

the first time the EOC used WebEOC in support of an actual incident. When the

earthquake occurred, the WebEOC administrator created a unique incident space

for the Haiti Relief effort. After that, staff and liaisons were able to use situational

report boards, position logs, significant events, and various additional boards to help

organize the common operating picture. However, in addition to the normal boards

the EOC had created a priori, it needed an additional way to track resources. In

particular, a way to track donations and repatriation efforts was necessary, especially

information such as when the airplanes were arriving, how many passengers each plane

was carrying, at which airport the airplane was landing, how many passengers were

injured, and how many were given food. Second, the EOC needed a way to track

county and non-county resources that were being donated, what items were being

donated, and how many of each item. Taking the basic boards, the EOC administra-

tor modified the fields to create the two additional boards that were needed to track

these efforts. Using WebEOC, the administrator completed the boards in less than 30 minutes. See

Figures 8.3 and 8.4.
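As a rough illustration of how such boards might be specified on the fly, the field lists below paraphrase the tracking needs described above; the names are hypothetical and are not the EOC's actual board schemas:

# Hypothetical field lists paraphrased from the tracking needs described above.
transportation_board_fields = [
    "flight_arrival_time",    # when the airplane is arriving
    "passenger_count",        # how many passengers the plane is carrying
    "arrival_airport",        # Miami International or Homestead Air Reserve Base
    "injured_passengers",     # how many passengers were injured
    "passengers_given_food",  # how many passengers were given food
]

donation_board_fields = [
    "source",                 # county or non-county donor
    "item_description",       # what item is being donated
    "quantity",               # how many of each item
]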

The EOC also had to provide logistical support, that is, the ability

to track and deploy county resources. The two vehicles of choice were the Mis-

sion/Task board and the Resource Manager plug-in. Each one had strengths and


 

Figure 8.3. The Transportation Board. This board was used to track airplanes and passengers coming into Miami International Airport and Homestead Air Reserve Base. This is the list view.

weaknesses. The Resource Manager had the ability to allocate and deploy items.

However, it had pre-defined resource typing which limited the flexibility of the plug-

in [89]. In addition, the Resource Manager had training data in the board, which

could not be deleted at the local level. Thus, ESi had to delete the data for the EOC

before it could begin using the board. The Mission/Task board allowed deletion of

data at the local level; however it did not have the ability to deploy and track assets.

The EOC began by using Mission/Task, and then, after ESi was able to delete the

current data from the Resource Manager, the EOC began using the Resource Manager as

well.

Finally, the EOC needed Geographic Information System (GIS) maps. The GIS

Unit Leader already had a collection of county resources geocoded in ArcMap. Thus,

when the decision was made for all police stations, fire stations, and libraries to be

dropoff points for local donations for the Haiti Relief Effort, she was able to create

dropoff point maps in a matter of minutes.


 

Figure 8.4. The Donation Board. This board was used to track non-county resources. This is the list view.

8.2.3 Insights And Lessons Learned From The Haiti Relief Efforts

This activation yielded several insights and lessons learned with respect to crisis information management software.

1. The need for on-demand boards, that is, boards that can be built on-the-fly.
Crises, by definition, are rare events, and the best laid plans sometimes need tweaking. In its incident management preparation, the EOC did not anticipate the need for additional boards in the middle of a crisis. With WebEOC, the EOC was able to build the needed boards quickly and import data to those boards quickly as well. This greatly contributed to the success of the relief operations.

2. The need for web-accessible boards that can be accessed from any computer, any time, anywhere.
During this crisis, the EOC did not anticipate the need for additional logins for individuals who were not assigned to the EOC to access WebEOC and input data. Although it was not anticipated, the EOC was able to quickly provide individuals at Homestead Air Reserve Base and partner agencies restricted access to the system in order to coordinate the relief efforts.

3. The need for intuitive, easy-to-use software that can be learned in a matter of minutes, and the ability to provide different levels of access controls to various users.
The individuals who needed access to the transportation board were not regular liaisons at the EOC. They had never used WebEOC before and had to learn how to input and view the data they needed to share with the Miami-Dade EOC. The short learning curve of WebEOC enabled these new liaisons to accomplish this. Because the EOC allowed non-EOC personnel to use its system, it also needed to have various read-only access permissions to allow the individuals to see the same data that the staff and lead agencies were seeing. Even within the EOC, various branches needed write access while restricting others from modifying the data.

4. The ability to jump from day-to-day operations to an incident quickly, and the ability to separate day-to-day and emergency data quickly and easily.
Unlike some events, like hurricanes, the earthquake struck Haiti with little notice. The EOC needed the ability to change from day-to-day operations to the incident in a matter of minutes. It also needed the ability to separate the incident space from day-to-day data and from other incidents in the system.

8.2.4 Recommendations For Further Improvement

1. The ability to recover from failure quickly.
Luckily, there was not a software failure during this incident. However, the rush of the relief operations dictated that if there were a failure, the EOC would need to access its backup system within minutes of the failure. With WebEOC, it is estimated to take approximately 30 minutes to bring the backup system online. Additionally, if the EOC were not able to recover a backup database or if the backup database was not recently synchronized with the current database, then the EOC would not be able to access the data that previously had been entered into the system. The entire common operating picture would have been lost.

2. The ability to integrate the crisis information management system with GIS.
In the case of the Haiti relief efforts, there were many data for which it was useful to geocode and map. One example was the location and operating hours of all of the public donation drop-off points in the county. Although the WebEOC administrator was able to import this information into the board and geocode it to share with others, there was no direct link between the crisis information management system and the GIS software, and this would have been beneficial.

3. The ability to create various reports.
The transportation board had a number of fields which were being entered. However, when displaying a list of each record side-by-side, it was not practical to view all of the fields. Therefore, the EOC needed a way to create various reports quickly and easily that would print out the fields of interest.


8.3 Summary

Thousands of people died and millions were affected by the Haitian earthquake.

In this chapter we have discussed the timeline of events that occurred at the Florida

Miami-Dade County EOC as the situation progressed. Next we discussed how We-

bEOC provided an easy-to-use and easy-to-modify crisis information management

system for the EOC during the incident. Finally, we concluded with seven insights

and lessons learned concerning crisis information management systems in response

to emergency events.


CHAPTER 9

LEVERAGING WEBEOC AND GIS IN SUPPORT OF EMERGENCY

RESPONSE

9.1 Overview

Every year, the National Football League concludes the season with a Probowl and

a Superbowl. These are huge events that spawn many local parties and celebrations.

In early 2010, both events were held at Sun Life (Dolphins) Stadium in Miami,

Florida. The Miami-Dade Emergency Operations Center (EOC) was activated in

support of these events. This chapter is about how the Miami-Dade EOC leveraged

WebEOC and Geographic Information Systems (GIS) in support of the Probowl and

the Superbowl in 2010.

9.2 Introduction

Geographic information systems (GIS) are computer systems for capturing, stor-

ing, querying, analyzing, and displaying geospatial data [23]. Geospatial data are

data that include attributes and characteristics in addition to location information

[23]. For many years, GIS was considered too difficult and too expensive to use [23].

However, with advances in computer hardware and software and better graphical user interfaces, GIS has slowly made its way from an optional enhancement to

the mainstream, and it now is playing a prominent role in Emergency Management,

including the formation and display of the common operating picture [23, 38].

An Emergency Operations Center is a secure location where officials gather to


Figure 9.1. Incident Command System [41]. The command staff and general staff assist the Incident Commander during an emergency.

prepare for, manage, and coordinate the response to an incident. They are organized

using the incident command system or ICS. There are five main branches in accor-

dance with ICS. The main divisions are Incident Command, Operations, Planning,

Logistics, and Finance/Administration. Leading the general staff and assuming re-

sponsibility for the incident is the Incident Commander. The Incident Commander

has additional support staff as well, called the command staff, which includes a safety officer, a public information officer, and a liaison officer. See Figure 9.1.

The main crisis information management software in use at the Miami-Dade EOC

is WebEOC [36]. Miami-Dade, along with several other counties in the region and

throughout the state of Florida, began using WebEOC in early 2009. WebEOC,

a product released by ESi Acquisition Inc., is a web-based crisis information man-

agement software system that enables agencies and staff within the EOC to share

real-time information in a secure manner. One of the main features of WebEOC is

that anyone with proper administrative rights can create a board, which is the main

vehicle for sharing information. Boards are interfaces with various views that enable


individuals with proper access rights to input, modify, and view data that is stored

in a backend database. Administrators assign various levels of access rights for each

board and each registered user.

9.2.1 Leveraging WebEOC in Support of the Probowl and the Superbowl

In 2010, both the Probowl and the Superbowl were held in Miami, Florida.

The Miami-Dade EOC was activated in support of these events. To prepare for these

events, the EOC built several boards using WebEOC. In particular, there are two

main boards: one for incidents that are directly related to the Superbowl or Probowl,

and one for events that are congruent to the Superbowl or Probowl (e.g., Superbowl committee meetings, Probowl meetings, Celebration Series events, and local parties). For

events that occur inside the stadium that are directly related to the Superbowl or

Probowl, the main board, called Stadium Incident Viewer, stores a list of all incidents.

If an incident should occur, agency liaisons enter the corresponding details, which

include the type of incident, the date/time of the incident, the specific location of

the incident (e.g. which seating section), severity of the incident, a brief description

of the event, any actions taken, and the status. See Figure 9.2.

For incidents that are congruent to the Superbowl or Probowl, the EOC has

created a Special Events Board in which it stores data similar to the Stadium Incident

Viewer. The Special Events Board is pre-populated with known parties and local

events that could morph into an incident.

To aid in forming the most up-to-date common operating picture, Miami-Dade

County is employing a new system called the Florida Interoperable Picture Process-

ing for Emergency Response (FLIPPER). FLIPPER is a GIS system modeled after

the Virginia Interoperability Picture for Emergency Response (VIPER), which allows

one to dynamically create, geo-process, and view incident data on a map [173].

Although FLIPPER uses the same FLEX infrastructure to create the interface,


 

Figure 9.2. Stadium Incident Board. This board is used to track incidents that occur within Land Shark Stadium. This is the list view. Here we see significant details of a fire incident.

FLIPPER is unique because it leverages the primary WebEOC database as the back-

end database. ESRI, in conjunction with Miami-Dade County, used the underlying

FLEX architecture to link the events to the primary WebEOC database. In par-

ticular, they linked the data from road closures and special events. That is, when

someone enters a new road closure in the WebEOC Road Closure Board, then a

special marker indicating a new road closure automatically appears in the FLIPPER

interface. Likewise, when someone enters a new Special Event, that event location

is pinpointed on the FLIPPER interface. One can further click on the special event

or road closure icon to obtain specific details about the event. See Figure 10.1. On

the top left corner, the 4 icons represent geo-processing tools that one can use to

interact with the map, specifically, navigation, map, tools, and help menus. On the

right hand side of the figure, there are several widgets that the users also can inter-

act with. First there are the Road Closures and Special Events. Road Closures are

indicated by a red cautionary triangle and Special Events are indicated by a yellow

cautionary triangle. By selecting a particular event with the mouse, one can see

additional details in the form of a popup box (upper middle screen). In addition to


 

Figure 9.3. Florida Interoperable Picture Processing for Emergency Response (FLIPPER) screenshot. This GIS software links with the backend WebEOC database.

these details, one can look at the weather at this location or get a more detailed view

of the location through Bing maps. The red rectangle, in this figure, is an area which

has been selected to gather population statistics (shown in the right hand column

pie chart). The user box in the lower right column is where individuals can select Report By Exception (RBE) for special events. RBE is the ability to isolate only the items that

are in the range of an event and eliminate all of the remaining items. For example,

one can call up a list of all critical facilities that are within a specified mile radius

of an incident. All remaining critical facilities are not shown on the map. This aids

in creating an intelligent GIS footprint because it helps emergency managers filter

out unnecessary information that may obstruct the common operating picture and

hinder critical decision-making.
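As a simple sketch of the report-by-exception idea (not FLIPPER's actual implementation), the following Python function keeps only the facilities that fall within a specified radius of an incident, using a great-circle distance; the names and coordinates are illustrative only:

import math


def distance_miles(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two points, in miles."""
    r = 3958.8  # mean Earth radius in miles
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2)) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))


def report_by_exception(incident, facilities, radius_miles):
    """Keep only the facilities within radius_miles of the incident; drop the rest."""
    return [f for f in facilities
            if distance_miles(incident["lat"], incident["lon"], f["lat"], f["lon"]) <= radius_miles]


# Illustrative data: two critical facilities, one incident near the stadium.
facilities = [
    {"name": "Fire station (nearby)", "lat": 25.958, "lon": -80.239},
    {"name": "Hospital (downtown)", "lat": 25.790, "lon": -80.210},
]
incident = {"lat": 25.957, "lon": -80.240}
nearby = report_by_exception(incident, facilities, radius_miles=2)  # only the fire station remains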

Additional benefits of FLIPPER are the light, web-based, user-friendly interface

that can perform simple queries and geo-processing. One can outline a polygon on


 

Figure 9.4. Florida Interoperable Picture Processing for Emergency Response (FLIPPER) Social Networking Screenshot. FLIPPER links with social networking sites like Flickr and Twitter. In this figure, a simple search for Haiti calls up a score of pictures others have taken and posted on Flickr.

the base map, for example, and the interface will return statistics about the pop-

ulation of that area. Another important aspect of FLIPPER is that it integrates

with existing social network tools, such as Twitter and Flickr [58, 161]. The FLEX

infrastructure also supports integration with Facebook [53]. This is important be-

cause social networking often has more accurate and up-to-date information than the

systems that first responders are using (often radios) [38]. That is, individuals at

the scene have mobile phones and digital cameras from which they can gather and

share information more accurately and quickly than the standard first-responder approach to reporting an emergency: a radio relay of the incident and scene to the

command post [38]. For example, in a Flickr search for photos with the tag Haiti

after the Earthquake, one can call up a score of photographs that others have taken

and posted. See Figure 9.4.


9.2.2 Insights for Further Integration

1. The Superbowl and Probowl highlight the need to integrate GIS with respect to crisis information management software.
The ability to use the existing WebEOC database as the backend database for FLIPPER is the main reason that the Miami-Dade EOC has adopted this modified version of VIPER; rather than having two separate systems - a GIS system and a crisis information management software system - FLIPPER seamlessly integrates both systems into one. GIS, when used in this manner, helps to form a more complete common operating picture and aids in better critical decision-making.

2. Social networking and mobile-based GIS are taking a greater role in GIS and emergency management.
The second reason that the Miami-Dade EOC has adopted the FLEX interface is that it allows integration with existing social networking tools such as Flickr, Facebook, and Twitter. It also interfaces with Bing [14] and Yahoo Traffic, and it contains many widgets, including one that displays population statistics.

9.3 Summary

In this chapter, we discussed how the Miami-Dade EOC has leveraged WebEOC

and GIS in support of the Probowl and the Superbowl. First we discussed the internal and external event stadium boards created in WebEOC. Next, we discussed how we

integrate Geographic Information Systems with WebEOC to form an interoperable

common operating picture. We also discussed how Flex on top of ESRI's

ArcGIS Server can be used to create an intelligent common operating picture foot-

print. Finally, we concluded with some insights and recommendations with respect

to GIS and crisis information management software systems.


CHAPTER 10

DESIGN PRINCIPLES FOR MODERN CRISIS INFORMATION

MANAGEMENT SYSTEMS: FROM CLOSED LOCAL SYSTEMS TO THE WEB

AND BEYOND

10.1 Overview

Since Hurricane Katrina, a great deal of research has gone into improving crisis man-

agement through the use of crisis information management systems (CIMS). There

has been much interest in how to design dynamic CIMS, particularly with respect

to web-based emergency management systems. In our research, we set out to de-

sign a distributed web-based training and research tool for emergency managers and

scholars. Along the way, we identified several additional design principles for modern

CIMS. This chapter outlines the resulting recommendations for modern CIMS.

10.2 Introduction

A crisis is defined as “any event that threatens to, or actually does, inflict dam-

age to property or people.” [39]. Crises can be small or large in scale. In large

scale crises, there usually is a significant probability of extreme danger and highly

unpredictable outcomes [83]. Small or large scale crises can occur at any time, and

the consequences can be enormous. At the height of the H1N1 influenza outbreak

between 2009 and 2010, 61 million people became infected with this virus. In addi-

tion, H1N1 caused approximately 274,000 hospitalizations and 12,500 deaths [22].

In 2004, the Indian Ocean earthquake and tsunami affected approximately 5 million


people in Indonesia, Sri Lanka, India, and the surrounding areas. Over 280,000 peo-

ple died, and more than 1 million people were displaced (World Health Organization,

n.d.). In the US, Hurricane Katrina was one of the most expensive and devastating

natural disasters in American history [134]. Over half a million people were affected

by the hurricane, and the US energy infrastructure was severely damaged [134]. In

2012, Hurricane Sandy swept through the Northeastern United States. Seventy-two

people died and 8.5 million people lost power. More than 650,000 houses were dam-

aged or destroyed [15]. Hurricanes Katrina, Sandy and other crises clearly show

the importance of disaster preparedness. Indeed, much can be improved, especially

with respect to training and collaboration among federal, state, and local govern-

ments [2, 32, 35, 71, 134, 178]. Specifically, one area that can be improved is the

design of crisis information management systems (CIMS) [21, 61, 63, 64, 127, 159].

The remainder of this chapter is structured as follows. We begin by discussing

current theory on the design of crisis information management systems. Next we

discuss our findings and recommendations for modern CIMS. Finally, we conclude

with implications and future directions of modern CIMS.

10.3 Background

10.3.1 Evolution of CIMS: From Closed Local Systems to the Web and Beyond

The use of crisis information management systems to aid in emergency manage-

ment is a fairly recent initiative. In fact, it is still growing and evolving. The evolution

of CIMS mimics the evolution of the web. Like the web, CIMS began its development

with systems that centered around organizing, storing, and retrieving information.

These early systems used tools such as Microsoft Word, Excel, and Access. Next, we

began to see the emergence of information management systems to aid in maintaining

a common operating picture and decision support. These systems were closed and


isolated; each organization had its own system, but they were not necessarily con-

nected with other systems in other jurisdictions. Examples of these types of systems

include Sharepoint, E-Team, and WebEOC. Hurricane Katrina marked a turning

point in using information technology to aid in emergency management. Officials

realized that they could harness information technology to a greater extent in the

relief efforts. Systems began to focus on interconnecting closed, local systems with

each other [108]. Here we see the emergence of crisis information management sys-

tems. Examples of these types of systems include enhanced versions of E-Team, and

WebEOC (with a WebFusion augmentation). The Haiti Earthquake in 2010 marked

another pivotal point in emergency management. Organizations began using Web 2.0

technologies, digital media, and smart phones in the emergency response. Web 2.0

technologies feature wikis, SMS/texting, social media, and other social networking

and collaboration tools such as Flickr, Facebook, and Twitter. Additionally, through

the use of the web, digital media, and smart technologies, emergency managers be-

gan leveraging the public in collaboration and participation in the response efforts.

“Web 2.0 principles are strikingly applicable to disaster relief since a stricken pop-

ulation can offer the most immediate information about its own conditions. These

principles advance the ability of individuals to dialogue and partner with relief agen-

cies, rather than being consigned to the role of passive victims [...] Information may

be gathered and assembled in an open, democratic fashion.” [108]. Web 2.0 sys-

tems bring together existing trends in information technology for the humanitarian

response. These systems include increased usage of digital media technologies and

smart phones and tablets by responders to manage humanitarian information, en-

hanced reporting and distribution of information through local mass media to help

aid recipients, and customized innovative digital media tools and platforms applied to

coordinate new forms of collective action and problem-solving [108]. A third pivotal

area that emerged in the wake of the Haiti earthquake is with respect to remote pub-


lic volunteers or digital volunteers [154]. In addition to formal emergency response,

these digital volunteers use Web 2.0, smart phones and tablets and social media tools

to self-organize participation in the response. This self-organizing participation in

emergency response is also known as crowdsourcing. Tools digital volunteers use to

aid in the response include social media, SMS, and reporting and mapping services

such as Ushahidi, GeoCommons, and OpenStreetMap [126, 130, 154, 156, 189].

10.3.2 Summary of Current CIMS Design Principles

In a foundational paper in 2004, Turoff et al. [159] describe principles for the

design of a dynamic emergency response management information system (DERMIS).

In the first section of this paper, they outline 9 premises for DERMIS. These premises

have been derived from the historical experiences of the former Office of Emergency

Preparedness (OEP). These are reproduced below.

• Premise 1 System Training and Simulation: An emergency system that is not used on a regular basis before an emergency will never be of use in an actual emergency.

• Premise 2 Information Focus: People responding to an emergency are working 14-18 hours a day and have no tolerance or time for things unrelated to dealing with the crisis.

• Premise 3 Crisis Memory: Learning and understanding what actually happened before, during, and after the crisis is extremely important for the improvement of the response process.

• Premise 4 Exceptions as Norms: Almost everything in a crisis is an exception to the norm.

• Premise 5 Scope and Nature of Crisis: The critical problem of the moment is the nature of the crisis, a primary factor requiring people, authority, and resources to be brought together at a specific period of time for a specific purpose.

• Premise 6 Role Transferability: It is impossible to predict who will undertake what specific role in a crisis situation. The actions and privileges of the role need to be well defined in the software of the system and people must be trained for the possibility of assuming multiple or changing roles.

• Premise 7 Information Validity and Timeliness: Establishing and supporting confidence in a decision by supplying the best possible up-to-date information is critical to those whose actions may risk lives and resources.

• Premise 8 Free Exchange of Information: Crises involve the necessity for many hundreds of individuals from different organizations to be able to freely exchange information, delegate authority, and conduct oversight, without the side effect of information overload.

• Premise 9 Coordination: The crux of the coordination problem for large crisis response groups is that the exact actions and responsibilities of the individuals cannot be pre-determined.

Based on these premises, Turoff et al. outline 7 resulting foundational require-

ments for a CIMS.

• Extremely easy to learn via training and exercises

• Useable by people who will have an understanding of their roles and responsibilities in an emergency environment

• Focus on a concise and self-evident design demanded by the small screen orientation and the need to minimize learning

• Allows the individual users a high degree of tailoring, filtering, and focusing of the interface tailored to their specific roles and responsibilities

• Supports planning, evaluation, training, exercises, and system updating and maintenance between crisis events

• Allows the operation of the response function without the need for a single operational physical center except for the operation and backups for the computer hardware and software acting as a server and distributed resource databases for this operation

• Will be designed as a structured communication process independent of the nature of a particular crisis

To accomplish this, Turoff et al. suggest 5 specific criteria for the design of the

interface for a group communication system. Specifically, the interface should include

metaphors, human roles, notifications, context visibility, and hypertext. Metaphors

are the mental models of a system that a user can learn in order to understand the

system more easily. Human roles enable the separation of responsibilities in a CIMS.


Individuals assigned to roles have associated authorizations with respect to what

actions they can take in a CIMS and what information they are allowed to access.

Notifications are items that need to be tracked concerning actions of others and

current events in the system. These items are then sent to interested and authorized

parties in the system. Context visibility allows a user to mark and follow particular

events that they are interested in. Hypertext refers to bidirectional links which have

semantic meaning and which can automatically change at any time as a result of

actions from users. Further, Turoff et al. discuss 8 resulting general design principles

and specifications. These are:

• Design Principle 1 - System Directory: The system directory should provide a hierarchical structure for all the data and information currently in the system and provide a complete text search to all or selected subsets of the material.

• Design Principle 2 - Information Source and Timeliness: In an emergency it is critical that every bit of quantitative or qualitative data brought into the system dealing with the ongoing emergency be identified by its human or database source, by its time of occurrence, and by its status. Also, where appropriate, by its location and by links to whatever it is referring to that already exists within the system. (A small illustration of such tagging follows this list.)

• Design Principle 3 - Open Multi-Directional Communication: A system such as this must be viewed as an open and flat communication process among all those involved in reacting to the disaster.

• Design Principle 4 - Content as Address: The content of a piece of information is what determines the address.

• Design Principle 5 - Up-to-Date Information and Data: Data that reaches a user and/or his/her interface device must be updated whenever it is viewed on the screen or presented verbally to the user.

• Design Principle 6 - Link Relevant Information and Data: An item of data and its semantic links to other data are treated as one unit of information that is simultaneously created or updated.

• Design Principle 7 - Authority, Responsibility, and Accountability: Authority in an emergency flows down to where the actions are taking place.

• Design Principle 8 - Psychological and Sociological Factors: Encourage and support the psychological and social needs of the crisis response team.
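The following short sketch illustrates the kind of tagging Design Principle 2 calls for: each piece of information carries its source, time of occurrence, status, and, where appropriate, a location and links to related items already in the system. The Python class and field names are illustrative assumptions, not part of any particular CIMS:

from __future__ import annotations
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class InformationItem:
    """One piece of crisis data with the provenance tags Design Principle 2 suggests."""
    content: str
    source: str                    # human or database source
    occurred_at: datetime          # time of occurrence
    status: str                    # e.g. "reported", "confirmed", "resolved"
    location: str | None = None    # where appropriate
    links: list[str] = field(default_factory=list)  # related items already in the system


# Illustrative example of a fully tagged item.
item = InformationItem(
    content="Road closure reported on a major avenue",
    source="Public Works liaison",
    occurred_at=datetime(2010, 1, 14, 9, 30, tzinfo=timezone.utc),
    status="confirmed",
    location="Miami-Dade County",
    links=["significant_events/41"],
)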


Finally, Turoff et al. conclude with 3 supporting design considerations and spec-

ifications: resource databases and community collaboration, collective memory, and

online communities of experts. Resource databases are collections of data about

available resources that are shared by a community of users and which are updated

by the individuals in the community who specifically are responsible for updating and

maintaining portions of data in areas in which they specialize. Collective memory

is a description of the world derived from a set of rules about events, their interac-

tions and dependencies, and how they relate to objects or data groupings. Online

communities of experts are communities of professional subject matter experts who

make themselves available to local agencies and possibly train with local agencies to

collaborate on a situation in which the local agencies may need additional expertise

in a particular area.

Another paper by Onorati et al. (2011) describes interaction principles for Web

Emergency Management Information Systems (WEMIS). Onorati et al. analyze a

set of common and frequent characteristics of WEMIS. They analyze the PIE (Pro-

gram, Interpretation, Effect) formalism with respect to CIMS and introduce another

characteristic to the formalism [33]. The resulting characteristics are observability,

predictability, reachability, transparency and meta-communications. Observability

is the ability of users to completely understand the output of the system from the

information visualized on the screen (interaction design principles). Predictability is

the ability to display useful information on a screen about past and future events to

aid the user in determining which actions to perform next. Reachability is the ability

to reach any state in the system from another state. That is, systems should have

the ability for users to correct mistakes in a particular state. Transparency means

that the screen displays all information and data about the state of the system so

that users can know about current effects. Meta-Communication represents rules and

semantics for the communication between users and designers.


In a paper on process management and geo-collaboration among first responders,

Catarci et al. (2011) outline 5 requirements for CIMS for first responders. Specif-

ically, these apply to process management systems (PMS), which are systems that

first response leaders have on their PDAs that aggregate data from several sources

to better orchestrate the crises among first responders. First, they advocate for wire-

less mobile networks for these systems. Second, since first responders may not have

knowledge of the local area, PMSs should be integrated with Geographic Informa-

tion Systems (GIS). Third, PMSs should allow for large process specifications that

are specialized by time according to the specific situation. Fourth, PMSs need to be

adaptable; they need to be able to adapt the process execution to possibly changing

circumstances and contingencies. Finally, PMS “client tools must be extremely us-

able and intuitive [...] Systems should be so intuitive that they can be easily mastered

after few interaction sessions.” [21].

In a paper on Emergency Management Information Systems, Grant (2008) de-

scribes a checklist for comparing CIMS. This checklist was derived from a Crew/Operator

Support Policy (COSPOL) study funded by the Netherlands. In this report, Grant

identifies functions that command and control (C2) information systems utilize when

managing a crisis. They do not list the entire system functions, but for the mon-

itoring and control subset of C2 information systems, they note data acquisition,

situation analysis, and planning and scheduling as part of a command and control

crisis information management system.

Gryszkiewicz and Chen (2012) describe how to integrate various concepts of tem-

porality in crisis information management systems. Specifically, they describe 6 de-

sign principles.

• Make temporal relationships between asynchronous activities salient.

• Make information about past crises, and past crisis management events in ongoing crises, easily accessible.

• Add create/last-modification time-stamps to information.

• Indicate how events and activities may develop.

• Give support for emerging information sources.

Gryszkiewicz (2012) adds two more temporal design principles: support the pace

of the users' tasks and work, and allow for different users being responsible for the

same tasks over time. The first principle allows the system to work more quickly or

slowly depending on the needs of the users. For example, in some cases, it may be

more efficient to have an online chat with another person rather than to publish and

subscribe to messages. The second principle allows for continuity among different shifts of workers: incoming workers can access the information for which they are authorized and change over more quickly and accurately during long-lasting crisis events.
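A minimal sketch of these temporal principles, assuming hypothetical record and field names: each record carries creation and last-modification time-stamps, and an incoming shift can ask what changed since the previous shift ended:

from datetime import datetime, timezone


def changed_since(records, since):
    """Return records created or modified after `since`, newest change first."""
    recent = [r for r in records
              if r["created_at"] > since or r["modified_at"] > since]
    return sorted(recent, key=lambda r: r["modified_at"], reverse=True)


# Illustrative records with create/last-modification time-stamps.
records = [
    {"id": 1, "summary": "Shelter opened",
     "created_at": datetime(2012, 10, 29, 14, 0, tzinfo=timezone.utc),
     "modified_at": datetime(2012, 10, 29, 18, 30, tzinfo=timezone.utc)},
    {"id": 2, "summary": "Bridge inspection pending",
     "created_at": datetime(2012, 10, 29, 8, 0, tzinfo=timezone.utc),
     "modified_at": datetime(2012, 10, 29, 9, 15, tzinfo=timezone.utc)},
]
shift_change = datetime(2012, 10, 29, 16, 0, tzinfo=timezone.utc)
handover = changed_since(records, shift_change)  # only record 1 changed during the last shift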

10.4 Additional Design Principles for Modern CIMS

In our research, we designed a distributed web-based virtual Emergency Opera-

tions Center for training and research [9, 115, 116, 149]. We began with a literature

review of current design principles. In addition, we conducted 9 months of field research with the Miami-Dade EOC in Miami-Dade County, Florida (see Appendix A). In keeping with DERMIS Premise 1, as part of the trainer, we had to design a simple crisis

information management system. Through our field research and through the design

of the system, we identified several additional design considerations, particularly for

CIMS being used in the middle of a crisis.

Design Principle 1: Modern CIMS should be easy to learn on-demand

[57]. In the middle of a crisis, personnel are dealing with the crisis. Few organiza-

tions have the time or resources to train new personnel; their foremost concern is on

stabilizing the crisis, not on training individuals [57, 151, 159, 182]. In fact, during


Exercise Golden Phoenix in 2008, the request from the military liaison regarding

technology was “Four buttons max, please. The feeling was that if it required an

engineer to operate, it was a distraction and potential danger during a crisis.” [182].

In addition, liaisons who gather to manage a crisis may be limited in their training

with respect to a CIMS. There are several reasons for this. First, the current practice

of training through simulation exercises requires that individuals physically have to

come to the EOC or other locations to gather for exercises [57, 158]. This limits

organizations in time, personnel, and budget in the number and type of training

exercises that they can engage in [57, 182]. In addition, during our field research,

we observed that there can be a high rate of turn-over of the liaisons of the orga-

nizations, the EOC staff, and other coordinating agencies and elected officials (e.g.

the mayor, commissioners, governor, US representatives). This means that often

the individuals and organizations must form teams in the middle of a crisis, sometimes composed of individuals who do not know each other, who have never worked with each other before, or who work together only infrequently. These in-

dividuals need to be able to gain access to a system and learn the system within

minutes. An example of this type of situation occurred in the Haiti Earthquake of

2010 [116]. In particular, the Miami-Dade EOC became the lead support agency in

charge of coordinating local support to the repatriation efforts. In collaboration with

the Miami-Dade EOC, Miami International Airport, the State of Floridas Division of

Emergency Management, Homestead Air Force Base was tasked with accomplishing

the repatriation. Since Homestead Air Force Base is located approximately 40 miles

away from the Miami-Dade EOC, coordination had to be accomplished via remote

technologies. The Miami-Dade EOC had to set up and give access to select officials

regarding particular information about the crisis. In addition, the Base EOC needed

to give access and exchange information about the repatriation of citizens and what

kind of medical or special needs assistance victims needed to Miami-Dade EOC staff.


Using WebEOC, the Miami-Dade EOC was able to create a status board providing viewing access to repatriation status information (with updates occurring in real time) for their command staff, Homestead Base emergency management staff, and other response officials. On the other end, the Homestead personnel had to pick up and

learn the CIMS (WebEOC) in a matter of minutes. They also were able to access

and update status boards remotely through a remote web-based login.

Design Principle 2: Modern CIMS should have the ability to add and

delete user accounts and modify access controls quickly and easily, espe-

cially during a crisis. Since crises may lead to unforeseen needs, modern CIMS

need the ability to expand and contract easily. Each incident has different needs

(e.g., resources, personnel, processes, organizational structures) and may require the

formation of teams that were not expected in order to manage the crisis. Sometimes

new individuals come into the crisis or the exercise who have never been trained on a CIMS [57]. The ability to add accounts and adjust permissions quickly is especially helpful in the case of multijurisdictional incidents. In

the Haiti earthquake example, the Miami-Dade EOC had to set up and give access

to the Homestead Air Force Base EOC. As another example, in 2010, the National

Football Leagues Pro Bowl and the Super Bowl were held in Miami. The Miami-

Dade EOC was activated to monitor both events. During the activation, the EOC

needed to coordinate with guest state and federal officials who came down to Miami

to assist in monitoring the events and to be available in case of an emergency. In

addition to regular personnel, Miami-Dade had to set up access and access controls

for these guests.
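A minimal sketch of this principle, using hypothetical names rather than WebEOC's actual account model: accounts for new liaisons or guests are created with only the board permissions they need and are removed again when the activation ends:

from dataclasses import dataclass, field


@dataclass
class UserAccount:
    username: str
    organization: str
    # board name -> "read" or "write"; boards not listed are invisible to the user
    board_permissions: dict = field(default_factory=dict)


class AccountRegistry:
    """Add, restrict, and remove accounts quickly in the middle of an incident."""

    def __init__(self):
        self.accounts = {}

    def add(self, username, organization, board_permissions):
        self.accounts[username] = UserAccount(username, organization, dict(board_permissions))

    def remove(self, username):
        self.accounts.pop(username, None)

    def can_write(self, username, board):
        acct = self.accounts.get(username)
        return acct is not None and acct.board_permissions.get(board) == "write"


registry = AccountRegistry()
# A Homestead liaison gets write access to one board; a visiting official gets read-only access.
registry.add("homestead_liaison", "Homestead ARB", {"repatriation_status": "write"})
registry.add("visiting_official", "Guest agency", {"special_events": "read"})
registry.remove("visiting_official")  # access withdrawn when the activation ends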

Design Principle 3: Modern CIMS should be easy to adapt to the current

situation. What makes managing a crisis so complex is the fact that one cannot plan

for everything [21]. Crises are often unexpected or they stem from normal situations


that unfold in unexpected ways or ways that are beyond the scope of the planned

resources available to address them [83]. Often individuals and organizations must

form teams in the middle of a crisis. Emergency managers and CIMS must be able

to adapt to the situation. In the Haiti crisis, Miami-Dade had to create new types

of unforeseen status boards and reports in the middle of the crisis. They also had to

grant access to particular information to a new group of people as well as to maintain

restrictions on the information that the new members had access to.

Design Principle 4: Modern CIMS should be web-based and distributed

(and have a means of access control for information). This does not mean

that the underlying hardware necessarily has to be built into the web or located in

different geographic areas, but rather the CIMS should provide an ability for coordi-

nating organizations who may not be in physical proximity to the CIMS to log in to a

web-based client and access information for which they are authorized. Again, in the

situation with Haiti, the Miami-Dade EOC had to coordinate with the Homestead

Air Force Base EOC, which is located approximately 40 miles away from the Miami-

Dade EOC. Local EOCs need to coordinate with tribal, state, regional, and federal

EOCs and agencies as well. Through the use of its established CIMS, the state of Florida was able to set up interconnected regions with its 67 counties for resource

and mission tracking. Through WebEOC Fusion, an augmentation to WebEOC, the

Miami-Dade EOC was able to connect and share information with neighboring coun-

ties and with the Florida state EOC and other coordinating agencies.

Design Principle 5: Modern CIMS should integrate training systems into

them or at least make systems similar to those with which emergency

managers are familiar and which they use regularly. This is similar to Turoff

et al.'s first DERMIS premise. However, instead of training on a system, organiza-


tions should train with the system (although training on the system needs to happen

as well). That is, training should not be actuated as an event that is distinct from

operational use. Rather, training is a capability that is built into the system as a

component of usability. Use of the system is best mastered in the context of how it is

used in practice. CIMS, for example, may include an integrated training component

that simulates a disaster. An example of such a CIMS is WebEOC.

Design Principle 6: Modern CIMS need to be more inclusive for more

types of users. Managing a crisis is a multi-agency, multi-governmental, and some-

times international activity [57, 98, 99, 100, 178, 190]. Emergency response involves

the public, volunteers, businesses, non-profit and non-governmental organizations

(such as Catholic Charities, American Red Cross, Salvation Army), donors, first re-

sponders, emergency managers, elected officials, and coordinating government agen-

cies. At the governmental level, it may involve coordination among local, tribal,

state, federal and international agencies [57, 108]. Current research identifies many

CIMS for first responders (Incident Commander [72], RimSim [18], WORKPAD

[21]). However, there is a general lack of CIMS for emergency managers. Moreover, CIMS should serve not just emergency managers and first responders, but also the public, victims, volunteers, non-governmental organizations, businesses, and elected officials. A

more integrated, holistic approach to the design of crisis information management

systems is needed.

Design Principle 7: Modern CIMS should be designed for use throughout

the entire emergency response lifecycle. Experts generally agree that there

are four main phases of emergency response: mitigation, preparedness, response,

and recovery [55, 178]. Mitigation and preparedness occur on a day-to-day basis.

Response occurs during a crisis and recovery occurs following a crisis. Mitigation


involves taking actions to reduce potential loss from disasters. For example, this

includes reinforcing buildings so that they will not collapse from hurricane winds or from ground shaking during an earthquake. Preparedness

involves establishing procedures and the roles and responsibilities of emergency per-

sonnel and jurisdictions that will be involved during an emergency. It also involves

training personnel. In the response phase, emergency personnel assist the victims

and try to reduce the possibility of further loss of life and property [39]. Recovery

involves efforts to restore the community to normal operations. In the short term,

emergency workers restore vital systems to an acceptable minimum operation. Once

this is accomplished, long term recovery can begin. In the long term, recovery may

go on for months or even years until the affected community returns to its previous

or an improved condition [55, 39]. Modern CIMS should integrate the activities from

each phase of the emergency response lifecycle.

Design Principle 8: Modern CIMS need to integrate advanced Geo-

graphic Information Systems (GIS) into the common operating picture.

“The support for context awareness is crucial [...] [A] given user, particularly in larger

events, is likely to work in a variety of contexts, either because of physical relocation

or because of changes in the context itself. [...] Therefore, users cannot be assumed

to have local knowledge of the geography and resources of the area. Consequently,

[CIMS for first responders] should be integrated with Geographic Information Sys-

tems, which allow users to gain a deep knowledge of the area.” [21] While this is

especially true for first responders, it applies to emergency managers as well. Dur-

ing the Super Bowl and the Pro Bowl activation at the Miami-Dade EOC in 2010,

for instance, federal and state emergency managers came to the EOC to coordinate

actions and to be present in case of an emergency. These emergency managers were

not necessarily familiar with the area before coming to Miami. For the public, GIS

Figure 10.1. A GIS-aware application used by the Miami-Dade EOC. Clicking on a particular location marker on the map brings up information and pictures related to the area.

can be used to locate relatives and victims of the disaster and to mark the locations

of shelters. During the Haiti earthquake, for example, victims used a GIS-aware website to request help. Through GPS and GIS, first responders and relief workers

were able to find them. For emergency managers, GIS is used to maintain a common

operating picture, to coordinate actions, and to aid in situational awareness and decision-making. GIS also

enables information filtering and visualization of the area in need. For example, with

an overview map, users can click on an event and get more details about the event

(see Figure 10.1).
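As a minimal sketch of this kind of click-for-details map, the following example uses the open-source folium library to place event markers whose popups show more information when clicked; the library choice, coordinates, and popup text are illustrative assumptions.

# Illustrative sketch: build a simple web map whose markers reveal event
# details when clicked, similar in spirit to Figure 10.1. The data values
# below are invented examples.
import folium

events = [
    {"name": "Shelter - Example High School", "lat": 25.78, "lon": -80.21,
     "details": "Capacity 400, currently open"},
    {"name": "Road closure - Example Causeway", "lat": 25.79, "lon": -80.18,
     "details": "Closed due to flooding"},
]

m = folium.Map(location=[25.77, -80.19], zoom_start=11)
for event in events:
    folium.Marker(
        location=[event["lat"], event["lon"]],
        popup=f"{event['name']}: {event['details']}",  # shown on click
    ).add_to(m)
m.save("eoc_overview_map.html")  # open in a browser to explore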

Design Principle 9: Modern CIMS need to integrate social networking,


mobile-based GIS, digital media, and digital volunteers. Emergency managers and their CIMS will also need to harness, and perhaps even direct, the self-organizing digital volunteers who mobilize during a disaster. Small-scale collaborations of social networking began during Hurricane Katrina, where wikis were used for collective memory, to locate

missing persons, to organize volunteers, and to locate emergency housing. One of

the first large scale collaborations of social networking and digital media grew out

of the humanitarian response to the Haiti earthquake in 2010. Although there were

several limitations and technical problems, different digital media technologies, net-

works and communities were able to overcome their differences and work together to

aid in relief efforts to an extent that had not been seen before [108]. Examples of the use of digital media included crowdsourcing, the creation and updating of local maps

for areas for which maps were out of date, humanitarian media response and local

coordination, and coordinating to build a people-finder platform [108]. SMS, or short

message service, was used for a variety of applications. One immediate priority was

to help victims discover what was happening around them and where they could go

for help [108]. Citizens and victims were able to request help by way of text message

short code to relief workers. In addition, SMS allowed citizens to send public health

alerts to relief workers, for example, concerning water sanitation and food shortages

in relief camps. SMS was also used in a broadcast capacity by the humanitarian relief

organizations. Humanitarian relief organizations broadcast public health messages

to the population about relief services, general health, hygiene, vaccines, sanitation,

malaria, and HIV/AIDS. During the recovery, the Red Cross used SMS in other coun-

tries to raise money for the relief efforts [6]. Another problem that digital media

helped to address was updating outdated maps for the region. In some cases,

maps had never been created for particular areas. In others, rural populations had

migrated into urban areas so quickly that maps became outdated. Still in other areas,

some locations did not appear on any maps. In some situations, relief organizations


initially relied on maps based on GPS coordinates which were incorrect [108]. Us-

ing a combination of crowdsourcing, geo-tagging, and local personnel on the ground,

volunteers were able to create updated maps for the relief workers. One application

that came together during the crisis was Ushahidi, a crisis-mapping platform that

was adapted to provide emergency and rescue data. Another application that evolved

was OpenStreetMap. OpenStreetMap is built entirely by volunteers who survey with

GPS, digitize aerial imagery, and collect existing public sources of geographic data.

After supplementing the existing maps with the volunteer inputs and updates from

people on the ground with hand-held GPS devices to mark spots, OpenStreetMap

became the primary map for responders. A third application that became useful in

the aftermath of the Haiti earthquake was Google Earth. Various websites were set up to stream information into the Google Earth database, which was then used to locate

missing individuals. Volunteers from around the world translated information and

helped write code to create systems for processing data from various websites and

forums. Other sources of digital media are wikis and RSS feeds. Wikis were set up to maintain collective memory and to share best practices and lessons learned from the Haiti response. Ushahidi developed an RSS feed for the U.S. Coast Guard to help them retrieve emergency information that required immediate assistance [67, 108, 189].

Design Principle 10: There is a need for a more integrated set of CIMS.

CIMS need to be able to interact with each other on a local, regional, national and

international level. Examples of systems that are able to connect local, regional

and national levels are E-Team and WebEOC (with a WebEOC Fusion enhancement).

However, much work remains in integrating emergency man-

agers, humanitarian responders, non-governmental organizations, businesses, donors,

elected officials, and other types of users. Rather than a closed, local crisis informa-

tion management system held by disparate emergency operation centers, emergency


managers need to be able to integrate diverse systems in sometimes ad hoc or unan-

ticipated ways during a crisis. One of the first examples where this took place was in

the Haiti Earthquake disaster in 2010 [108]. With their CIMS, the state of Florida

was able to establish situational awareness of resource needs regionally with the var-

ious counties that were aiding in the earthquake response.

Design Principle 11: Modern CIMS should have a set of data exchange

standards, both for operational use and for training. With respect to data

exchange standards for operational CIMS, one method in use is the comma separated

values (CSV) format. In WebEOC, for example, one can put data into a CSV for-

mat and then import it directly into a status board. One can also export the data in various formats. This proved extremely useful for quickly and accurately exchanging information during the Haiti earthquake, the Super Bowl of 2010, and the Pro

Bowl of 2010. As emergency managers were organizing the repatriation efforts, the

Miami-Dade EOC had several spreadsheets of information that they had collected on

the status and special needs of the individuals being repatriated on the Homestead

bound flights from Haiti. They were able to create a status board for the coordina-

tion efforts and import this information with perfect accuracy in a matter of minutes.
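As a minimal sketch of this kind of CSV round trip, the following example reads a spreadsheet export into status-board records and writes it back out; the file names and column handling are illustrative assumptions, not WebEOC's actual import format.

# Illustrative sketch: load a spreadsheet exported as CSV so that each row
# becomes one status-board record, and export records back to CSV. File
# names and columns are invented examples.
import csv

def load_board_records(csv_path):
    """Read CSV rows as a list of dictionaries, one per status-board entry."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

def save_board_records(records, csv_path):
    """Export status-board records back to CSV for exchange with another EOC."""
    with open(csv_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=records[0].keys())
        writer.writeheader()
        writer.writerows(records)

# records = load_board_records("repatriation_flights.csv")
# save_board_records(records, "repatriation_board_export.csv")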

Another option is XML. XML is a markup language that allows users to define a set

of rules for encoding documents in a format that is both human readable and machine

readable [177]. There are many XML standards for emergency management, but

there are few standards for exchanging data in operational CIMS. Several standards

that can be used in a CIMS are outlined below.

People Finder Interchange Format (PFIF)

PFIF is a data exchange format

for missing persons. In PFIF, an XML record is defined for each person. A record

contains tracking information and identifying information about a missing person.


Individuals can also add note records for each person, which include up-to-date information about the person as knowledge of their whereabouts gets updated.

PFIF was created after the September 11 attack to aid in tracking survivors and

finalized during Hurricane Katrina [131].

ComCARE Vehicular Emergency Data Set (VEDS) and Automatic Crash

Notification (ACN) Initiative

The Vehicular Emergency Data Set is an XML standard for the transmission of telem-

atics data to emergency response agencies. The protocol identifies crash and medical

data elements and provides a standard format for the data exchanged between Telem-

atics Service Providers (TSPs) and public safety agencies. VEDS provides a seamless

data flow to and among multiple sources about an incident [27].

Common Alerting Protocol (CAP)

The Common Alerting Protocol “is an open, non-proprietary standard data inter-

change format that can be used to collect all types of hazard warnings and re-

ports locally, regionally and nationally, for input into a wide range of information-

management and warning dissemination systems.” [28]
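As a minimal, abridged sketch of what a CAP-style message looks like in practice, the following example builds an alert using element names from the CAP 1.2 vocabulary; all values are invented, and a complete CAP alert would include additional required elements.

# Illustrative sketch: construct an abridged CAP-style alert with the Python
# standard library. Values are invented; consult the CAP specification for
# the full set of required elements.
import xml.etree.ElementTree as ET

CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"
alert = ET.Element("alert", xmlns=CAP_NS)
for tag, text in [
    ("identifier", "example-eoc-2010-0001"),
    ("sender", "eoc@example.gov"),
    ("sent", "2010-02-07T09:30:00-05:00"),
    ("status", "Exercise"),
    ("msgType", "Alert"),
    ("scope", "Public"),
]:
    ET.SubElement(alert, tag).text = text

info = ET.SubElement(alert, "info")
ET.SubElement(info, "category").text = "Met"
ET.SubElement(info, "event").text = "Coastal Flood Warning"
ET.SubElement(info, "urgency").text = "Expected"
ET.SubElement(info, "severity").text = "Moderate"
ET.SubElement(info, "certainty").text = "Likely"
ET.SubElement(info, "headline").text = "Example exercise message only"

print(ET.tostring(alert, encoding="unicode"))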

Emergency Data Exchange Language (EDXL)

EDXL is a collaboration by the Department of Homeland Security/FEMA (as part

of the Disaster Management eGov initiative) and the Emergency Interoperability

Consortium. EDXL is a “cooperative effort to define a NIMS-compliant family of

shared data exchange specifications encompassing: Incident Notification and Situa-

tion Reports; Status Reporting; Resource Requests and Dispatch; Analytical Data;

Geospatial Information; Identification and Authentication” [37]


National Information Exchange Model (NIEM)

NIEM [2.0] is the U.S. National Information Exchange Model. “This is an interagency

initiative to provide the foundation and building blocks for national-level interopera-

ble information sharing and data exchange.” It was launched as a cooperative effort

between U.S. Department of Homeland Security (DHS), the U.S. Department of Jus-

tice (DOJ), and other government entitles. “The NIEM leverages both the Global

Justice XML Data Model (GJXDM) reference model and the GJXDM XML-based

framework and support infrastructure.” [28].

Tsunami Warning Markup Language (TWML) and Cyclone Warning Markup

Language (CWML)

The Tsunami Warning Markup Language (TWML) and Cyclone Warning Markup

Language (CWML) are “structured semantic data models” for representing tsunami

and cyclone warnings. Because these languages enable computer processing of warn-

ings, people being impacted can be quickly notified of an impending tsunami. In-

formation collected through this system can be utilized by other standards such as

the Emergency Data eXchange Language Distribution Element (EDXL-DE) and the

Common Alerting Protocol (CAP) [28].

The Homeland Emergency Response Exchange (HERE)

“HERE allows Partners involved in emergency response planning and implementation

to share available environmental, health, and natural resource information.” Using

HERE standards, emergency managers can quickly identify and assess threats to

environmental infrastructure [68].

10.5 Implications and Future Directions of CIMS

Since Hurricane Katrina, much research has focused on improving crisis man-

agement through the use of information technology. However, there is still a great


need for improvements with respect to crisis information management systems, the

web, digital technologies, and the public. Research is also improving how we view

crisis management itself, how organizations interact, and how communities react. In

the future, CIMS will continue to advance with technology, particularly with the

web and digital technologies. The web enables enhanced interconnectivity among

information management systems, digital volunteers, and the public. As the digital

technologies evolve, we will see more advanced distributed systems as well as ad-

vanced cloud-based systems. Digital technologies also enable enhanced connectivity

and collaboration among emergency officials and the public. Some of the biggest

improvements will be in social media, smart phones and tablets, and geographic in-

formation systems. Additionally, emergency managers will need to learn to harness

and possibly direct digital volunteers. Along with crowdsourcing, digital technolo-

gies, and larger, more connected systems, we will see more advanced data exchange

standards, both for operational use and for simulations and training.

10.6 Summary

In this chapter, we discussed design recommendations for modern crisis informa-

tion management systems. We began by reviewing seminal papers on the design of effective dynamic emergency management systems and web-based emergency manage-

ment systems. In the subsequent section, we identified additional design characteris-

tics of modern CIMS. There is still a great need for improvements to modern CIMS,

especially with respect to creating distributed, adaptable CIMS with appropriate ac-

cess controls. More work also needs to be done to incorporate greater collaboration

between emergency managers, humanitarian responders, non-governmental organi-

zations, businesses, donors, elected officials and the public. Emergency managers

need to learn how to integrate digital volunteers into the emergency response. There

also needs to be improvement with respect to connecting various regions of CIMS


together as well as better integration of social media, smart phones and tablets, and

geographic information systems. Finally, a key area that needs improvement is data

exchange standards for quick and accurate exchange of operational and training data.

In the future, CIMS will continue to see large growth and evolution with advancing

web and digital technology.


CHAPTER 11

A CALL FOR DATA EXCHANGE STANDARDS FOR OPERATIONAL CRISIS

INFORMATION MANAGEMENT SYSTEMS AND EMERGENCY

MANAGEMENT EXERCISE SIMULATORS

11.1 Overview

The ability to exchange data quickly and accurately during a crisis is critical.

Data exchange between crisis information management systems (CIMS) is challeng-

ing because there is no concrete definition for what components constitute a good

CIMS. Few standards exist. One of the better standards for exchanging data is ex-

tensible markup language (XML). In this chapter, we define a proof-of-concept XML

specification for a CIMS. This is followed by a similar XML specification for exercise

simulators. We conclude with a few implications and limitations.

11.2 Introduction

The use of information technology to aid emergency management is still a growing

field. In the current era, crisis information management systems (CIMS) are evolv-

ing from discrete, closed systems to multi-agency networked systems. CIMS such as

WebEOC and E-Team now have the ability to connect multiple crisis information

management systems from one Emergency Operations Center (EOC) to independent

CIMS in other EOCs that use the same base system [107, 121, 50]. As CIMS grow

more connected, quick and accurate data exchange standards also become more im-

portant. A prime example of this occurred in the Haiti earthquake of 2010. As the


crisis unfolded, Florida, since it was the closest US state to Haiti, became the focal

point for aid and for repatriating citizens from Haiti into the US. In particular, the

Miami-Dade EOC became the lead support agency in charge of coordinating local

support for the repatriation efforts. In collaboration with the Miami-Dade EOC, Mi-

ami International Airport, the State of Florida's Division of Emergency Management,

and Homestead Air Force Base were tasked with accomplishing the repatriation.

Since Homestead Air Force Base is located approximately 40 miles away from the

Miami-Dade EOC, coordination had to be accomplished via remote technologies. As

emergency managers were organizing the repatriation efforts, the Miami-Dade EOC

had several spreadsheets of information that they had collected on the status and

special needs of the individuals being repatriated and the flights coming in and out

of the country to and from Haiti. Using WebEOC, the Miami-Dade EOC was able to

create a status board for the coordination efforts and import this information with

perfect accuracy in a matter of minutes. As this example shows, there is a need for

quick and accurate data exchange during a crisis. Quick and accurate data exchange

is necessary during training as well.

The remainder of this chapter is structured as follows. First, we discuss the

current state of the art in XML exchange specifications. Next, we present a proof-

of-concept XML schema for operational CIMS. Third, we discuss a proof-of-concept

XML schema for exercise simulators. We conclude with a few implications and limi-

tations.

11.3 Background

As crisis information management systems become more connected, there is a

greater need to be able to exchange data. Yet few standards exist. One format

available is comma separated values (CSV). In some CIMS, such as WebEOC, one

can put data into a CSV format and then import it directly into a status board. One


can export the data in CSV format as well. This proved useful during the Haiti crisis

in 2010. Another valuable format is extensible markup language (XML). XML is a

markup language that allows users to define a set of rules for encoding documents

in a format that is both human readable and machine readable [177]. There are

many XML standards for emergency management, but there are few standards for

exchanging data in operational CIMS. Several standards that can be used in a CIMS

are outlined below.

• People Finder Interchange Format (PFIF): PFIF is a data exchange format for missing persons. In PFIF, an XML record is defined for each person. A record contains tracking information and identifying information about a missing person. Individuals can also add note records for each person, which include up-to-date information about the person as knowledge about their whereabouts gets updated. PFIF was created to aid in tracking survivors after the September 11 attack and finalized during Hurricane Katrina [131].

• ComCARE Vehicular Emergency Data Set (VEDS) and Automatic Crash Notification (ACN) Initiative: The Vehicular Emergency Data Set is an XML standard for the transmission of telematics data to emergency response agencies. The protocol identifies crash and medical data elements and provides a standard format for the data exchanged between Telematics Service Providers (TSPs) and public safety agencies. VEDS provides a seamless data flow to and among multiple sources about an incident [27].

• Common Alerting Protocol (CAP): The Common Alerting Protocol “is an open, non-proprietary standard data interchange format that can be used to collect all types of hazard warnings and reports locally, regionally and nationally, for input into a wide range of information-management and warning dissemination systems.” [28]

• Emergency Data Exchange Language (EDXL): EDXL is a collaboration by the Department of Homeland Security/FEMA (as part of the Disaster Management eGov initiative) and the Emergency Interoperability Consortium. EDXL is a “cooperative effort to define a NIMS-compliant family of shared data exchange specifications encompassing: Incident Notification and Situation Reports, Status Reporting, Resource Requests and Dispatch, Analytical Data, Geospatial Information, and Identification and Authentication.” [37]

• National Information Exchange Model (NIEM): NIEM [2.0] is the U.S. National Information Exchange Model. “This is an interagency initiative to provide the foundation and building blocks for national-level interoperable information sharing and data exchange.” It was launched as a cooperative effort between the U.S. Department of Homeland Security (DHS), the U.S. Department of Justice (DOJ), and other government entities. “The NIEM leverages both the Global Justice XML Data Model (GJXDM) reference model and the GJXDM XML-based framework and support infrastructure.” [28]

• SAFE: Tsunami Warning Markup Language (TWML) and Cyclone Warning Markup Language (CWML): The Tsunami Warning Markup Language (TWML) and Cyclone Warning Markup Language (CWML) are “structured semantic data models” for representing tsunami and cyclone warnings. Because these languages enable computer processing of warnings, people being impacted can be quickly notified of an impending tsunami. Information collected through this system can be utilized by other standards such as the Emergency Data eXchange Language Distribution Element (EDXL-DE) and the Common Alerting Protocol (CAP) [28].

• The Homeland Emergency Response Exchange (HERE): “HERE allows Partners involved in emergency response planning and implementation to share available environmental, health, and natural resource information.” Using HERE standards, emergency managers can quickly identify and assess threats to environmental infrastructure [68].

Clearly there is a need for more expansive XML standards for emergency man-

agement, especially in the larger context of CIMS themselves. One reason XML

standards may be lacking for CIMS is that, as a community, we have not agreed upon a concrete definition of what components constitute a good CIMS. In addition

to operational data exchange standards, there is also a need for data exchange stan-

dards for training and exercise simulations. In particular, one area in need is the

simulation of full-scale exercises. One reason that standards are lacking with respect

to training and exercises may be that very few simulators are available

for training upper level emergency managers. Several leading simulators of note are

the Advanced Disaster Management Simulator (ADMS), Emergency Management

Staff Trainer (EMST), and Virtual Staff Trainer [1, 46, 174]. Another training sim-

ulator is SimEOC. SimEOC is a collaborative research project with the University


of Notre Dame, Emory University, Florida International University, and the Miami-

Dade EOC [149]. SimEOC is a distributed web-based training and research tool for

emergency managers and scholars [115, 116, 121].

11.4 Simulation Data Exchange Standards

11.4.1 Background

A simulation, in this context, is an enactment of an exercise. Simulations can be

paper-based or electronic. In paper-based exercises, a simulation is enacted mostly

via paper inputs (e.g. players receive inputs on paper that they regard as simulated

fact). In electronic simulations, these inputs are given and responded to through the

computer. In particular, electronic simulations are enacted within a crisis informa-

tion management system. Thus, an XML schema for simulations must incorporate

information about the CIMS being simulated or the CIMS in use by the simulator.

In this work, we assume that we are simulating or emulating a CIMS. The remainder

of this section is structured as follows. First, we discuss a proof-of-concept format

for an Operational CIMS. Then we return to defining a data exchange standard for

simulations.

11.4.2 Towards an XML Schema for Operational CIMS

The main tool for managing a disaster is a crisis information management system.

A CIMS is a vehicle that aids in organizing data, managing resources, maintaining

a common operating picture, supporting situational awareness, establishing and sus-

taining command and control, facilitating decision making, and communicating with

others to aid in orchestrating the response efforts. Indeed, modern CIMS have a very

large scope. According to an article in Risk Management in 2002 [30], the best CIMS

include most of the following features.


• Incident tracking, logging and reporting

• Automated SOP checklists and plans

• Resource management (with full database functionality)

• Central command and control

• Messaging and communications function with tracking

• Documentation of response actions

• Contact lists

• Internet, intranet/vpn and wireless

• Radio, cellular and satellite

• Appropriate member participation

• Automated journaling

• Access to plans and data

• Mapping

• Role-based staff management

• Linking capability to access internet sources for weather and event intelligence

• Hand-held compatibility

• Fully configurable and scalable

• Compatible with existing infrastructure, databases, software and e-mail

• Status boards

• Executive briefings

E-Team, one of the leading commercial CIMS, provides the following functional-

ity1 [107]:

• Incident reporting and tracking

• Situation reporting

1The features listed here are broad categories of features for informational and comparison purposes. They are not meant to be an all-inclusive list of features.


• Resource and asset management

• Action planning

• Critical infrastructure reporting

• Hospital and shelter status

• Personnel management

• Procedures and checklists

• Intelligence gathering and dissemination

• Tip reporting

• Duty and call logs

• Organization charts

• CATS hazard modeling interface

• Crystal Reports interface

• Real-time messaging and chat

WebEOC, another leading commercial CIMS, provides similar functionality2, in-

cluding: [50]

• Managing Users

• Status Boards

• Exchange information with other WebEOC customers

• Checklists, Contacts, File Library, Sessions

• Messages

• Mapping

• Audit Logs

• Import/Export Data (CSV)

• CAP Messaging

2The features listed here are broad categories of features for informational and comparison purposes. They are not meant to be an all-inclusive list of features.


• Status Board/ Form Building Tool

• Reporting Tool

• Creating and Running Simulations3

• Chat

• Remote Boards, Scroller, Alerting, Calendar, Twitter, NWS Weather

WebEOC further enables users to manage multiple incidents and daily events,

assign and track missions and tasks, provide situation reports, manage resources and

prepare incident command system (ICS) and incident action plan (IAP) reports [50].

Given all of these functions, coupled with the critical nature of this software, it is not surprising that CIMS are the subject of many research papers and of several journals and conferences (IJISCRAM, MIS Quarterly, HICSS). In this chapter, we will focus on the

data exchange needs for a core set of functions for a CIMS. At the conceptual and

the software level (and for the purposes of defining a simulated CIMS), most of these

functions can be grouped and organized into a system with the following taxonomy.

• Boards

• Links

• References

• Reports

• Communication Tools

These elements form our upper level document object model. Each of these ele-

ments can be further developed.

Boards are a vehicle for storing and viewing data. The boards enable one to enter

data quickly into the database and to view and filter the data in the database on the

screen. Boards include items such as position logs, significant events, hospital and

3We consider the ability to create and run a simulation an optional capability of a CIMS.


shelter status, mission/tasks, and resource requests. Boards also can include addi-

tional elements that the user can create to aid in better organization of information.

Links are used to define shortcuts to websites such as news organizations and the Na-

tional Weather Service. References aid in maintaining a common operating picture.

References include items such as disaster maps and incident action plans. Reports

aid in situational awareness and in decision making. There are several reports, such

as hospital and shelter status, that are standard across organizations. Additional re-

ports can be defined by the user. Finally, communication tools include services such

as email and chat programs. In short:

• Boards

– Position Logs

– Significant Events

– Mission/Tasks

– Resource Requests

– Road Closures

– Shelters

– Hospitals

– Points of Distribution

– Additional user defined boards

• Links

– Defined by the user

• References

– Incident Action Plans

– Disaster Map

– Position Responsibilities

– Standard Operating Procedures

– Checklists

• Reports

– Road Closures


– Shelters

– Hospitals

– Points of Distribution

– Additional user defined reports

• Communication Tools

– Telephones

– Cellphones

– Tablets

– PDAs

– Emails

– Chats

Refining this definition one step further, we get the following document object

model for a CIMS4. See Appendix B for a complete XML specification.

4Note: for readability purposes, the element names contain spaces.


<CIMS>
  <Boards>
    <Position Log>
      <Log>
        <Date/Time> <Notes>
    <Significant Events>
      <Significant Event>
        <Date/Time> <Notes> <Name>
    <Mission/Tasks>
      <Mission/Task>
        <Time> <Agency> <Mission/Task> <Priority> <Status>
    <Resource Requests>
      <Resource Request>
        <Id> <Date/time> <Resource requested> <Type> <Quantity> <Priority>
        <Reason needed> <Special instructions> <Requesting agency> <Status>
        <Requestor’s name> <Requestor’s phone number> <Address>
    <Road Closures>
      <Road Closure>
        <Road Name> <Closed date> <Closed from> <Closed to>
    <Shelters>
      <Shelter>
        <Shelter Name> <Status> <Accommodates> <Used>
    <Hospitals>
      <Hospital>
        <Hospital Name> <Status> <Accommodations> <Used>
    <Points of Distribution>
      <Point of Distribution>
        <PODS> <Status> <Notes>
    <Additional user created boards>
      <User created board>
        <User defined field>
  <Links>
    <User defined links>
  <References>
    <Incident Action Plans>
      <Incident Action Plan>
        <Background> <ICS 202 (Response Objectives)> <ICS 203 (Organization List)>
        <ICS 204 (Assignment List)> <ICS 206 (Communications Plan)>
        <ICS 220 (Air Operations Summary)> <ICS 200 (Medical Plan)> <Map/Chart>
        <Weather Forecast/Tides/Currents> <Other Attachments>
    <Disaster Maps>
      <Disaster Map>
        <Flood> <Fire> <Radioactivity> <First aid> <Other>
    <Position Responsibilities>
      <Position Responsibility>
        <Position>
        <Responsibilities>
          <Responsibility>
    <Standard Operating Procedures>
      <Standard Operating Procedure>
        <Event> <Standard Operating Procedure>
    <Checklists>
      <Checklist>
        <Procedure> <Step> <Status>
  <Reports>
    <Road Closures>
      <Road Closure>
        <Road name> <Closed date> <Closed from> <Closed to>
    <Shelters>
      <Shelter>
        <Shelter Name> <Status> <Accommodates> <Used>
    <Hospitals>
      <Hospital>
        <Hospital Name> <Status> <Accommodations> <Used>
    <Points of Distribution>
      <Point of Distribution>
        <POD Name> <Status> <Notes>
    <Available Resources>
      <Resource>
        <Id> <Item> <Quantity> <Cost per unit>
    <Additional user created reports>
      <User created report>
        <Add field>
  <Communication Tools>
    <Email>
      <Email Message>
        <To> <From> <CC> <BCC> <Subject> <Message Text>
    <Chat>
      <Chat Session>
        <Initiator> <Receiver> <Chat Text>
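To make the proposal concrete, the following minimal sketch serializes a single Resource Request record from the Boards branch above; the element names drop the spaces used for readability (XML element names cannot contain spaces), and all field values are invented.

# Illustrative sketch: serialize one Resource Request record following the
# proposed document object model. Element names omit spaces; values are
# invented examples.
import xml.etree.ElementTree as ET

cims = ET.Element("CIMS")
boards = ET.SubElement(cims, "Boards")
requests = ET.SubElement(boards, "ResourceRequests")
request = ET.SubElement(requests, "ResourceRequest")
for tag, value in {
    "Id": "42",
    "DateTime": "2010-01-14T16:05:00",
    "ResourceRequested": "Cots",
    "Type": "Sheltering",
    "Quantity": "200",
    "Priority": "High",
    "RequestingAgency": "Example Shelter Branch",
    "Status": "Open",
}.items():
    ET.SubElement(request, tag).text = value

print(ET.tostring(cims, encoding="unicode"))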

 

11.4.3 Proposed Schema for Exercises and Training Simulations

At the upper level, a simulation is defined by five main elements: the name of the

exercise, player information, the state of the CIMS, the script, and performance

metrics5. Each of these elements can be further developed.

Player information is defined by the following elements:

• Person

• Role

• Exercise Name

The current state of the CIMS includes an export of the CIMS specification out-

lined above at a particular time in the simulation. Additionally, a simulation CIMS

has the following supplementary elements.

• Exercise Background

– Player Handbook

– Initial Status

• Exercise Log

Exercise background is contextual information about the exercise. The exercise

log is used for training and research purposes to log all actions in the simulation for

later analysis.

A script is a sequence of events to be practiced. Scripts are comprised of injects.

Injects are further represented by the following elements (a brief sketch of an inject record follows the list):

• Inject Number

• Time

• Receiving Agency

5These elements are derived from practitioners in the field and from the FEMA guide to exercises [57, 42, 40].


• Sending Agency

• Message Type

• Message Text

• Communication Medium

• Exercise Plan Objective

• Expected Action
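As a minimal in-memory sketch of an inject record carrying these elements, the following example defines a simple data structure whose field names mirror the list above; the sample inject itself is invented.

# Illustrative sketch: an in-memory inject record mirroring the element list
# above. The sample values are invented.
from dataclasses import dataclass

@dataclass
class Inject:
    inject_number: int
    time: str                    # when the inject is supposed to be received
    receiving_agency: str
    sending_agency: str
    message_type: str
    message_text: str
    communication_medium: str
    exercise_plan_objective: str
    expected_action: str

sample = Inject(
    inject_number=1,
    time="T+00:15",
    receiving_agency="Infrastructure Branch",
    sending_agency="Simulation Cell",
    message_type="Status Update",
    message_text="Bridge on Example Causeway reported impassable.",
    communication_medium="Email",
    exercise_plan_objective="Exercise resource-request procedures",
    expected_action="Open a road-closure entry and request an inspection team",
)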

Performance metrics are used to evaluate an organization's or player's per-

formance during the exercise. Player evaluation metrics include elements used to

evaluate how a player performed during the exercise. Elements of player evaluation

include:

• Player Evaluation Metrics

• Target Capabilities

• Exercise Objectives

• Evaluator Handbooks

Evaluator handbooks consist of:

• Exercise Name

• Evaluator Name

• Location

• Date

• Objective

• Performance Criterion

• Comments

Many of these elements can be developed further. Refining each element, we get

the following document object model for a simulation. See Appendix C for a complete

XML specification of a simulation.


<Simulation>
  <Name of the Exercise>
  <Player Information>
    <Person>
      <Id> <Name> <User Name>
    <Role>
    <Script Training On>
  <State of the CIMS>
    <Exercise Background>
      <Player handbooks>
        <Handbook>
          <Purpose> <Scope> <Concept of play> <Exercise assumptions>
          <Exercise artificialities> <Exercise simulation> <Scenario narrative>
          <Player procedures and responsibilities> <Safety and security>
          <Reporting> <Administrative systems>
      <Initial Status>
        <Status Report>
          <Text> <Figure>
    <Boards>
      <Position Log>
        <Log>
          <Date/Time> <Notes>
      <Significant Events>
        <Significant Event>
          <Date/Time> <Notes> <Name>
      <Mission/Tasks>
        <Mission/Task>
          <Time> <Agency> <Mission/Task> <Priority> <Status>
      <Resource Requests>
        <Resource Request>
          <Id> <Date/time> <Resource requested> <Type> <Quantity> <Priority>
          <Reason needed> <Special instructions> <Requesting agency> <Status>
          <Requestor’s name> <Requestor’s phone number> <Address>
      <Road Closures>
        <Road Closure>
          <Road Name> <Closed date> <Closed from> <Closed to>
      <Shelters>
        <Shelter>
          <Shelter Name> <Status> <Accommodates> <Used>
      <Hospitals>
        <Hospital>
          <Hospital Name> <Status> <Accommodations> <Used>
      <Points of Distribution>
        <Point of Distribution>
          <PODS> <Status> <Notes>
      <Additional user created boards>
        <User created board>
          <User defined field>
    <Links>
      <User defined links>
    <References>
      <Incident Action Plans>
        <Incident Action Plan>
          <Background> <ICS 202 (Response Objectives)> <ICS 203 (Organization List)>
          <ICS 204 (Assignment List)> <ICS 206 (Communications Plan)>
          <ICS 220 (Air Operations Summary)> <ICS 200 (Medical Plan)> <Map/Chart>
          <Weather Forecast/Tides/Currents> <Other Attachments>
      <Disaster Maps>
        <Disaster Map>
          <Flood> <Fire> <Radioactivity> <First aid> <Other>
      <Position Responsibilities>
        <Position Responsibility>
          <Position>
          <Responsibilities>
            <Responsibility>
      <Standard Operating Procedures>
        <Standard Operating Procedure>
          <Event> <Standard Operating Procedure>
      <Checklists>
        <Checklist>
          <Procedure> <Step> <Status>
    <Reports>
      <Road Closures>
        <Road Closure>
          <Road name> <Closed date> <Closed from> <Closed to>
      <Shelters>
        <Shelter>
          <Shelter Name> <Status> <Accommodates> <Used>
      <Hospitals>
        <Hospital>
          <Hospital Name> <Status> <Accommodations> <Used>
      <Points of Distribution>
        <Point of Distribution>
          <POD Name> <Status> <Notes>
      <Available Resources>
        <Resource>
          <Id> <Item> <Quantity> <Cost per unit>
      <Additional user created reports>
        <User created report>
          <User added field>
    <Communication Tools>6
      <Email>
        <Email Message>
          <To> <From> <CC> <BCC> <Subject> <Message Text>
      <Chat>
        <Chat Session>
          <Initiator> <Receiver> <Chat Text>
    <Exercise Log>
      <Time> <Page> <Action>
  <Script>
    <Injects>
      <Inject Number> <Time> (supposed to receive) <Receiving Agency>
      <Sending Agency> <Message Type> <Message Text> <Communication Medium>
      <Exercise Plan Objective> <Expected Action>
  <Player Evaluation Metrics>
    <Performance Metrics>
      <Target Capabilities>
        <Target Capability>
          <Mission area> <Activity> <Reference> <Task> <Status>
      <Performance Metrics>
        <Performance Metric>
          <Mission Area> <Activity> <Reference> <Measure> <Metric> <Status>
      <Exercise Objectives>
        <Exercise Objective>
          <Objective> <Status>
    <Evaluator Handbooks>
      <Handbook>
        <Exercise Name> <Evaluator Name> <Location> <Date> <Objective>
        <Performance Criterion> <Comments>
    <Player Evaluations>
      <Time received> <Time completed> <Individual> <Page> <Action>
      <Attribute type> <Attribute value> <Expected value>
    <Additional Evaluation Metrics>
      <Percentage of injects missed> <Average response time per inject>
      <User defined metrics>

6 Note: if exercise developers and researchers want to capture all communications for future research, they can simulate communication mediums such as the phone, cellphone, PDA, and radio as chat sessions.
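The two additional evaluation metrics above can be computed directly from the logged player evaluations; the following minimal sketch illustrates the calculation, with invented field names and sample data.

# Illustrative sketch: compute "percentage of injects missed" and "average
# response time per inject" from player-evaluation records. Field names and
# sample data are invented.
def evaluate(records):
    """Each record: {'inject': int, 'received_min': float, 'completed_min': float or None}."""
    missed = [r for r in records if r["completed_min"] is None]
    answered = [r for r in records if r["completed_min"] is not None]
    pct_missed = 100.0 * len(missed) / len(records) if records else 0.0
    avg_response = (
        sum(r["completed_min"] - r["received_min"] for r in answered) / len(answered)
        if answered else 0.0
    )
    return pct_missed, avg_response

log = [
    {"inject": 1, "received_min": 15.0, "completed_min": 22.0},
    {"inject": 2, "received_min": 30.0, "completed_min": None},  # missed
    {"inject": 3, "received_min": 45.0, "completed_min": 49.0},
]
pct, avg = evaluate(log)
print(f"{pct:.0f}% of injects missed; average response time {avg:.1f} minutes")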

 

 

11.5 Limitations

Defining standards for emergency management data exchange is a challenging

task. One of the reasons that it is so challenging is that there is no concrete

definition for what components constitute a CIMS. Similarly, there is no concrete

specification for what constitutes a good exercise simulator and which parts of the

CIMS should be included as part of the simulator. In this chapter, we outline the need

for the development of such standards. This work is only the beginning of defining

a set of data exchange standards. It is not meant to be a definitive standard, but

rather a proof-of-concept specification. We call upon the emergency management

community to continue to refine and expand our model.

11.6 Implications

CIMS will continue to evolve and grow as information technology matures. In

particular, current technology enhances connectivity among once discrete, local crisis

information management systems. However, there is still much work to be done with

respect to CIMS interconnectivity. There also is work to be done in terms of data

exchange standards. Quick and accurate data exchange is critical in the middle of a

crisis. In addition, there also is a need for data exchange standards for simulation-

based training. For example, one should be able to import and export data on players

and the state of the simulation. One of the better formats for this exchange is XML.

In the future, we will see XML continue to grow as the standard for operational and

training data exchange.

11.7 Summary

The use of data exchange standards in the emergency management domain is still

in its infancy. The primary standard in use is XML. In this chapter, we examined


emerging XML-based data exchange standards. Next, we presented a basic XML

standard for operational CIMS. This was followed by a basic XML standard for emer-

gency management simulations. Finally, we concluded with some implications and

future directions.


CHAPTER 12

CONCLUSION

12.1 Overview

In this dissertation, we have described SimEOC, a virtual Emergency Operations

Center training simulator and research tool for upper-level emergency managers. In

this chapter, we conclude our discussion of SimEOC with an outline of contributions

to the scientific community. This is followed by a brief discussion on limitations and

future work.

12.2 Contributions to the Scientific Community

Through this work, we have made the following contributions to the scientific

community:

• A tool that individuals can use to (1) train emergency personnel and (2) conduct research into emergency management decision making and organizational learning.

• Undergraduate and graduate participation in research.

• A series of publications about SimEOC and emergency management.

12.3 Publications

Following are works published about SimEOC.

Published Works


• Nikolai, C. and Madey, G. (2007). Anatomy of a toolkit: a comprehensive compendium of various agent-based modeling toolkits on the market today. Proceedings of the Agent 2007 Conference on Complex Interaction and Social Emergence.

• Nikolai, C., and Madey, G. (2009). Searchable taxonomies of agent based modeling toolkits. Proceedings of the 2009 Spring Simulation Multiconference. Society for Computer Simulation International.

• Nikolai, C., and Madey, G. (2009, March). Tools of the trade: a survey of various agent based modeling platforms. Journal of Artificial Societies and Social Simulation, 12(2)2. Available at http://jasss.soc.surrey.ac.uk/12/2/2.html.

• Nikolai, C., Becerra-Fernandez, I., Prietula, M., and Madey, G. (2009). Project ensayo: designing a virtual emergency operations center. Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, San Antonio, TX.

• Nikolai, C., Johnson, T., Becerra-Fernandez, I., and Madey, G. (2010). Leveraging WebEOC in support of the Haitian relief effort: insights and recommendations. The 7th International Community on Information Systems for Crisis Response and Management (ISCRAM) Conference, Seattle, WA.

Slide Presentations

• Nikolai, C., and Madey, G. (2009, March). Agent based modeling toolkit search engine. Agent-Directed Simulation Symposium (ADS'09). The Society for Modeling and Simulation International (SCS). San Diego, CA.

• Nikolai, C., Becerra-Fernandez, I., Prietula, M., and Madey, G. (2009, Oct). Project ensayo: designing a virtual emergency operations center. Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, San Antonio, TX.

• Madey, G., Becerra-Fernandez, I., Nikolai, C., and Prietula, M. (2010). A training and research simulator for emergency management. Institute for Operations Research and the Management Sciences (INFORMS), Austin, TX.

Poster Sessions and PhD Colloquiums

• Nikolai, C., Madey, G., Becerra-Fernandez, I., and Prietula, M. (2010, April). Ensayo: a collaborative cyberinformation portal for emergency management training and research. Cyberinfrastructure Days, Computer Science & Engineering, University of Notre Dame, Notre Dame, IN.

• Nikolai, C., Madey, G., Becerra-Fernandez, I., and Prietula, M. (2010). Ensayo: a distributed, web-based virtual emergency operations center for training and research. The 7th International Community on Information Systems for Crisis Response and Management (ISCRAM) Conference PhD Colloquium and Poster Session, Seattle, WA.

• Nikolai, C., Madey, G., Becerra-Fernandez, I., and Prietula, M. (2010). Ensayo: a virtual emergency operations center simulator for training and research. Ph.D. Colloquium and Poster Session, Winter Simulation Conference, Baltimore, MD.

Technical Reports

• Nikolai, C., Prietula, M., Madey, G., Becerra-Fernandez, I., Johnson, T., Mooney, M., and Bhandari, R. (2010). Experiences and insights using a virtual emergency operations center. Project Report, Computer Science & Engineering, University of Notre Dame, Notre Dame, IN.

• Nikolai, C. (2012, January). vEOC assumptions and requirements document v.5. Project Report, Computer Science & Engineering, University of Notre Dame, Notre Dame, IN.

• Nikolai, C., Johnson, T., Becerra-Fernandez, I., Prietula, M., and Madey, G. (2014). SimEOC: a distributed web-based virtual emergency operations center simulator for training and research. Technical Report, Computer Science & Engineering, University of Notre Dame, Notre Dame, IN.

• Nikolai, C., Johnson, T., Becerra-Fernandez, I., Prietula, M., and Madey, G. (2014). Design principles for modern crisis information management systems: from closed local systems to the web and beyond. Technical Report, Computer Science & Engineering, University of Notre Dame, Notre Dame, IN.

• Nikolai, C., Johnson, T., Becerra-Fernandez, I., Prietula, M., and Madey, G. (2014). A call for data exchange standards for operational crisis information management systems and emergency management exercise simulators. Technical Report, Computer Science & Engineering, University of Notre Dame, Notre Dame, IN.

• Nikolai, C., Johnson, T., Prietula, M., and Madey, G. (2015). Games and simulations for emergency operations centers: challenges and opportunities. Technical Report, Computer Science & Engineering, University of Notre Dame, Notre Dame, IN.


Field Reports

• Nikolai, C. (2010, June). Miami-Dade emergency operations center field research report. University of Notre Dame Zahm Research Travel Report.

12.4 Limitations

The main limitation of SimEOC is that, to date, it has not been used by researchers or trainers to perform experiments or to train emergency managers. A second limitation is that the software has been modeled on the Miami-Dade County EOC. This means that small modifications are needed to enable the software to serve

as a general EOC simulator. Third, we have not integrated artificial intelligence to

simulate other players. We also lack artificial intelligence in the tutoring and player

feedback capabilities. Additionally, there are several software limitations. Some sec-

tions are still in the early design stages and need to be upgraded to more advanced

functional levels. These include the learning tutor, dashboards, research metrics,

player reports, an XML backbone structure, mapping and GIS, the planning section,

communication tool fidelity, and the ability to download and install the software in

an organization.

Another limitation is multi-browser compliance. Different browsers interpret CSS,

JavaScript, and PDF files differently. This creates slight bugs in layout and function-

ality between different browsers and sometimes even between different versions of the

same browser. Currently, the vEOC is compatible with Internet Explorer (IE) 7.0

as well as Firefox 2.0.18 and 3.0.4, and Safari 2.0.4, but it may have slight bugs in

later releases of IE, Firefox, and Safari. We hope to address these limitations in our

future work with SimEOC.


12.5 Future Work

In the future, we would like to enhance system capabilities. There are four main

areas that need enhancement: Intelligent Agents, Learning Tutor, Evaluation Met-

rics, and Dashboards. In terms of intelligent agents, we want this application to be

able to have intelligent agents able to fill in for humans when necessary (e.g. when

all of the players are not available for training or if individuals want to train selective

portions of the EOC (e.g. the public safety group or the infrastructure group)). This

will take a good deal of work to accomplish and is beyond the scope of this disser-

tation. For the learning tutor, we want this application to have an intelligent agent

to act as a tutor and give hints for players as they participate in the training. This

also will take a fair amount of work. The evaluation metrics need to be improved.

We hope to develop a more complete suite of metrics with which to evaluate players.

These metrics come from researchers and exercise trainers, so we expect to have more

detailed metrics as SimEOC gets more use. We need more advanced dashboards as

well. Another area that needs strengthening is the underlying reverse AJAX technol-

ogy. The current implementation has shortcomings with the more recent browsers.

Finally, in the future, we would like to incorporate social media and smart devices

into SimEOC as well.

12.6 Summary

In this chapter, we discussed SimEOC contributions to the scientific community.

We concluded with limitations and future work.


APPENDIX A

MIAMI-DADE EMERGENCY OPERATIONS CENTER FIELD RESEARCH

REPORT1

A.1 Overview

Because of its location and experience with hurricanes, the Emergency Oper-

ations Center (EOC) in Miami-Dade County, Florida, is considered one of the top

EOCs in the country and an ideal location for field research. For approximately 9

months, between Summer 2009 and Spring 2011, I conducted field research with Miami-

Dade County. I went down to Florida to study emergency management, emergency

procedures within the EOC, critical decision-making at the EOC, and the culture

and customs of emergency managers. A secondary goal of this research was en-

hanced collaboration with our partners at Florida International University and

Emory University. In this report, I describe my experiences and results as a field

researcher with the Miami-Dade Emergency Operations Center in Miami, Florida.

A.2 Introduction

An Emergency Operations Center is a secure location where upper-level emer-

gency officials gather to prepare for, manage, and coordinate the response to an

incident (e.g. tsunami, earthquake, hurricane, pandemic). In day-to-day operations,

emergency management staff are involved in preparedness and mitigation strategies

1Material from this chapter has appeared in the following publication: Nikolai, C. (2010, June).Miami-Dade Emergency Operations Center Field Research Report. University of Notre Dame ZahmResearch Travel Report, Notre Dame, IN.


 

Figure A.1. Department of Emergency Management organizational structure. This is the day-to-day structure of emergency management at the Emergency Operations Center.

for future crises [39]. They are organized into Health and Human Services, Sys-

tems, Planning and Preparedness, Infrastructure and Recovery, and Personnel and

Administration. See Figure A.1.

When a disaster strikes, however, the emergency management staff drop their

day-to-day roles and take on the role assigned to them by the Incident Commander.

This role usually involves leading a section or branch of the incident command system

or ICS [85]. In accordance with ICS, there are four main sections: operations, planning, logistics, and finance/administration. Operations is further

organized into four branches: Public Safety, Human Services, Infrastructure, and Mu-


Figure A.2. Incident Command System [41]. The command staff, the general staff, and the agency liaisons assist the incident commander during

an emergency.

nicipal. Planning consists of Geographic Information Systems (GIS), the 311 Public

Information Call Center, and three units to aid in incident planning and documen-

tation. Finally, Logistics is divided into EOC Support and Disaster Resources. See

Figure 2.5. The operations, planning, logistics, and finance/administration sections

constitute the general staff. Leading the general staff and assuming responsibility for

the incident is the Incident Commander. See Figure 2.5. The Incident Commander

has additional support staff as well, called the command staff, which includes a public

safety officer, a public information officer, and a liaison officer [74]. See Figure A.2.


A.3 Background

Training and exercise are critical to the success of emergency management at

the EOC: “Building essential response capabilities nationwide requires a systematic

program to train individual teams and organizations - to include governmental, non-

governmental, private-sector, and voluntary organizations - to meet a common base-

line of performance and certification standards. Professionalism and experience are

the foundation upon which successful response is built. Rigorous, ongoing training

is thus imperative.” [165]

In the current crisis management arena, much of the training is conducted via live

or face-to-face exercises [2]. There are a number of limitations, however, to current

training solutions. First, crises, by definition, are rare events, and therefore they do

not enable extensive training. Moreover, in the middle of a crisis, few organizations

have the time or resources to train new personnel; their foremost concern is stabilizing the crisis, not training individuals [151]. Another limitation of face-to-face

solutions is that there are few experts available, and each expert is inherently con-

strained by limited time, experience, and perspective [151]. In addition, there is the

difficulty of training teams, training selective components of the incident command

hierarchy, and training upper-level managers [62, 151]. In fact, while there are multi-

ple computer-based solutions available for first responders, current research identifies

a general lack of computer-based training that targets upper-level emergency man-

agers [2]. Moreover, the training that does take place can be ineffective because

most instructors use subjective measures and usually end up emphasizing outcomes

over decision management processes. Finally, in face-to-face and instructor-centric

solutions, there are usually inherent time delays in the feedback as experts analyze

the student's progress, compare the student's actions and outcomes to the expected

actions and outcomes, and tailor the feedback to the individual [151].


A.4 Computer-Based Solutions to Emergency Management Training

Computer-based solutions to training, on the other hand, can adequately address these limitations. Computer-based training allows emergency managers to train new personnel without being in the middle of a disaster. Moreover, it allows personnel to train more frequently than they otherwise could in live and face-to-face exercises. In addition, it enables distributed access to data, resources, communication, and training. Computers also enable teams to train selective portions of the emergency management hierarchy. Finally, whereas feedback is delayed in non-computer solutions, feedback can be immediate in a computer-based system.

A.5 Crisis Information Management and Training System

Recognizing the need for computer-aided training and incident management,

Miami-Dade, along with several other counties in the region and throughout the

state of Florida, began using WebEOC in early 2009. WebEOC, a product released

by ESi Acquisition Inc., is a web-based crisis information management system that

enables agencies and staff within the EOC to share real-time information in a secure

manner [36]. One of the main features of WebEOC is that anyone with proper ad-

ministrative rights can create a board, which is the main vehicle for collaborating and

sharing information. Boards are interfaces with various views that enable individuals

with proper access rights to input, modify, and view data that is stored in a backend

database. Administrators assign various levels of access rights for each board and

each registered user. See Figure A.3.
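The board model just described can be sketched schematically. The fragment below is a minimal illustration of per-board, per-user access rights backed by a server-side store; it is not WebEOC's API, and the class and permission names are hypothetical.

from dataclasses import dataclass, field
from enum import Enum


class Access(Enum):
    """Hypothetical access levels an administrator might assign per board."""
    NONE = 0
    VIEW = 1
    EDIT = 2


@dataclass
class Board:
    """A shared status board backed by a server-side data store."""
    name: str
    entries: list = field(default_factory=list)
    permissions: dict = field(default_factory=dict)  # user name -> Access

    def post(self, user: str, entry: dict) -> None:
        # Only users granted EDIT rights may add or modify entries.
        if self.permissions.get(user, Access.NONE) is not Access.EDIT:
            raise PermissionError(f"{user} may not modify board '{self.name}'")
        self.entries.append(entry)

    def view(self, user: str) -> list:
        # Users with VIEW or EDIT rights may read the board.
        if self.permissions.get(user, Access.NONE) is Access.NONE:
            raise PermissionError(f"{user} may not view board '{self.name}'")
        return list(self.entries)


# An administrator creates a board and assigns per-user rights (names invented).
shelters = Board("Shelters")
shelters.permissions["esf6.liaison"] = Access.EDIT
shelters.permissions["eoc.observer"] = Access.VIEW
shelters.post("esf6.liaison", {"shelter": "North HS", "status": "Open"})
print(shelters.view("eoc.observer"))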

Another feature of WebEOC is a training simulator. The simulator, however, is not user-friendly, and it is cumbersome to set up and run exercises. Setup involves going through hundreds or thousands of entries from various boards and modifying them to create the simulation. See Figure A.4.

Figure A.3. A WebEOC status board.

Figure A.4. An example simulation in WebEOC. The user has to configure hundreds of injects for the exercise. In addition, if the structure of the boards has been changed since the original gathering of the data, then the injects may not reflect this change [52].

Because of this, and because of the lack of current training tools, Miami-Dade County invited me down to Miami to conduct this research and to collaborate with them on this tool.

A.6 Ensayo

In this research, I am designing a socio-technical emulator and training facility for upper-level emergency managers. This tool is important because it enables emergency managers to train under crisis conditions in a virtual arena. There are several key features of this software. First, it targets upper-level emergency managers, hereafter referred to as emergency managers. Second, it allows emergency managers to access databases, to coordinate emergency response, and to train in a virtual environment. Since our system is web-based and distributed, it allows managers to train from anywhere, from any computer that has a web browser. Third, this system has special feedback mechanisms that tell the emergency manager the effects of his/her decisions. These mechanisms take the form of charts and graphs of critical data that vary depending on the decisions made. Fourth, this system contains inject-driven scenarios, driven by a simulation engine that sends injects to the trainee electronically.
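As a rough sketch of what an inject-driven scenario engine does, the following fragment schedules scripted injects by exercise time and hands them to a delivery callback. It is an illustrative sketch only, not SimEOC's implementation; the names Inject and run_script, and the sample messages, are assumptions.

import heapq
from dataclasses import dataclass, field


@dataclass(order=True)
class Inject:
    """One scripted message delivered to a trainee at a given exercise time."""
    time_min: int                      # minutes after exercise start
    message: str = field(compare=False)
    medium: str = field(compare=False, default="email")


def run_script(injects, deliver):
    """Deliver injects in time order via the supplied callback."""
    queue = list(injects)
    heapq.heapify(queue)
    while queue:
        deliver(heapq.heappop(queue))


# A tiny hypothetical script; a real exercise would contain hundreds of injects.
script = [
    Inject(5, "NWS issues a hurricane warning for Miami-Dade County."),
    Inject(20, "Shelter at North HS reports it is at 90% capacity.", "telephone"),
    Inject(12, "ESF-8 requests a status update on hospital bed availability."),
]

run_script(script, lambda i: print(f"T+{i.time_min:>3} min [{i.medium}] {i.message}"))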

In addition to a training facility, this system also serves as a research tool for cognitive scientists to study the decision-making process under emergency conditions. This tool is important because current research practices in emergency management consist of waiting for a crisis to occur and then questioning the emergency responders about their decisions and actions during the situation. This process is extremely time consuming and subject to survey and responder biases. With Ensayo, on the other hand, we can log and analyze each decision as well as its effects [9, 11, 90, 115].
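A minimal sketch of the kind of decision logging that makes this analysis possible is shown below; the log schema, file name, and field values are illustrative assumptions, not the system's actual format.

import csv
import time


def log_decision(path, user, action, detail):
    """Append one timestamped decision record to a CSV log for later analysis."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [time.strftime("%Y-%m-%d %H:%M:%S"), user, action, detail])


# Hypothetical trainee actions recorded during an exercise.
log_decision("exercise_log.csv", "ops.chief", "resource_request",
             "Requested 10,000 sandbags for the South District")
log_decision("exercise_log.csv", "ops.chief", "board_update",
             "Marked North HS shelter as Full")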

A.7 Field Research in Miami-Dade

Due to the recession, Miami-Dade County has been undergoing severe budget constraints [125]. Subsequently, several positions were eliminated, including the EOC Readiness position (see Figure A.1). Part of these responsibilities was assumed by the Systems Management Group, and this is where I filled in on a day-to-day basis. Under the direction of Soheila Ajabshir, the Information Technology Systems and Support leader and GIS Unit Leader at the EOC, I had a variety of responsibilities, including creating and maintaining an inventory of the EOC equipment, helping to tweak the implementation and to administer WebEOC, keeping equipment and software up-to-date with patches, and working with various information technologies to make processes more efficient and effective. At the same time, I was conducting field/action research. I set out to study emergency management and the organizational structures and processes of the EOC.

A.8 Research Methodology

I began my field research as an ethnographic study of emergency managers. I started with a literature review and, following the ethnographic methodological recommendations of Fetterman [56], began my ethnography in May 2009 when I embedded myself in the Miami-Dade EOC. I conducted informal and formal interviews and observed key personnel participating in training, exercises, workshops, and real activations. I also worked side-by-side with emergency managers. From mid-May through the end of December, I volunteered at the EOC three days a week. The other two days I devoted to advancing my software development and analyzing my experiences and findings from the week. In addition, I collected and analyzed key documents from the EOC, including past situation reports, after-action reports, and exercise documents. In the process of this ethnography, however, I had to change my approach to field research, and from this I have gained several insights.

A.9 Lessons Learned and Insights Gained

1. Take time before beginning to review ethnographic procedures.
When doing an ethnographic study, it is imperative to take enough time to develop goals and a plan of action. I recommend at least two weeks of ethnographic literature review prior to engaging in the ethnography.

2. Do not underestimate the time it takes to get adjusted to the environment.
In this research, I was fully immersing myself in the emergency operations center, and I underestimated the time it would take for me to adjust to the environment and to establish social networks.

3. You never know when something good will come up, so carry several tools with you at all times.
For me, these tools were a voice recorder and a camera. For much of my time at the EOC, events were relatively routine. Summer 2009 through January 2010, in particular, was especially routine because neither a hurricane nor a tropical storm threatened southern Florida. Even if a hurricane were to occur, there would be ample warning of the impending tropical depression before it strengthened into a hurricane. The earthquake that struck Haiti on January 12, however, came with little warning. Suddenly, I found myself in the midst of an activation, and I did not have my camera with me. Luckily, my cellular telephone had a built-in camera that I was able to use on the first day of the activation. Otherwise, I could have missed much of the important action that happened that first day. This lesson taught me that I needed to have my camera with me at all times.

4. Be careful not to get too close to the people you are studying.
Due to economic constraints, budgets were being severely cut, personnel were losing their jobs, and the individuals who remained had to pick up the corresponding workloads. Amidst this environment, I found it more and more difficult to extract myself from working in order to interview personnel and observe emergency procedures. I had acquired vital skills that were critical to the operation of the EOC, and toward the end of my research, the people at the EOC began to see me as an emergency manager and not as a researcher.

5. Take several weeks to adjust when you get back, and write up your reports and ethnographic results.
Writing items up while they are fresh in your mind and in your notes is important to completing the study, especially before you get caught up again in your other research. In addition, taking time upon return to adjust and to make sense of your notes and analyses is just as important as when beginning an ethnographic study.

A.10 Results

From this experience, I gained several important results:

1. Perspective on what emergency management encompasses, both in day-to-day and in emergency operations.
I gained perspective on the organizational structure and processes of the emergency management staff in day-to-day operations as well as in emergency situations.

2. Certified WebEOC administrator.
I gained excellent insight into WebEOC, the crisis information management system in use in 36 US states [51]. I attended three advanced training seminars on WebEOC, and within six months I had enough knowledge and experience to pass the WebEOC administrator certification examination [52].

3. Federal Emergency Management Agency (FEMA) Independent Study Programs.
I completed 16 FEMA Independent Study courses:

• IS-00001 - Emergency Program Manager: An Orientation to the Position

• IS-00100.a - Introduction to the Incident Command System, ICS-100


• IS-00100.HE - Introduction to the Incident Command System, ICS-100, for Higher Education

• IS-00120.A - An Introduction to Exercises

• IS-00130 - Exercise Evaluation and Improvement Planning

• IS-00139 - Exercise Design

• IS-00200.a - ICS for Single Resources and Initial Action Incidents

• IS-00230.a - Fundamentals of Emergency Management

• IS-00235 - Emergency Planning

• IS-00240 - Leadership & Influence

• IS-00241 - Decision Making & Problem Solving

• IS-00242 - Effective Communication

• IS-00244 - Developing and Managing Volunteers

• IS-00800.b - National Response Framework, An Introduction

• IS-00820 - Introduction to NRF Support Annexes

• IS-00821 - Critical Infrastructure and Key Resources Support Annex

I also completed the FEMA Professional Development Series (FEMA Website 2009).

4. Gathered and analyzed various documents that were needed to complete my project.
I gathered various documents that will assist in creating the virtual Emergency Operations Center:

• National Weather Service Updates/Advisories

• DEM&HS News Releases

• H1N1 Related Information/Updates

• County Employee Newsletters

• Drill/Exercise Messages

• Drill/Exercise Handbooks

• Drill/Exercise Inject Lists

• Photographs

• Staff Meeting Notes

• After Actions Reports

• EOC Process Documents (e.g., CEMP, staff checklists, and debris removal plan)

• GIS maps and base files


• Situation reports

• Staff checklists/continuity files

5. Established strong connections at the EOC and with my collaborators.
I established strong connections with the personnel at the EOC so that I can contact them in the future with questions and for system validation.

6. Experience assisting in a real crisis and two routine activations: the Haiti Earthquake, and the Pro Bowl and Super Bowl.
In addition to participating in two functional exercises (Hurricane Suiter and Operation Cassandra) and a public learning exercise [104], I had the privilege of assisting with the relief operations for the Haiti Earthquake of January 2010 [171, 116]. I also assisted with the Pro Bowl and the Super Bowl, both of which were held in Miami in 2010.

7. Published and presented a discussion paper at the International Community on Information Systems for Crisis Response and Management (ISCRAM) conference.
I published and presented this paper at the ISCRAM Conference in Seattle in May 2010 [116].

8. More insightful development of my virtual Emergency Operations Center.
I changed my perspective and my entire model of the virtual Emergency Operations Center.

• Changed mental models of emergency managers

The concept map in Figure A.5 shows the various functions that emergency managers engage in on an ongoing basis as well as during a crisis. The main activities center around both information and people. The concept map in Figure A.6 shows the various functions that exercise developers engage in when creating an exercise.

• Built a model of logistical request processes

This process was obtained through interviews with Craig Hall, the Logistics Section Chief at Miami-Dade. The model (Figure A.7) outlines the roles of the Logistics Section Chief, the Government Services Agency, and the Department of Procurement Management in obtaining a resource. It also outlines how Miami-Dade County attempts to fill a request in-house first and, if this is not possible, upchannels the request to the state (through the EM Constellation software program).

• Improved vEOC Architecture

Figure A.5. An emergency manager concept graph.

Figure A.6. An exercise developer concept graph.

Figure A.7. The logistics resource request process.

Figure A.8. Improved vEOC architecture. The architecture has 13 main modules.

Due to my experiences in Miami-Dade, I discovered that the previous architecture was incomplete. Missing elements included a decision support system and backup and redundant databases. The new architecture reflects these changes.

• Improved user interface

While at Miami-Dade, I discovered the need to restructure the user interface for the emergency managers. In the original prototype, the system used a tab-based approach and stored the information in the client's web browser (see Figure A.9). Because there is a lot of information that emergency managers need only some of the time, tabbed interfaces can waste space and become overloaded. As an alternative to tabs, in a windows-based approach each menu item creates a new pop-up window. This allows individuals to have more information readily available and to switch easily between various statuses. I also discovered the need to store data in server-side databases rather than in the client's web browser, where I was storing data in the initial prototype.

A.11 Summary

In conclusion, for 9 months, I conducted field research with the Miami-Dade De-

partment of Emergency Management. This research proved invaluable to me for

several reasons. I was able to study emergency management, emergency procedures

within the EOC, critical decision-making at the EOC, and the culture and customs of

emergency managers. In particular, some of the benefits gained include an increased

awareness and a more accurate mental model of what emergency managers do on a
day-to-day basis, as well as what activities they engage in during emergency situations. I

learned about the crisis information management software in use at the EOC and

what individuals like and dislike about it compared with other software. I gathered

various documents that will assist me in creating the virtual emergency operations

center. I established strong connections with the EOC and also with my fellow col-

laborators. I gained experience in a real activation and two fully-functional exercises.

Finally, I gained feedback on the prototype and improved SimEOC's architecture and

user interfaces.


 

Figure A.9. Old User Interface. A tab-based approach.


 

Figure A.10. Improved user interface - a windows-based approach.


Figure A.11. New user interface script developer console. This is where individuals develop exercise scripts and injects to send to the trainees.


APPENDIX B

CIMS XML SPECIFICATION


Top Level Document Object Model for a CIMS:

<CIMS>
  <Boards> <Links> <References> <Reports> <CommunicationTools>

XML Specification
(Each entry is given as Name <Label>: Description; child elements are indented beneath their parent.)

1.1 Boards
A convenient way to enter and update information in the system.

Position Log <PositionLog>: Defines a log of notable events that occurred during the shift of an individual for a particular role. Aids in documentation of the disaster and in smooth changing of shifts during a disaster.
  Child elements of Position Log:
    Log <Log>: Defines a log entry.
      Date/Time <DateTime>: The date/time that a note is being added to the position log.
      Notes <Notes>: A description of the event along with any additional remarks concerning the event.

Significant Events <SignificantEvents>: Defines a significant event that needs to be documented during a disaster.
  Child elements of Significant Events:
    Significant Event <SignificantEvent>: Defines a significant event.
      Date/Time <DateTime>: The date/time that the significant event is being documented.
      Notes <Notes>: A description of the significant event along with any additional remarks concerning the significant event.
      Name <Name>: The name of the person documenting the significant event.

Mission/Tasks <MissionTasks>: Defines a mission or task to be accomplished during the exercise.
  Child elements of Mission/Tasks:
    Mission/Task <MissionTask>: Defines a Mission/Task.
      Time <Time>: The date/time of the request.
      Agency <Agency>: The agency of which the mission/task is being requested.
      Mission/Task <MissionTask>: The mission or task that needs to be accomplished.
      Priority <Priority>: The priority of the mission/task to be accomplished. Common values include Flash, High, Medium, and Low.
      Status <Status>: The status of the Mission/Task. Common values include Pending and Complete.

Resource Requests <ResourceRequests>: Defines requests for additional resources needed during a disaster.
  Child elements of Resource Requests:
    Resource Request <ResourceRequest>: Defines a resource request.
      Id <Id>: A unique internal reference to a resource request. Aids in identifying resource requests at the computer level.
      Date/Time <DateTime>: The date/time at which the request is being made.
      Resource Requested <ResourceRequested>: The resource being requested.
      Type <Type>: The category of the resource being requested. Common values include Type I, Type II, Type III, Type IV, and Other.
      Quantity <Quantity>: The quantity desired of the resource being requested.
      Priority <Priority>: The priority of the resource request. Common values include Flash, High, Medium, and Low.
      Reason Needed <ReasonNeeded>: The reason the resource is being requested.
      Special Instructions <SpecialInstructions>: Any special instructions that should be adhered to with respect to the request.
      Requesting Agency <RequestingAgency>: The agency making the request.
      Status <Status>: The status of the resource request. Common values include Pending, Deployed, Completed, and Cancelled.
      Requestor's Name <RequestorsName>: The name of the person within the agency making the request.
      Requestor's Phone Number <RequestorsPhoneNumber>: The telephone number of the person making the request.
      Address <Address>: The address of the agency of the person making the request.
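For illustration, the listing above maps directly onto XML. The snippet below builds a single <ResourceRequest> with the child elements defined here using Python's standard ElementTree; the sample values are invented, and the snippet is a sketch rather than normative SimEOC output.

import xml.etree.ElementTree as ET

# Build one ResourceRequest element; in a full document these nest under <ResourceRequests>.
request = ET.Element("ResourceRequest")
for tag, text in [
    ("Id", "RR-0001"),
    ("DateTime", "2010-01-13T09:30"),
    ("ResourceRequested", "Portable generators"),
    ("Type", "Type II"),
    ("Quantity", "4"),
    ("Priority", "High"),
    ("ReasonNeeded", "Backup power for a special-needs shelter"),
    ("SpecialInstructions", "Deliver with fuel for 72 hours"),
    ("RequestingAgency", "ESF-6 Mass Care"),
    ("Status", "Pending"),
    ("RequestorsName", "J. Smith"),
    ("RequestorsPhoneNumber", "305-555-0100"),
    ("Address", "9300 NW 41st St, Miami, FL"),
]:
    ET.SubElement(request, tag).text = text

print(ET.tostring(request, encoding="unicode"))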

Road Closures <RoadClosures>: Defines a report on the status of the roads during a disaster.
  Child elements of Road Closures:
    Road Closure <RoadClosure>: Defines a road closure.
      Road Name <RoadName>: The name of the road that is closed.
      Closed Date <ClosedDate>: A date/time group indicating when the closure went into effect.
      Closed From <ClosedFrom>: The name of the street at which the closure starts.
      Closed To <ClosedTo>: The name of the street at which the closure ends.

Shelters <Shelters>: Defines a report on the status of the shelters during a disaster.
  Child elements of Shelters:
    Shelter <Shelter>: Defines a shelter.
      Shelter Name <ShelterName>: The name of the shelter.
      Status <Status>: The status of the shelter. Common values are Open, Closed, and Full.
      Accommodations <Accommodations>: The number of accommodations that are available.
      Used <Used>: The number of accommodations that are being used.

Hospitals <Hospitals>: Defines a report on the status of the hospitals during a disaster.
  Child elements of Hospitals:
    Hospital <Hospital>: Defines a hospital.
      Hospital Name <HospitalName>: The name of the hospital.
      Status <Status>: The status of the hospital. Common values are Open, Closed, and Full.
      Accommodations <Accommodations>: The number of accommodations that are available.
      Used <Used>: The number of accommodations that are being used.

Points of Distribution <PointsOfDistribution>: Defines a report on the status of a Point of Distribution (POD).
  Child elements of Points of Distribution:
    Point of Distribution <PointOfDistribution>: Defines a Point of Distribution.
      POD Name <PODName>: The name of the Point of Distribution.
      Status <Status>: The status of the POD. Common values are Open, Closed, and Full.
      Notes <Notes>: Any desired remarks concerning the POD.

Additional User Created Boards <UserCreatedBoards>: A user-defined report on any other aspect the user needs.
  Child elements of Additional User Created Boards:
    User Created Board <UserCreatedBoard>: Defines a user-created board.
      User Defined Field <UserDefinedField>: A user-defined field for the report.

1.2 Links
External links that may be useful for reference.

Link <Link>: Defines an external link that may be useful to reference. Example links include CNN and the National Weather Service.

1.3 References
Items which the user can reference if needed.

Incident Action Plans <IncidentActionPlans>: Defines Incident Action Plans.
  Child elements of Incident Action Plans:
    Incident Action Plan <IncidentActionPlan>: Defines an Incident Action Plan.
      Background <Background>: Describes the background of the disaster.
      ICS 202 (Response Objectives) <ICS202>: Describes objectives for the disaster.
      ICS 203 (Organization List) <ICS203>: Defines the organization list for the response effort.
      ICS 204 (Assignment List) <ICS204>: Defines a personnel assignment list for the response effort.
      ICS 206 (Communication Plans) <ICS206>: Defines a communications plan for the response effort.
      ICS 220 (Air Operations Summary) <ICS220>: Defines a summary of air operations for the response effort.
      ICS 200 (Medical Plan) <ICS200>: Defines a medical plan for the response effort.
      Map/Chart <MapChart>: Outlines a map or chart of the affected area.
      Weather Forecast/Tides/Currents <Weather>: Highlights applicable weather for the affected area.
      Other Attachments <OtherAttachments>: Includes any additional applicable attachments desired.

Disaster Maps <DisasterMaps>: Defines maps displaying information about the disaster for the emergency managers.
  Child elements of Disaster Maps:
    Map <Map>: Defines a disaster map.
      Flood <Flood>: A mark indicating flood activity during a disaster.
      Fire <Fire>: A mark indicating a fire during a disaster.
      Radioactivity <Radioactivity>: A mark indicating radioactivity during a disaster.
      First Aid <FirstAid>: A mark indicating that first aid is needed during a disaster.
      Other <Other>: A mark indicating other user-defined activity during a disaster.

Position Responsibilities <PositionResponsibilities>: Defines player responsibilities for the position that the player is assuming.
  Child elements of Position Responsibilities:
    Position Responsibility <PositionResponsibility>: Defines a player responsibility.
      Position <Position>: Defines the position that the player is assuming during the exercise.
      Responsibilities <Responsibilities>: Defines the responsibilities for the position.
        Responsibility <Responsibility>: Defines a responsibility associated with the position.

Standard Operating Procedures <StandardOperatingProcedures>: Defines standard operating procedures for the Emergency Operations Center.
  Child elements of Standard Operating Procedures:
    Standard Operating Procedure <StandardOperatingProcedure>: Defines a standard operating procedure.
      Event <Event>: Defines an event for which there are standard procedures.
      Standard Operating Procedure <StandardOperatingProcedure>: Defines a standard procedure for the event.

Checklists <Checklists>: Defines checklists to aid the individual in the position that the player is assuming.
  Child elements of Checklists:
    Checklist <Checklist>: Defines a checklist.
      Procedure <Procedure>: Defines a procedure for which there is a checklist.
      Step <Step>: Defines a step in the checklist.
      Status <Status>: Defines the status of the step in the checklist. Common values include Completed, Not Completed, and N/A.

1.4 Reports
Defines reports concerning the common operating picture for the player to reference during the simulation.

Road Closures <RoadClosures>: Defines a report on the status of the roads during a disaster.
  Child elements of Road Closures:
    Road Closure <RoadClosure>: Defines a road closure.
      Road Name <RoadName>: The name of the road that is closed.
      Closed Date <ClosedDate>: A date/time group indicating when the closure went into effect.
      Closed From <ClosedFrom>: The name of the street at which the closure starts.
      Closed To <ClosedTo>: The name of the street at which the closure ends.

Shelters <Shelters>: Defines a report on the status of the shelters during a disaster.
  Child elements of Shelters:
    Shelter <Shelter>: Defines a shelter.
      Shelter Name <ShelterName>: The name of the shelter.
      Status <Status>: The status of the shelter. Common values are Open, Closed, and Full.
      Accommodations <Accommodations>: The number of accommodations that are available.
      Used <Used>: The number of accommodations that are being used.

Hospitals <Hospitals>: Defines a report on the status of the hospitals during a disaster.
  Child elements of Hospitals:
    Hospital <Hospital>: Defines a hospital.
      Hospital Name <HospitalName>: The name of the hospital.
      Status <Status>: The status of the hospital. Common values are Open, Closed, and Full.
      Accommodations <Accommodations>: The number of accommodations that are available.
      Used <Used>: The number of accommodations that are being used.

Points of Distribution <PointsOfDistribution>: Defines a report on the status of a Point of Distribution (POD).
  Child elements of Points of Distribution:
    Point of Distribution <PointOfDistribution>: Defines a Point of Distribution.
      POD Name <PODName>: The name of the Point of Distribution.
      Status <Status>: The status of the POD. Common values are Open, Closed, and Full.
      Notes <Notes>: Any desired remarks concerning the POD.

Additional User Created Boards <UserCreatedBoards>: A user-defined report on any other aspect the user needs.
  Child elements of Additional User Created Boards:
    User Created Board <UserCreatedBoard>: Defines a user-created board.
      User Defined Field <UserDefinedField>: A user-defined field for the report.

Available Resources <AvailableResources>: Defines resources that are available to aid in the recovery effort.
  Child elements of Available Resources:
    Resource <Resource>: Defines a resource.
      Id <Id>: A unique internal reference to a resource. Aids in identifying resources at the computer level.
      Item <Item>: The resource that is available.
      Quantity <Quantity>: The quantity of items that are available for use.
      Cost Per Unit <CostPerUnit>: The cost per unit of the item that is available.

Additional User Created Boards <UserCreatedBoards>: A user-defined report on any other aspect the user needs.
  Child elements of Additional User Created Boards:
    User Created Board <UserCreatedBoard>: Defines a user-created board.
      User Defined Field <UserCreatedField>: A user-defined field for the report.
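To suggest how a report such as <Shelters> might be consumed by the feedback mechanisms, the following sketch parses a small hand-written fragment and summarizes shelter occupancy; the data and the summary format are invented for illustration.

import xml.etree.ElementTree as ET

SAMPLE = """
<Shelters>
  <Shelter>
    <ShelterName>North HS</ShelterName><Status>Open</Status>
    <Accommodations>400</Accommodations><Used>310</Used>
  </Shelter>
  <Shelter>
    <ShelterName>Bay Middle</ShelterName><Status>Full</Status>
    <Accommodations>250</Accommodations><Used>250</Used>
  </Shelter>
</Shelters>
"""

root = ET.fromstring(SAMPLE)
for shelter in root.findall("Shelter"):
    name = shelter.findtext("ShelterName")
    used = int(shelter.findtext("Used"))
    capacity = int(shelter.findtext("Accommodations"))
    # Report status and fraction of accommodations in use for each shelter.
    print(f"{name}: {shelter.findtext('Status')}, {used}/{capacity} "
          f"({used / capacity:.0%} occupied)")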

1.5 Communication Tools
Defines communication tools available for the player.

Email:
  Child elements of Email:
    Email Message <EmailMessage>: Defines an email message.
      To <To>: The recipient(s) of the email.
      From <From>: The initiator of the email.
      CC <CC>: Additional recipient(s) of the email.
      BCC <BCC>: Additional recipient(s) of the email. The address of these individuals is hidden from the To and CC email recipient(s).
      Subject <Subject>: The subject of the email.
      Message Text <MessageText>: The text of the email.

Chat <Chat>: Defines a log of the chat session of a player.
  Child elements of Chat:
    Chat Session:
      Initiator <Initiator>: The initiator of the chat session.
      Receiver <Receiver>: The person receiving the invitation to chat from the initiator.
      Chat Text <ChatText>: The text of the chat session.

APPENDIX C

SIMULATION XML SPECIFICATION

Top Level Document Object Model for a Simulation:

<Simulation>
  <Name of the Exercise> <Player Information> <State of the CIMS> <Script> <Player Evaluation Metrics>

XML Specification

1.1 Name of the Exercise
Data and metadata about an exercise.

Name of Exercise <ExerciseName>: Defines the name of the exercise being simulated.

1.2 Player Information
Data and metadata about the players.

Person <Person>: Defines a player of the simulation.
  Child elements of Person:
    Id <Id>: A unique internal reference to a player. Aids in identifying individuals at the computer level.
    Name <Name>: The name of the player.
    User Name <UserName>: Defines a unique user name for the player.

Role <Role>: Indicates what role the player assumes during the exercise. For example, Operations Section Chief.

Script Training On <Script>: The name of the script on which the player is training.

1.3 CIMS
Data and metadata about the CIMS and the state of the CIMS.

Exercise Background <ExerciseBackground>: Defines background information for the players concerning the exercise.
  Child elements of Exercise Background:
    Player Handbooks <PlayerHandbooks>: Defines a Player Handbook for an exercise.
      Handbook:
        Purpose <Purpose>: Defines the purpose of the handbook.
        Scope <Scope>: Defines the scope of the exercise.
        Concept of Play <ConceptOfPlay>: Defines the concept of play for the exercise.
        Exercise Assumptions <ExerciseAssumptions>: Defines any assumption made with respect to the exercise.
        Exercise Artificialities <ExerciseArtificialities>: Defines any artificialities that are to be accepted as part of the exercise.
        Exercise Simulation <ExerciseSimulation>: Defines any areas that need to be simulated to compensate for non-participating organizations, individuals, and field units that would actually be deployed in a real-world response.
        Scenario Narrative <ScenarioNarrative>: Defines weather information and all relevant background information that would be available to the exercise players at the start of the exercise.
        Player Procedures and Responsibilities <PlayerProceduresAndResponsibilities>: Defines how the exercise will be conducted, including player procedures for beginning the exercise and ensuring play continues. This should include the specific roles and responsibilities of the exercise players, their interaction with the control/simulation team and the evaluation team, and procedures for dealing with problems which may arise during the exercise.
        Safety and Security <SafetyAndSecurity>: Defines safety procedures, including safety concerns and points of contact on safety issues. Provides an overview of security measures such as access control, site restrictions, badge procedures, and incident reporting related to the exercise participants.
        Reporting <Reporting>: Defines data produced by players (e.g., staff duty logs, staff officer action logs/reports, minutes from staff meetings, and telephone conversation records).
        Administrative Systems <AdministrativeSystems>: Defines support provided to players such as copying, word processing, office supplies, and chart paper. Also defines set-up of the room to conduct the functional exercise, including name tags, plans, SOPs, chart paper, office supplies, audiovisual equipment, and refreshments for participants, etc.
    Initial Status <InitialStatus>: Defines initial status for an exercise.
      Status Report <StatusReport>: Defines an initial status report.
        Text <Text>: The text of the report.
        Figure <Figure>: A figure.

Boards <Boards> A convenient way to enter and update information to the system

Begin Child Elements of Boards Position Log <PositionLog> Defines a log of

notable events that occurred during the shift of an individual for a particular role. Aids in documentation of the disaster and in smooth changing of shifts during a disaster.

Begin Child Elements of Position Log Log <Log> Defines a log entry

Begin Child Elements of Log Date/Time <DateTime> The date/time that a

note is being added to the position log.

Notes <Notes> A description of the event along with any additional remarks concerning the event.

End Child Elements of Log End Child Elements of Position Log

Significant Events <SignificantEvents> Defines a significant event that needs to be documented during a disaster

Begin Child Elements of Significant Events

168

Significant Event <SignificantEvent> Defines a significant event

Begin Child Elements of Significant Event Date/Time <DateTime> The date/time that the

significant event is being documented

Notes <Notes> A description of the significant event along with any additional remarks concerning the significant event.

Name <Name> The name of the person documenting the significant event.

End Child Elements of Significant Event End Child Elements of Significant Events

Mission/Tasks <MissionTasks> Defines a mission or task to be accomplished during the exercise

Begin Child Elements of Mission/Tasks Mission/Task <MissionTask> Defines a

Mission/Task Begin Child Elements of Mission/Task Time <Time> The date/time of the

request Agency <Agency> The agency of which

the mission/task is being requested

Mission/Task <MissionTask> The mission or task that needs to be accomplished

Priority <Priority> The priority of the mission/task to be accomplished. Common values include Flash, High, Medium, and Low.

Status <Status> The status of the Mission/Task. Common values include Pending and

169

Complete. End Child Elements of Mission/Task

End Child Elements of Mission/Tasks Resource Requests <ResourceRequests> Defines requests for

additional resources needed during a disaster

Begin Child Elements of Resource Requests Resource Request <ResourceRequest> Defines a resource

request Begin Child Elements of Resource Request Id <Id> A unique internal

reference to a resource request. Aids in identifying resource requests at the computer level.

Date/Time <DateTime> The date/time at which the request is being made

Resource Requested <ResourceRequested> The resource being requested

Type <Type> The category of the resource being requested. Common values include Type I, Type II, Type III, Type IV, and Other.

Quantity <Quantity> The quantity desired of the resource being requested

Priority <Priority> The priority of the resource request. Common values include Flash, High, Medium, and Low.

Reason Needed <ReasonNeeded> The reason the resource is being requested

Special Instructions <SpecialInstructions> Any special instructions that should be adhered to with respect to the

170

request Requesting Agency <RequestingAgency> The agency making

the request Status <Status> The status of the

resource request. Common values include Pending, Deployed, Completed, and Cancelled.

Requestor’s Name <RequestorsName> The name of the person within the agency making the request

Requestor’s Phone Number

<RequestorsPhoneNumber>

The telephone number of the person making the request

Address <Address> The address of the agency of the person making the request

End Child Elements of Resource Request End Child Elements of Resource Requests

Road Closures <RoadClosures> Defines a report on the status of the roads during a disaster

Begin Child Elements of Road Closures Road Closure <RoadClosure> Defines a road

closure Begin Child Elements of Road Closure

Road Name <RoadName> The name of the road that is closed

Closed Date <ClosedDate> A date/time group indicating when the closure went into affect

Closed From <ClosedFrom> The name of the street at which the closure starts

Closed To <ClosedTo> The name of the street at which the closure ends

End Child Elements of Road Closure End Child Elements of Road Closures

171

Shelters <Shelters> Defines a report on the status of the shelters during a disaster

Begin Child Elements of Shelters Shelter <Shelter> Defines a shelter.

Begin Child Elements of Shelter Shelter Name <ShelterName> The name of the

shelter Status <Status> The status of the

shelter. Common values are Open, Closed, and Full

Accommodations <Accommodations> The number of accommodations that are available

Used <Used> The number of accommodations that are being used

End Child Elements of Shelter End Child Elements of Shelters

Hospitals <Hospitals> Defines a report on the status of the hospitals during a disaster

Begin Child Elements of Hospitals Hospital <Hospital> Defines a hospital

Begin Child Elements of Hospital Hospital Name <HospitalName> The name of the

hospital Status <Status> The status of the

shelter. Common values are Open, Closed, and Full

Accommodations <Accommodations> The number of accommodations that are available

Used <Used> The number of accommodations that are being used

End Child Elements of Hospital End Child Elements of Hospitals

Points of Distribution <PointsOfDistribution> Defines a report on

172

the status of a Point of Distribution (POD)

Begin Child Elements of Points of Distribution Point of Distribution <PointOfDisttribution> Defines a Point of

Distribution Begin Child Elements of Point of Distribution POD Name <PODName> The name of the

Point of Distribution Status <Status> The status of the

POD. Common values are Open, Closed, and Full

Notes <Notes> Any desired remarks concerning the POD

End Child Elements of Point of Distribution End Child Elements of Points of Distribution

Additional User Created Boards

<UserCreatedBoards> A user defined report on any other aspect the user needs

Begin Child Elements of Additional User Created Boards User Created Board <UserCreatedBoard> Defines a user

created board Begin Child Elements of user created board User Defined Field <UserDefinedField> A user defined field

for the report. End Child Elements of user created board

End Child Elements of Additional User Created Boards End Child Elements of Boards Links <Links> External Links that

may be useful for reference

Begin Child Elements of Links Link <Link> Defines an external

link that may be useful to reference. Example links include CNN and the National Weather Service.

End Child Elements of Links References <References> Items which the user

can reference if

173

needed Begin Child Elements of References Incident Action Plans <IncidentActionPlans> Defines Incident

Action Plans Begin Child Elements of Incident Action Plans Incident Action Plan <IncidentActionPlan> Defines an Incident

Action Plan. Begin Child Elements of Incident Action Plan Background <Background> Describes the

background of the disaster.

ICS 202 (Response Objectives)

<ICS202> Describes objectives for the disaster

ICS 203 (Organization List)

<ICS203> Defines the organization list for the response effort

ICS 204 (Assignment List)

<ICS204> Defines a personnel assignment list for the response effort

ICS 206 (Communication Plans)

<ICS206> Defines a communications plan for the response effort

ICS 220 (Air Operations Summary)

<ICS220> Defines a summary of air operations for the response effort

ICS 200 (Medical Plan)

<ICS200> Defines a medical plan for the response effort

Map/Chart <MapChart> Outlines a map or chart of the affected area.

Weather Forecast/Tides/Currents

<Weather> Highlights applicable weather for the affected area.

Other Attachments <OtherAttachments> Includes any additional applicable attachments desired.

End Child Elements of Incident Action Plan End Child Elements of Incident Action Plans

Disaster Maps <DisasterMaps> Defines maps displaying information about the

174

disaster for the emergency managers

Begin Child Elements of Disaster Maps Map <Map> Defines a disaster

map Begin Child Elements of Map Flood <Flood> A mark indicating

flood activity during a disaster

Fire <Fire> A mark indicating a fire during a disaster

Radioactivity <Radioactivity> A mark indicating radioactivity during a disaster

First Aid <FirstAid> A mark indicating that first aid is needed during a disaster

Other <Other> A mark indicating other user defined activity during a disaster

End Child Elements of Map End Child Elements of Disaster Maps

Position Responsibilities <PositionResponsibilities>

Defines player responsibilities for the position that the player is assuming

Begin Child Elements of Position Responsibilities Position Responsibility <PositionResponsibility> Defines a player

responsibility Begin Child Elements of Position Responsibility Position <Position> Defines the position

that the player is assuming during the exercise

Responsibilities <Responsibilities> Defines the responsibilities for the position

Begin Child Elements of Responsibilities Responsibility <Responsibility> Defines a

responsibility associated with the position

175

End Child Elements of Responsibilities End Child Elements of Position Responsibilities

Standard Operating Procedures

<StandardOperatingProcedures>

Defines standard operating procedures for the Emergency Operations Center

Begin Child Elements of Standard Operating Procedures Standard Operating Procedure

<StandardOperatingProcedure>

Defines a standard operating procedure

Begin Child Elements of Standard Operating Procedure Event <Event> Defines an event for

which there are standard procedures

Standard Operating Procedure

<StandardOperatingProcedure>

Defines a standard procedures for the event

End Child Elements of Standard Operating Procedures End Child Elements of Standard Operating Procedures

Checklists <Checklists> Defines checklists to aid the individual in the position that the player is assuming

Begin Child Elements of Checklists Checklist <Checklist> Defines a checklist

Begin Child Elements of Checklist Procedure <Procedure> Defines a procedure

for which there is a checklist

Step <Step> Defines a step in the checklist

Status <Status> Defines the status of the step in the checklist. Common values include Completed, Not Completed, N/A.

End Child Elements of Checklist End Child Elements of Checklists

End Child Elements of References Reports <Reports> Defines reports

concerning the common operating picture for the player

176

to reference during the simulation

Begin Child Elements of Reports Road Closures <RoadClosures> Defines a report on

the status of the roads during a disaster

Begin Child Elements of Road Closures Road Closure <RoadClosure> Defines a road

closure Begin Child Elements of Road Closure Road Name <RoadName> The name of the road

that is closed Closed Date <ClosedDate> A date/time group

indicating when the closure went into affect

Closed From <ClosedFrom> The name of the street at which the closure starts

Closed To <ClosedTo> The name of the street at which the closure ends

End Child Elements of Road Closure End Child Elements of Road Closures

Shelters <Shelters> Defines a report on the status of the shelters during a disaster

Begin Child Elements of Shelters Shelter <Shelter> Defines a shelter.

Begin Child Elements of Shelter Shelter Name <ShelterName> The name of the

shelter Status <Status> The status of the

shelter. Common values are Open, Closed, and Full

Accommodations <Accommodations> The number of accommodations that are available

Used <Used> The number of accommodations that are being used

177

End Child Elements of Shelter End Child Elements of Shelters

Hospitals <Hospitals> Defines a report on the status of the hospitals during a disaster

Begin Child Elements of Hospitals Hospital <Hospital> Defines a hospital

Begin Child Elements of Hospital Hospital Name <HospitalName> The name of the

hospital Status <Status> The status of the

shelter. Common values are Open, Closed, and Full

Accommodations <Accommodations> The number of accommodations that are available

Used <Used> The number of accommodations that are being used

End Child Elements of Hospital End Child Elements of Hospitals

Points of Distribution <PointsOfDistribution> Defines a report on the status of a Point of Distribution (POD)

Begin Child Elements of Points of Distribution Point of Distribution <PointOfDisttribution> Defines a Point of

Distribution Begin Child Elements of Point of Distribution POD Name <PODName> The name of the

Point of Distribution Status <Status> The status of the

POD. Common values are Open, Closed, and Full

Notes <Notes> Any desired remarks concerning the POD

End Child Elements of Point of Distribution End Child Elements of Points of Distribution

Additional User Created Boards

<UserCreatedBoards> A user defined report on any other aspect

178

the user needs Begin Child Elements of Additional User Created Boards User Created Board <UserCreatedBoard> Defines a user

created board Begin Child Elements of user created board User Defined Field <UserDefinedField> A user defined field

for the report. End Child Elements of user created board

End Child Elements of Additional User Created Boards Available Resources <AvailableResources> Defines resources

that are available to aid in the recovery effort

Begin Child Elements of Available Resources Resource <Resource> Defines a resource

Begin Child Elements of Available Resources Id <Id> A unique internal

reference to a resource. Aids in identifying resources at the computer level

Item <Item> The resource that is available

Quantity <Quantity> The quantity of items that are available for use

Cost Per Unit <CostPerUnit> The cost per unit of the item that is available.

End Child Elements of Resource End Child Elements of Available Resources

Additional User Created Boards

<UserCreatedBoards> A user defined report on any other aspect the user needs

Begin Child Elements of Additional User Created Boards User Created Board <UserCreatedBoard> Defines a user

created board. Begin Child Elements of Additional User Created Boards User Defined Field <UserCreatedField> A user defined field

for the report. End Child Elements of User Created Board

End Child Elements of Additional User Created Boards

179

End Child Elements of Reports Communication Tools <CommunicationTools> Defines

Communication Tools available for the Player

Begin Child Elements of Communication Tools Email

Begin Child Elements of Email Email Message <EmailMessage> Defines an email

message. Begin Child Elements of Email Message

To <To> The recipient(s) of the email.

From <From> The initiator of the email.

CC <CC> Additional recipient(s) of the email

BCC <BCC> Additional recipient(s) of the email. The address of these individuals is hidden from the to and CC email recipient(s).

Subject <Subject> The subject of the email.

Message Text <MessageText> The text of the email. End Child Elements of Email Message

End Child Elements of Email Chat <Chat> Defines a log of the

chat session of a player

Begin Child Elements of Chat Chat Session

Begin Child Elements of Chat Session Initiator <Initiator> The initiator of the

chat session Receiver <Receiver> The person receiving

the invitation to chat from the initiator

180

Chat Text <ChatText> The text of the chat session

End Child Elements of Chat Session End Child Elements of Chat

End Child Elements of Communication Tools

Exercise Log <ExerciseLog>: Defines a log of the actions and expected actions for the player during the exercise.
  Child elements of Exercise Log:
    Log: Defines a log.
      Time <Time>: A date/time group indicating when an action occurred.
      User <User>: The name of the player executing the action.
      Page <Page>: The name of the page (place) in the CIMS on which the action is taking place.
      Action <Action>: The action that was executed on the page.

1.4 Script
Data and metadata about an exercise script.

Script <Script>: Defines a script for the exercise.
  Child elements of Script:
    Injects <Injects>: Defines injects for the script.
      Inject Number <InjectNumber>: A unique identifier for the inject. Aids in identifying injects at the human and at the computer level.
      Time <Time>: Defines the time that the inject is supposed to arrive for the player.
      Receiving Agency <ReceivingAgency>: Defines the receiving agency of the inject.
      Sending Agency <SendingAgency>: Defines the agency sending the inject.
      Message Type <MessageType>: Defines the type of message being received. Common values include Task, Resource Request, Informational, and Informational Request.
      Message Text <MessageText>: Defines the text of the message being received.
      Communication Medium <CommunicationMedium>: Defines the communication medium over which the inject is received. Common values include face-to-face, telephone, cellphone, radio, and PDA.
      Exercise Plan Objective <ExercisePlanObjective>: Defines the exercise plan objective being assessed for the inject being received.
      Expected Action <ExpectedAction>: Defines the expected action of the player in response to the inject.
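To connect this schema to the inject-driven engine described in Appendix A, the sketch below parses a hypothetical <Script> fragment and lists its injects in time order. The element names follow the listing above; the nesting of repeated <Injects> entries and all sample values are assumptions made for illustration.

import xml.etree.ElementTree as ET

SCRIPT = """
<Script>
  <Injects>
    <InjectNumber>2</InjectNumber><Time>00:20</Time>
    <ReceivingAgency>Operations Section Chief</ReceivingAgency>
    <SendingAgency>ESF-8</SendingAgency>
    <MessageType>Resource Request</MessageType>
    <MessageText>Request two ambulance strike teams.</MessageText>
    <CommunicationMedium>telephone</CommunicationMedium>
    <ExercisePlanObjective>Resource management</ExercisePlanObjective>
    <ExpectedAction>Enter a resource request within 15 minutes.</ExpectedAction>
  </Injects>
  <Injects>
    <InjectNumber>1</InjectNumber><Time>00:05</Time>
    <ReceivingAgency>Operations Section Chief</ReceivingAgency>
    <SendingAgency>National Weather Service</SendingAgency>
    <MessageType>Informational</MessageType>
    <MessageText>Hurricane warning issued for Miami-Dade County.</MessageText>
    <CommunicationMedium>email</CommunicationMedium>
    <ExercisePlanObjective>Situational awareness</ExercisePlanObjective>
    <ExpectedAction>Post a significant event to the position log.</ExpectedAction>
  </Injects>
</Script>
"""

# Sort the injects by their scheduled time and print a brief delivery schedule.
injects = ET.fromstring(SCRIPT).findall("Injects")
for inj in sorted(injects, key=lambda e: e.findtext("Time")):
    print(inj.findtext("Time"), inj.findtext("MessageType"), "-",
          inj.findtext("MessageText"))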

1.5 Player Evaluation Metrics
Data and metadata concerning player evaluations.

Player Evaluation Metrics <PlayerEvaluationMetrics>: Defines metrics of performance for the players. Helps to determine how the player did during the simulation.
  Child elements of Player Evaluation Metrics:

    Target Capabilities <TargetCapabilities>: Defines target capabilities for the exercise.
      Target Capability: Defines a target capability.
        Mission Area <MissionArea>: Defines the area of focus for the target capability.
        Activity <Activity>: Defines the activity of focus for the target capability.
        Reference <Reference>: Defines the standardized reference for the mission area and activity for the target capability.
        Tasks <Tasks>: Defines the tasks to be accomplished for the target capability.
        Status <Status>: Defines the status of the user actions with respect to the target capability. Common values include Incomplete, Partially Complete, Fully Complete, and Not Applicable.

    Performance Metrics <PerformanceMetrics>: Defines metrics with which the player's performance can be evaluated.
      Performance Metric: Defines a performance metric.
        Mission Area <MissionArea>: Defines the area of focus for the target capability.
        Activity <Activity>: Defines the activity of focus for the target capability.
        Reference <Reference>: Defines the standardized reference for the mission area and activity for the target capability.
        Measure <Measure>: Defines the timeframe in which the performance metric should be accomplished.
        Metric <Metric>: Defines the expected action that should be accomplished within the timeframe allotted.
        Status <Status>: Defines the status of the user actions with respect to the target capability. Common values include Incomplete, Partially Complete, Fully Complete, and Not Applicable.

    Exercise Objectives <ExerciseObjectives>: Defines objectives to be accomplished during the exercise.
      Exercise Objective: Defines an exercise objective.
        Objective <Objective>: Defines the objective to be accomplished during the exercise.
        Status <Status>: Defines the status of the user actions with respect to the target capability. Common values include Incomplete, Partially Complete, Fully Complete, and Not Applicable.

Evaluator Handbooks <EvaluatorHandbooks>: Defines Evaluator Handbooks for the exercise.
  Child elements of Evaluator Handbooks:
    Handbook: Defines an Evaluator Handbook.
      Exercise Name <ExerciseName>: Defines the name of the exercise being simulated.
      Evaluator Name <EvaluatorName>: Defines the name of the person evaluating the player during the exercise.
      Location <Location>: Defines the location of the exercise.
      Date <Date>: Defines the date of the exercise.
      Objective <Objective>: Defines the exercise objective being evaluated.
      Performance Criterion <PerformanceCriterion>: Defines the expected action that should be accomplished within the timeframe allotted during the exercise.
      Comments <Comments>: Defines any additional comments of the evaluator concerning the player and the exercise objective.

Player Evaluations <PlayerEvaluation>: Defines how to evaluate a player.
  Child elements of Player Evaluations:
    Time Received <TimeReceived>: The date/time that an inject was received.
    Time Completed <TimeCompleted>: The date/time that the user completed all actions in response to the inject.
    Individual <Individual>: The person taking the action.
    Page <Page>: A reference to the screen page that the player was on at the time of the action.
    Action <Action>: The action that the player took in response to an event in the simulation.
    Attribute Type <AttributeType>: The type of action the player took in response to an event in the simulation.
    Attribute Value <AttributeValue>: The value of the action that the player took in response to an event in the simulation.
    Expected Action <ExpectedAction>: The expected action that the player is supposed to take in response to an event in the simulation.
    Evaluation <Evaluation>: An assessment of the player's action in response to an event.

Additional Player Evaluation Metrics

<AdditionalPlayerEvaluationMetrics>

Defines Additional Player Evaluation Metrics

Begin Child Elements of Additional Player Evaluations Percentage of Injects Missed

<InjectsMissed> Percentage of injects missed

Average Response Time Per Inject

<AvgResponseTime> Average response time per inject

User Defined Metrics <UserDefined> Defines additional user defined metrics

Begin Child Elements of User Defined Metrics User Metric <UserMetric> Defines a user metric End Child Elements of User Defined Metrics

End Child Elements of Additional Player Evaluations

 

187
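To make the element listing above concrete, here is a minimal, hypothetical sketch of an <AdditionalPlayerEvaluationMetrics> fragment assembled from the element names in this section, together with one way such a fragment could be read using PHP's SimpleXML extension. The element names come from the listing; the values, the variable names, and the use of SimpleXML are illustrative assumptions, not part of the SimEOC specification.

<?php
// Hypothetical example only: element names follow the listing above, but the
// values and this parsing approach are illustrative assumptions.
$xml = <<<XML
<AdditionalPlayerEvaluationMetrics>
  <InjectsMissed>12.5</InjectsMissed>
  <AvgResponseTime>94</AvgResponseTime>
  <UserDefined>
    <UserMetric>Resource requests escalated to the Logistics Section Chief</UserMetric>
  </UserDefined>
</AdditionalPlayerEvaluationMetrics>
XML;

$metrics = simplexml_load_string($xml);                     // parse the fragment
echo "Injects missed (percent): " . (float) $metrics->InjectsMissed . "\n";
echo "Average response time per inject: " . (int) $metrics->AvgResponseTime . "\n";
foreach ($metrics->UserDefined->UserMetric as $userMetric) {
    echo "User-defined metric: " . $userMetric . "\n";      // one line per <UserMetric>
}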

APPENDIX D

PROCESSES AND FLOWCHARTS

Following are flow diagrams for various processes in the EOC. The diagrams shaded blue and white show actual process flows within the EOC. The diagrams shaded yellow and white show which parts of the actual process have been implemented in SimEOC: the yellow shading indicates activities that have been implemented, and the white portions indicate activities that have not been implemented.

D.1 Liaison Process Flow

[Flow diagram. Nodes include: EOC Activates (which level?), Emergency Status Input, Resource Status Input, Analyze, Is there an SOP?, Do I know how to handle this?, Do I have the resources to handle this?, Do I have the authority to handle this?, Consult SOP, Consult Branch Directors and Other Liaisons, Determine Course of Action, Disseminate Information, Assign Tasks, Resource Request, Wait for Resource, Resource Acquired?, and Use Resource to accomplish mission/task, connected by yes/no branches between the decision points.]

Figure D.1. Liaison Flowgraph

D.2 Logistics Resource Request Process Flow

[Flow diagram. The process starts with a request from an agency liaison and involves the liaison, the Government Services Agency (GSA), and the Department of Procurement Management (DPM). Decision points ask whether the resource is in house, available from another agency liaison, or available through predefined contracts; when a resource is obtained it is entered into EM Constellation, and requests that cannot be filled are given to the Logistics Section Chief.]

Figure D.2. Logistics Resource Request Process
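For readers who prefer code to diagrams, the short sketch below restates the decision points named in Figure D.2 as plain PHP. It is only an illustration of the flowchart, not code from SimEOC; the function name, the array keys, and the particular ordering of the checks are assumptions made for the example.

<?php
// Illustrative restatement of the Figure D.2 decision points; not SimEOC code.
// $r carries yes/no answers to the questions asked in the diagram.
function routeResourceRequest(array $r): string
{
    if ($r['in_house']) {
        return 'Obtain the resource in house and enter it into EM Constellation';
    }
    if ($r['other_agency_liaison']) {
        return 'Obtain the resource through the other agency liaison and enter it into EM Constellation';
    }
    if ($r['predefined_contract']) {  // e.g., via GSA or DPM
        return 'Obtain the resource under a predefined contract and enter it into EM Constellation';
    }
    return 'Give the request to the Logistics Section Chief';
}

// Example: a request that can only be filled through a predefined contract.
echo routeResourceRequest([
    'in_house'             => false,
    'other_agency_liaison' => false,
    'predefined_contract'  => true,
]) . "\n";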

D.3 Logistics Resource Request Process Flow (SimEOC Implementation)

[Flow diagram with the same steps as Figure D.2, shaded to show which parts of the logistics resource request process have been implemented in SimEOC.]

Figure D.3. Logistics Resource Request Process (SimEOC Implementation)

D.4 Exercise Developer Process Flow

[Flow diagram. The Exercise Developer develops target capabilities, metrics for each target capability, exercise objectives, status updates, an exercise script, the initial status, exercise handouts, Miami-Dade resources, exercise evaluation guides, and player handbooks.]

Figure D.4. Exercise Developer Flowgraph

D.5 Exercise Developer Process Flow (SimEOC Implementation)

[Flow diagram with the same steps as Figure D.4, shaded to show which parts of the exercise developer process have been implemented in SimEOC.]

Figure D.5. Exercise Developer Flowgraph (SimEOC Implementation)

D.6 Exercise Controller Process Flow

[Flow diagram. The user logs in to the vEOC (repeating the login on an incorrect username or password), opens the Exercise Controller, rearranges injects as needed, starts the exercise, and monitors its flow using the Play/Resume, Pause, Stop/Terminate, Fast Time, and Next Block controls, rearranging injects as needed throughout.]

Figure D.6. Exercise Controller Flowgraph

D.7 Exercise Controller Process Flow (SimEOC Implementation)

[Flow diagram with the same steps as Figure D.6, shaded to show which parts of the exercise controller process have been implemented in SimEOC.]

Figure D.7. Exercise Controller Flowgraph (SimEOC Implementation)

D.8 Exercise Design Lifecycle

[Flow diagram. The user conducts the exercise until it is complete, holds a hotwash with first evaluations of the exercise, the organization, and individual positions, reviews the exercise evaluation guides, develops player reports and after action reports, and completes the full evaluations of the exercise, the organization, and individual positions.]

Figure D.8. Exercise Design Lifecycle

D.9 Exercise Design Lifecycle (SimEOC Implementation)

[Flow diagram with the same steps as Figure D.8, shaded to show which parts of the exercise design lifecycle have been implemented in SimEOC.]

Figure D.9. Exercise Design Lifecycle (SimEOC Implementation)

APPENDIX E

CONCEPT MAPS

Following are concept maps for various positions in the EOC. The fully colored diagrams show the actual mental models for each position. The diagrams shaded yellow and white show which parts of the mental models have been implemented in SimEOC: the yellow shading indicates sections that have been implemented, and the white portions indicate sections that have not been implemented.

E.1 Emergency Manager Concept Map

[Concept map relating Emergency Managers to: decisions; people (other agencies, NGOs, liaisons, the public, incident command, first responders, elected officials, and federal and state partners); property; resources; risk and damage; exercise background information; the crisis information management system; information; plans, procedures, and standard operating procedures; contingencies; and the IAP, ICS Forms, and Incident Objectives. Linking phrases include review, coordinate with, collaborate with, communicate with, organize, filter, familiarize with, manage, use, create, fill out, make, visualize, share, act on, prioritize, control access to, plan for, assess, mitigate, protect, print, interpret, synthesize, and comply with.]

Figure E.1. Emergency Manager Concept Map

E.2 Emergency Manager Concept Map (SimEOC Implementation)

[The same concept map as Figure E.1, shaded to show which parts of the emergency manager's mental model have been implemented in SimEOC.]

Figure E.2. Emergency Manager Concept Map (SimEOC Implementation)

E.3 Exercise Developer Concept Map

[Concept map relating Exercise Developers to: scripts consisting of injects; objectives and target capabilities taken from past exercises; evaluation metrics; controller handbooks, evaluator handbooks, and player handouts sent to players; reports and after action reports; and determining whether individuals, groups, and the organization as a whole met their exercise objectives and metrics.]

Figure E.3. Exercise Developer Concept Map

E.4 Exercise Developer Concept Map (SimEOC Implementation)

[The same concept map as Figure E.3, shaded to show which parts of the exercise developer's mental model have been implemented in SimEOC.]

Figure E.4. Exercise Developer Concept Map (SimEOC Implementation)

E.5 Exercise Controller Concept Map

[Concept map: Exercise Controllers moderate the flow of the exercise by controlling which injects are received, who receives injects, when injects are received, and in what order injects are received; the controller sometimes also acts as an exercise evaluator.]

Figure E.5. Exercise Controller Concept Map

E.6 Exercise Controller Concept Map (SimEOC Implementation)

[The same concept map as Figure E.5, shaded to show which parts of the exercise controller's mental model have been implemented in SimEOC.]

Figure E.6. Exercise Controller Concept Map (SimEOC Implementation)

E.7 Exercise Evaluator Concept Map

[Concept map: Exercise Evaluators watch particular individuals in the organization, determine whether the individual and the organization met the exercise objectives and metrics assigned to them (based on objectives and target capabilities taken from past exercises), give feedback, and write reports for the after action report.]

Figure E.7. Exercise Evaluator Concept Map

E.8 Exercise Evaluator Concept Map (SimEOC Implementation)

[The same concept map as Figure E.7, shaded to show which parts of the exercise evaluator's mental model have been implemented in SimEOC.]

Figure E.8. Exercise Evaluator Concept Map (SimEOC Implementation)

E.9 Planning Concept Map

[Concept map: Planners moderate situational meetings, create Incident Action Plans (IAPs) for emergency managers, plan for crises during normal operations, and plan for emergency operations during crises.]

Figure E.9. Planner Concept Map

E.10 Planning Concept Map (SimEOC Implementation)

[The same concept map as Figure E.9, shaded to show which parts of the planner's mental model have been implemented in SimEOC.]

Figure E.10. Planner Concept Map (SimEOC Implementation)

APPENDIX F

SYSTEM CAPABILITIES

1. Trainee
  1.1. Main Panel
    1.1.1. Exercise Background
      1.1.1.1. View EOC Layout
      1.1.1.2. View Player Handbook
      1.1.1.3. Initial Status
        1.1.1.3.1. View Starting Status
    1.1.2. Boards
      1.1.2.1. Position Log
        1.1.2.1.1. Post a Position Log
        1.1.2.1.2. Edit a Position Log
        1.1.2.1.3. Sort the Position Log
        1.1.2.1.4. Delete a Position Log
      1.1.2.2. Significant Events
        1.1.2.2.1. Post a Significant Event
        1.1.2.2.2. Edit a Significant Event
        1.1.2.2.3. Delete a Significant Event
      1.1.2.3. Status Boards
        1.1.2.3.1. Infrastructure
        1.1.2.3.2. Human Services
        1.1.2.3.3. Public Safety
      1.1.2.4. Mission/Tasks
        1.1.2.4.1. Create a Mission/Task
        1.1.2.4.2. Edit/Update Mission/Task
        1.1.2.4.3. Sort Mission/Tasks
        1.1.2.4.4. Delete Mission/Task
      1.1.2.5. Resource Requests
        1.1.2.5.1. Submit a Resource Request
        1.1.2.5.2. Edit/Update a Resource Request
        1.1.2.5.3. Delete a Resource Request
        1.1.2.5.4. Check the status of a Resource Request
    1.1.3. Links
      1.1.3.1. View Links
    1.1.4. References
      1.1.4.1. Incident Action Plan
        1.1.4.1.1. View Incident Action Plan
      1.1.4.2. Disaster Map
        1.1.4.2.1. View the Disaster Map
        1.1.4.2.2. Edit/Update Disaster Map
        1.1.4.2.3. Clear Disaster Map
        1.1.4.2.4. Save Disaster Map
      1.1.4.3. Position Responsibilities
        1.1.4.3.1. View position responsibilities
    1.1.5. Reports
      1.1.5.1. Road Closures
        1.1.5.1.1. Check Status of Road Closures
        1.1.5.1.2. Create Status of Road Closures
        1.1.5.1.3. Edit/Update status of Road Closures
        1.1.5.1.4. Delete status of Road Closures
      1.1.5.2. Shelters
        1.1.5.2.1. Check Status of Shelters
        1.1.5.2.2. Create Status of Shelters
        1.1.5.2.3. Edit/Update status of Shelters
        1.1.5.2.4. Delete status of Shelters
      1.1.5.3. Points of Distribution (PODs)
        1.1.5.3.1. Check Status of PODs
        1.1.5.3.2. Create Status of PODs
        1.1.5.3.3. Edit/Update status of PODs
        1.1.5.3.4. Delete status of PODs
      1.1.5.4. Hospitals
        1.1.5.4.1. Check Status of Hospitals
        1.1.5.4.2. Create Status of Hospitals
        1.1.5.4.3. Edit/Update status of Hospitals
        1.1.5.4.4. Delete status of Hospitals
      1.1.5.5. Lives Data
        1.1.5.5.1. View Lives Data
        1.1.5.5.2. Add Lives Data
        1.1.5.5.3. Edit/Update Lives Data
        1.1.5.5.4. Delete Lives Data
    1.1.6. Logistics
      1.1.6.1. Miami Dade Resources
        1.1.6.1.1. View Miami Dade Resources
        1.1.6.1.2. Add a Miami Dade Resource
        1.1.6.1.3. Edit/Update Miami Dade Resources
        1.1.6.1.4. Delete Miami Dade Resources
      1.1.6.2. Resources
        1.1.6.2.1. Acquire a Contract Resource
          1.1.6.2.1.1. In-house
          1.1.6.2.1.2. Out-of-house
        1.1.6.2.2. Approve a Resource Request
          1.1.6.2.2.1. Update a Resource Request
          1.1.6.2.2.2. Change the status of a resource request
          1.1.6.2.2.3. Add logistical notes to a resource request
  1.2. Exercise Panel
    1.2.1. Disaster Assistant
      1.2.1.1. Ask a Question to the Disaster Assistant
    1.2.2. Dashboards
      1.2.2.1. Cost to County
        1.2.2.1.1. View dashboard data
      1.2.2.2. Casualties
        1.2.2.2.1. View dashboard data
    1.2.3. Received Injects
      1.2.3.1. View Received Injects
    1.2.4. Communication Tools
      1.2.4.1. Chat
        1.2.4.1.1. Face-to-Face
        1.2.4.1.2. Telephone
        1.2.4.1.3. Radio
        1.2.4.1.4. Cellphone
        1.2.4.1.5. PDA
    1.2.5. Scripting
      1.2.5.1. The exercise begins (Play)
      1.2.5.2. The exercise is paused (Pause)
      1.2.5.3. The exercise is terminated (Stop)
      1.2.5.4. The exercise is in Normal Time (Normal Time)
      1.2.5.5. The exercise is in Fast Time (Fast Time)
      1.2.5.6. Drop the next block (Next Block)
    1.2.6. Injects
      1.2.6.1. Acknowledge an inject
      1.2.6.2. Clarify an inject
    1.2.7. Global Updates
      1.2.7.1. Receive a global update
    1.2.8. Exercise Log (not seen in player console)
      1.2.8.1. Log chats
      1.2.8.2. Log significant events
      1.2.8.3. Log user actions during the exercise
      1.2.8.4. Log position logs

2. Exercise Developer/Exercise Controller
  2.1. Exercise Development Tools
    2.1.1. Handbook Developer
      2.1.1.1. Update the handbook developer
    2.1.2. Player Responsibilities Developer
      2.1.2.1. Add player responsibilities
      2.1.2.2. Delete player responsibilities
    2.1.3. Initial Status Developer
      2.1.3.1. Create starting status
        2.1.3.1.1. Update starting status
          2.1.3.1.1.1. Insert and Update Text
          2.1.3.1.1.2. Insert and Update Figures
        2.1.3.1.2. Create multiple starting status reports
    2.1.4. Target Capabilities Developer
      2.1.4.1. Target Capabilities
        2.1.4.1.1. Add new target capabilities
          2.1.4.1.1.1. Add target capability from database
        2.1.4.1.3. Delete target capabilities from script
      2.1.4.2. Target Capability Metrics
        2.1.4.2.1. Add target capability metrics to script
          2.1.4.2.1.1. Add new target capability metric
          2.1.4.2.1.2. Add target capability metric from database
        2.1.4.2.3. Delete target capability metrics from script
      2.1.4.3. Exercise Objectives
        2.1.4.3.1. Create exercise objectives
        2.1.4.3.3. Delete exercise objectives
    2.1.5. Script Developer
      2.1.5.1. Create a Script
      2.1.5.2. Edit Script
        2.1.5.2.1. Injects
          2.1.5.2.1.1. Add inject from Database
          2.1.5.2.1.2. Add New Inject
          2.1.5.2.1.3. Delete an inject from the script
          2.1.5.2.1.4. Edit an inject
      2.1.5.3. Delete Script
      2.1.5.4. Import/Upload Script (in CSV format)
      2.1.5.6. Master Inject Database
        2.1.5.6.1. Add New Inject to the Database
        2.1.5.6.2. Delete an inject from the Database
    2.1.6. Database Control Tools
      2.1.6.1. Reset the tables for a script
      2.1.6.2. Archive the script
      2.1.6.3. Restore a script from the archives
  2.2. Exercise Control and Tracking
    2.2.1. Exercise Controller
      2.2.1.1. Control the Exercise
        2.2.1.1.1. Start Exercise
        2.2.1.1.2. Pause Exercise
        2.2.1.1.3. Terminate Exercise
        2.2.1.1.4. Next Block
        2.2.1.1.5. Fast Time
        2.2.1.1.7. Delete Injects on the Fly
      2.2.1.2. Exercise Log
        2.2.1.2.1. View the exercise log
        2.2.1.2.2. Filter the exercise log
        2.2.1.2.3. Download as a CSV
      2.2.1.3. Communication Tools
        2.2.1.3.1. Chat
          2.2.1.3.1.1. Face-to-Face
          2.2.1.3.1.2. Telephone
          2.2.1.3.1.3. Radio
          2.2.1.3.1.4. Cellphone
          2.2.1.3.1.5. PDA
  2.3. Exercise Reports and Evaluation
    2.3.1. Player Reports
      2.3.1.1. View Player Reports
      2.3.1.2. Filter Player Reports
    2.3.2. Evaluation Metrics
      2.3.2.1. View Evaluation Metrics
  2.4. External Links
    2.4.1. View external links
  2.5. References
    2.5.1. Disaster Map
      2.5.1.1. View the Disaster Map
      2.5.1.2. Edit/Update Disaster Map
      2.5.1.3. Clear Disaster Map
      2.5.1.4. Save Disaster Map
    2.5.2. Miami Dade Resources
      2.5.2.1. Add a Miami Dade Resource
      2.5.2.2. View Miami Dade Resources
      2.5.2.4. Delete Miami Dade Resources

3. Researcher
  3.1. Researcher Tools
  3.2. Evaluation Metrics
    3.2.1. Exercise Reports
      3.2.1.1. View evaluation metrics
    3.2.2. Exercise Log
      3.2.2.1. View the exercise log
      3.2.2.2. Filter the exercise log
  3.3. Player Reports
    3.3.1. View player report
  3.4. Position Logs
    3.4.1. View position log
  3.5. References
    3.5.1. Disaster Map
      3.5.1.1. View the Disaster Map

4. Evaluator
  4.1. Exercise Evaluation Tools
    4.1.1. View Exercise Evaluation Guide
    4.1.2. Edit Exercise Evaluation Guide
    4.1.3. View Exercise Evaluation Guide Analysis Sheet
    4.1.4. Edit Exercise Evaluation Guide Analysis Sheet
  4.2. References
    4.2.1. View Player Responsibilities
    4.2.2. View Target Capabilities
    4.2.3. View Disaster Map
  4.3. Communication Tools
    4.3.1. Chat
      4.3.1.1. Face-to-Face
      4.3.1.2. Telephone
      4.3.1.3. Radio
      4.3.1.4. Cellphone
      4.3.1.5. PDA

5. Administrator
  5.1. User Profiles
    5.1.1. Create User Logins
    5.1.2. Edit User Logins
    5.1.3. Delete User Logins
  5.2. System Roles
    5.2.1. Add Agency
    5.2.2. Delete Agency
    5.2.3. Add Department
    5.2.4. Delete Department
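As one concrete example of how a capability in this outline maps onto the storage layer, the sketch below shows how "Post a Position Log" (1.1.2.1.1) might insert a row into the positionlog table whose columns appear in Appendix G. It is a minimal illustration using PHP's PDO extension: the database name, credentials, and example values are placeholders, and the actual SimEOC page for this capability (./main/positionlog/processpositionlog.php in Appendix H) may be implemented differently.

<?php
// Minimal sketch: record one position log entry. Column names follow the
// positionlog table in Appendix G; the DSN, credentials, and values are
// placeholders rather than the actual SimEOC configuration.
$pdo = new PDO(
    'mysql:host=localhost;dbname=simeoc', 'user', 'password',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
);

$stmt = $pdo->prepare(
    'INSERT INTO positionlog (in_datetime, in_script, in_user, in_role, notes, archived)
     VALUES (NOW(), :script, :user, :role, :notes, 0)'
);
$stmt->execute([
    ':script' => 'Hurricane Drill 1',        // active exercise script
    ':user'   => 'jsmith',                   // logged-in trainee
    ':role'   => 'Logistics Section Chief',  // EOC position being played
    ':notes'  => 'Requested 500 cots for the South Dade shelter.',
]);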

APPENDIX G

TABLE SCHEMA


 

action +-------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | page | varchar(100) | YES | | NULL | | +-------+--------------+------+-----+---------+----------------+ actionattribute +-----------+-------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-----------+-------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | action_id | int(11) | YES | MUL | NULL | | | type | varchar(50) | YES | MUL | NULL | | | value | text | YES | | NULL | | +-----------+-------------+------+-----+---------+----------------+ actionattributetype +-------+-------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------+-------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | name | varchar(20) | YES | UNI | NULL | | +-------+-------------+------+-----+---------+----------------+ actiontype +-------+-------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------+-------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | name | varchar(20) | YES | | NULL | | +-------+-------------+------+-----+---------+----------------+ afteraction +-------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_datetime | datetime | YES | | NULL | |


 

| in_script | varchar(20) | YES | MUL | NULL | | | in_user | varchar(30) | YES | MUL | NULL | | | in_role | varchar(40) | YES | MUL | NULL | | | issue | varchar(850) | YES | | NULL | | | means | varchar(250) | YES | | NULL | | | assignee | varchar(250) | YES | | NULL | | | progress | varchar(850) | YES | | NULL | | | archived | int(11) | YES | | 0 | | +-------------+--------------+------+-----+---------+----------------+ aiml +-------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | aiml | text | NO | | NULL | | | pattern | varchar(255) | NO | | NULL | | | thatpattern | varchar(255) | NO | | NULL | | | template | text | NO | | NULL | | | topic | varchar(255) | NO | | NULL | | | filename | varchar(255) | NO | | NULL | | +-------------+--------------+------+-----+---------+----------------+ aiml_userdefined +----------+--------------+------+-----+-------------------+----------------+ | Field | Type | Null | Key | Default | Extra | +----------+--------------+------+-----+-------------------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | aiml | text | NO | | NULL | | | pattern | varchar(255) | NO | | NULL | | | template | varchar(255) | NO | | NULL | | | userid | int(11) | NO | | NULL | | | botid | int(11) | NO | | NULL | | | date | timestamp | NO | | CURRENT_TIMESTAMP | | +----------+--------------+------+-----+-------------------+----------------+ authorized_users +-------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------------+--------------+------+-----+---------+----------------+


 

| name | varchar(30) | NO | UNI | | | | password | varchar(40) | YES | | NULL | | | id | int(11) | NO | PRI | NULL | auto_increment | | email | varchar(255) | YES | | NULL | | | is_admin | tinyint(1) | YES | | 0 | | | is_research | tinyint(1) | YES | | 0 | | | is_exdev | tinyint(1) | YES | | 0 | | | is_regular | tinyint(1) | YES | | 0 | | +-------------+--------------+------+-----+---------+----------------+ botpersonality +-------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | bot | tinyint(4) | NO | MUL | 0 | | | name | varchar(255) | NO | | | | | value | text | NO | | NULL | | +-------+--------------+------+-----+---------+----------------+ branchlog +-------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_datetime | datetime | YES | | NULL | | | in_script | varchar(20) | YES | MUL | NULL | | | in_user | varchar(30) | YES | MUL | NULL | | | in_role | varchar(40) | YES | MUL | NULL | | | datetime | datetime | YES | | NULL | | | notes | varchar(850) | YES | | NULL | | | name | varchar(25) | YES | | NULL | | | archived | int(11) | YES | | 0 | | +-------------+--------------+------+-----+---------+----------------+ contract_resources +----------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +----------+--------------+------+-----+---------+----------------+ | id | int(6) | NO | PRI | NULL | auto_increment |


 

| resource | varchar(50) | YES | | NULL | | | supplier | varchar(100) | YES | | NULL | | | fein | varchar(12) | YES | | NULL | | | address | varchar(100) | YES | | NULL | | | contact | varchar(50) | YES | | NULL | | | phone | varchar(15) | YES | | NULL | | | phone2 | varchar(15) | YES | | NULL | | | fax | varchar(15) | YES | | NULL | | | email | varchar(50) | YES | | NULL | | | contract | varchar(100) | YES | | NULL | | +----------+--------------+------+-----+---------+----------------+ conversation_log +-----------+-----------+------+-----+-------------------+----------------+ | Field | Type | Null | Key | Default | Extra | +-----------+-----------+------+-----+-------------------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | input | text | NO | | NULL | | | response | text | NO | | NULL | | | userid | int(11) | NO | | NULL | | | bot_id | int(11) | NO | | NULL | | | timestamp | timestamp | NO | | CURRENT_TIMESTAMP | | +-----------+-----------+------+-----+-------------------+----------------+ customroles +----------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +----------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | script | varchar(100) | YES | MUL | NULL | | | name | varchar(100) | YES | | NULL | | | owner | varchar(100) | YES | | NULL | | | archived | int(11) | YES | | 0 | | +----------+--------------+------+-----+---------+----------------+ disastermap +-------------+-------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------------+-------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment |


 

| in_datetime | datetime | YES | | NULL | | | in_script | varchar(20) | YES | MUL | NULL | | | in_user | varchar(30) | YES | MUL | NULL | | | in_role | varchar(40) | YES | MUL | NULL | | | type | varchar(20) | YES | | NULL | | | coords | text | YES | | NULL | | | archived | int(11) | YES | | 0 | | +-------------+-------------+------+-----+---------+----------------+ eeganalysis +-------------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------------------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | obssummary | text | YES | | NULL | | | in_script | varchar(250) | YES | | NULL | | | in_area | varchar(250) | YES | | NULL | | | s1title | varchar(250) | YES | | NULL | | | s2title | varchar(250) | YES | | NULL | | | s3title | varchar(250) | YES | | NULL | | | i3title | varchar(250) | YES | | NULL | | | i2title | varchar(250) | YES | | NULL | | | i1title | varchar(250) | YES | | NULL | | | i1lessonslearned | varchar(250) | YES | | NULL | | | i2lessonslearned | varchar(250) | YES | | NULL | | | i3lessonslearned | varchar(250) | YES | | NULL | | | s3lessonslearned | varchar(250) | YES | | NULL | | | s2lessonslearned | varchar(250) | YES | | NULL | | | s1lessonslearned | varchar(250) | YES | | NULL | | | s1analysis | varchar(250) | YES | | NULL | | | s2analysis | varchar(250) | YES | | NULL | | | s3analysis | varchar(250) | YES | | NULL | | | i3analysis | varchar(250) | YES | | NULL | | | i2analysis | varchar(250) | YES | | NULL | | | i1analysis | varchar(250) | YES | | NULL | | | i1refs | varchar(250) | YES | | NULL | | | i2refs | varchar(250) | YES | | NULL | | | i3refs | varchar(250) | YES | | NULL | | | s3refs | varchar(250) | YES | | NULL | | | s2refs | varchar(250) | YES | | NULL | |


 

| s1refs | varchar(250) | YES | | NULL | | | s1recommendations | varchar(250) | YES | | NULL | | | s2recommendations | varchar(250) | YES | | NULL | | | s3recommendations | varchar(250) | YES | | NULL | | | i3recommendations | varchar(250) | YES | | NULL | | | i2recommendations | varchar(250) | YES | | NULL | | | i1recommendations | varchar(250) | YES | | NULL | | | s1relactivity | varchar(250) | YES | | NULL | | | s2relactivity | varchar(250) | YES | | NULL | | | s3relactivity | varchar(250) | YES | | NULL | | | i3relactivity | varchar(250) | YES | | NULL | | | i2relactivity | varchar(250) | YES | | NULL | | | i1relactivity | varchar(250) | YES | | NULL | | | archived | int(11) | YES | | 0 | | +-------------------+--------------+------+-----+---------+----------------+ eegcapabilities +------------+---------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +------------+---------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | caparea | text | YES | | NULL | | | capdescr | text | YES | | NULL | | | capoutcome | text | YES | | NULL | | +------------+---------+------+-----+---------+----------------+ eocpositions +-----------------+--------------+------+-----+---------+-------+ | Field | Type | Null | Key | Default | Extra | +-----------------+--------------+------+-----+---------+-------+ | id | int(11) | YES | | NULL | | | agency | varchar(200) | YES | MUL | NULL | | | owner | varchar(200) | YES | | NULL | | | displaypriority | int(11) | YES | | NULL | | +-----------------+--------------+------+-----+---------+-------+


 

exobj +-------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_datetime | datetime | YES | | NULL | | | in_script | varchar(255) | YES | MUL | NULL | | | in_user | varchar(30) | YES | MUL | NULL | | | objective | text | YES | | NULL | | | archived | int(11) | YES | | 0 | | +-------------+--------------+------+-----+---------+----------------+ hospitals +----------------+-------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +----------------+-------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_datetime | datetime | YES | | NULL | | | in_script | varchar(20) | YES | MUL | NULL | | | in_user | varchar(30) | YES | MUL | NULL | | | in_role | varchar(40) | YES | MUL | NULL | | | name | varchar(50) | YES | | NULL | | | status | varchar(10) | YES | | NULL | | | accommodations | int(11) | YES | | NULL | | | used | int(11) | YES | | NULL | | | archived | int(11) | YES | | 0 | | +----------------+-------------+------+-----+---------+----------------+ injects +-----------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-----------------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_datetime | datetime | YES | | NULL | | | in_script | varchar(30) | YES | MUL | NULL | | | in_user | varchar(30) | YES | MUL | NULL | | | in_role | varchar(40) | YES | MUL | NULL | | | time | int(11) | YES | | NULL | | | sender | varchar(250) | YES | | NULL | | | messagetype | varchar(250) | YES | | NULL | |


 

| messagetext | varchar(850) | YES | | NULL | | | comm_medium | varchar(250) | YES | | NULL | | | explanobj | varchar(100) | YES | | NULL | | | expectedaction | varchar(850) | YES | MUL | NULL | | | page | varchar(100) | YES | | NULL | | | rcompcomplexity | varchar(10) | YES | | NULL | | | rtasknovelty | varchar(10) | YES | | NULL | | | rtaskroutine | varchar(10) | YES | | NULL | | | rtaskdiff | varchar(10) | YES | | NULL | | | rtasksigni | varchar(10) | YES | | NULL | | | archived | int(11) | YES | | 0 | | +-----------------+--------------+------+-----+---------+----------------+ injects_explanobj +-----------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-----------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | inject_id | int(11) | YES | MUL | NULL | | | name | varchar(50) | YES | | NULL | | | value | varchar(200) | YES | | NULL | | +-----------+--------------+------+-----+---------+----------------+ injects_research +-----------+-------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-----------+-------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | inject_id | int(11) | YES | MUL | NULL | | | name | varchar(50) | YES | | NULL | | | value | int(11) | YES | | NULL | | +-----------+-------------+------+-----+---------+----------------+ livesdata +-------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_datetime | datetime | YES | | NULL | | | in_script | varchar(20) | YES | MUL | NULL | |


 

| in_user | varchar(30) | YES | MUL | NULL | | | in_role | varchar(40) | YES | MUL | NULL | | | saved | int(11) | YES | | NULL | | | injured | int(11) | YES | | NULL | | | lost | int(11) | YES | | NULL | | | date | datetime | YES | | NULL | | | location | varchar(100) | YES | | NULL | | | archived | int(11) | YES | | 0 | | +-------------+--------------+------+-----+---------+----------------+ log +-------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_datetime | datetime | YES | | NULL | | | in_script | varchar(20) | YES | MUL | NULL | | | in_user | varchar(30) | YES | MUL | NULL | | | in_role | varchar(100) | YES | MUL | NULL | | | action | int(11) | YES | MUL | NULL | | | type | varchar(20) | YES | MUL | NULL | | | archived | int(11) | YES | | 0 | | +-------------+--------------+------+-----+---------+----------------+ masterexobj +-----------+---------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-----------+---------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | objective | text | YES | | NULL | | +-----------+---------+------+-----+---------+----------------+ masterhandbook +-----------------+-------------+------+-----+---------+-------+ | Field | Type | Null | Key | Default | Extra | +-----------------+-------------+------+-----+---------+-------+ | script | varchar(50) | NO | PRI | NULL | | | purpose | text | YES | | NULL | | | concept | text | YES | | NULL | | | assumptions | text | YES | | NULL | |


 

| artificialities | text | YES | | NULL | | | simulation | text | YES | | NULL | | | narrative | text | YES | | NULL | | | procedures | text | YES | | NULL | | | safety | text | YES | | NULL | | | communications | text | YES | | NULL | | | reporting | text | YES | | NULL | | | admin | text | YES | | NULL | | | archived | int(11) | YES | | 0 | | +-----------------+-------------+------+-----+---------+-------+ masterinitstatus +----------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +----------+--------------+------+-----+---------+----------------+ | script | varchar(100) | NO | | NULL | | | status | text | YES | | NULL | | | filename | varchar(50) | YES | | NULL | | | filetype | varchar(50) | YES | | NULL | | | filesize | int(11) | YES | | NULL | | | filedata | mediumblob | YES | | NULL | | | id | int(11) | NO | PRI | NULL | auto_increment | | archived | int(11) | YES | | 0 | | +----------+--------------+------+-----+---------+----------------+ masterinjectlist +----------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +----------------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_role | varchar(250) | YES | | NULL | | | sender | varchar(250) | YES | | NULL | | | messagetype | varchar(250) | YES | | NULL | | | messagetext | varchar(850) | YES | | NULL | | | comm_medium | varchar(250) | YES | | NULL | | | explanobj | varchar(100) | YES | | NULL | | | expectedaction | varchar(850) | YES | | NULL | | +----------------+--------------+------+-----+---------+----------------+


 

mastermetrics +----------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +----------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | area | varchar(200) | YES | | NULL | | | activity | varchar(200) | YES | | NULL | | | ref | varchar(20) | YES | | NULL | | | measure | varchar(100) | YES | | NULL | | | metric | text | YES | | NULL | | +----------+--------------+------+-----+---------+----------------+ mastertcl +----------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +----------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | area | varchar(200) | YES | | NULL | | | activity | varchar(200) | YES | | NULL | | | ref | varchar(20) | YES | | NULL | | | task | text | YES | | NULL | | +----------+--------------+------+-----+---------+----------------+ metrics +-------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_datetime | datetime | YES | | NULL | | | in_script | varchar(255) | YES | MUL | NULL | | | in_user | varchar(30) | YES | MUL | NULL | | | area | varchar(200) | YES | | NULL | | | activity | varchar(200) | YES | | NULL | | | ref | varchar(20) | YES | | NULL | | | measure | varchar(100) | YES | | NULL | | | metric | text | YES | | NULL | | | archived | int(11) | YES | | 0 | | | status | varchar(100) | YES | | NULL | | +-------------+--------------+------+-----+---------+----------------+


 

missiontasks +---------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +---------------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_datetime | datetime | YES | | NULL | | | in_script | varchar(20) | YES | MUL | NULL | | | in_user | varchar(30) | YES | MUL | NULL | | | in_role | varchar(40) | YES | MUL | NULL | | | assignee | varchar(250) | YES | | NULL | | | missiontask | text | YES | | NULL | | | priority | varchar(15) | YES | | NULL | | | status | varchar(250) | YES | | NULL | | | priorityorder | int(11) | YES | | NULL | | | archived | int(11) | YES | | 0 | | +---------------+--------------+------+-----+---------+----------------+ myprogramo +-----------+--------------+------+-----+-------------------+----------------+ | Field | Type | Null | Key | Default | Extra | +-----------+--------------+------+-----+-------------------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | uname | varchar(10) | NO | UNI | NULL | | | pword | varchar(255) | NO | | NULL | | | lastip | varchar(25) | NO | | NULL | | | lastlogin | timestamp | NO | | CURRENT_TIMESTAMP | | +-----------+--------------+------+-----+-------------------+----------------+ observerpositions +-------+-------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------+-------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | name | varchar(30) | YES | | NULL | | +-------+-------------+------+-----+---------+----------------+


 

pods +-------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_datetime | datetime | YES | | NULL | | | in_script | varchar(20) | YES | MUL | NULL | | | in_user | varchar(30) | YES | MUL | NULL | | | in_role | varchar(40) | YES | MUL | NULL | | | name | varchar(50) | YES | | NULL | | | status | varchar(10) | YES | | NULL | | | description | varchar(850) | YES | | NULL | | | archived | int(11) | YES | | 0 | | +-------------+--------------+------+-----+---------+----------------+ positionlog +-------------+-------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------------+-------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_datetime | datetime | YES | | NULL | | | in_script | varchar(20) | YES | MUL | NULL | | | in_user | varchar(30) | YES | MUL | NULL | | | in_role | varchar(40) | YES | MUL | NULL | | | notes | text | YES | | NULL | | | archived | int(11) | YES | | 0 | | +-------------+-------------+------+-----+---------+----------------+ resourcelog +------------------+---------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +------------------+---------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_datetime | datetime | YES | | NULL | | | in_script | varchar(20) | YES | MUL | NULL | | | in_user | varchar(30) | YES | MUL | NULL | | | in_role | varchar(40) | YES | MUL | NULL | | | name | varchar(50) | YES | | NULL | | | initial_quantity | int(11) | YES | | NULL | | | quantity | int(11) | YES | | NULL | |


 

| cost | decimal(10,2) | YES | | NULL | | | date | date | YES | | NULL | | | archived | int(11) | YES | | 0 | | +------------------+---------------+------+-----+---------+----------------+ resourcerequests +------------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +------------------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_datetime | datetime | YES | | NULL | | | in_script | varchar(20) | YES | MUL | NULL | | | in_user | varchar(30) | YES | MUL | NULL | | | in_role | varchar(40) | YES | MUL | NULL | | | resource | varchar(250) | YES | | NULL | | | type | varchar(250) | YES | | NULL | | | quantity | varchar(30) | YES | | NULL | | | priority | varchar(15) | YES | | NULL | | | need | varchar(850) | YES | | NULL | | | other | varchar(100) | YES | | NULL | | | requestingagency | varchar(250) | YES | | NULL | | | requestor | varchar(250) | YES | | NULL | | | phone | varchar(100) | YES | | NULL | | | address | varchar(850) | YES | | NULL | | | status | varchar(250) | YES | | NULL | | | priorityorder | int(11) | YES | | NULL | | | notes | varchar(250) | YES | | NULL | | | unit | varchar(20) | YES | | NULL | | | archived | int(11) | YES | | 0 | | +------------------+--------------+------+-----+---------+----------------+ responses +-------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_datetime | datetime | YES | | NULL | | | in_script | varchar(20) | YES | MUL | NULL | | | in_user | varchar(30) | YES | MUL | NULL | | | in_role | varchar(40) | YES | MUL | NULL | |


 

| injectnum | int(11) | YES | | NULL | | | discussant | varchar(250) | YES | | NULL | | | response | varchar(850) | YES | | NULL | | | archived | int(11) | YES | | 0 | | +-------------+--------------+------+-----+---------+----------------+ responsibilities +-------------+-------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------------+-------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_datetime | datetime | YES | | NULL | | | in_script | varchar(20) | YES | MUL | NULL | | | in_user | varchar(30) | YES | MUL | NULL | | | in_role | varchar(40) | YES | MUL | NULL | | | item | text | YES | | NULL | | | archived | int(11) | YES | | 0 | | +-------------+-------------+------+-----+---------+----------------+ roadclosures +-------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_datetime | datetime | YES | | NULL | | | in_script | varchar(20) | YES | MUL | NULL | | | in_user | varchar(30) | YES | MUL | NULL | | | in_role | varchar(40) | YES | MUL | NULL | | | roadname | varchar(250) | YES | | NULL | | | closedfrom | varchar(250) | YES | | NULL | | | archived | int(11) | YES | | 0 | | | closedto | varchar(250) | YES | | NULL | | | closed | datetime | YES | | NULL | | +-------------+--------------+------+-----+---------+----------------+ scripts +-------------+-------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------------+-------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment |


 

| in_datetime | datetime | YES | | NULL | | | name | varchar(30) | YES | | NULL | | | in_user | varchar(40) | YES | | NULL | | | active | tinyint(1) | YES | | NULL | | | archived | int(11) | YES | | 0 | | +-------------+-------------+------+-----+---------+----------------+ shelters +----------------+-------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +----------------+-------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_datetime | datetime | YES | | NULL | | | in_script | varchar(20) | YES | MUL | NULL | | | in_user | varchar(30) | YES | MUL | NULL | | | in_role | varchar(40) | YES | MUL | NULL | | | name | varchar(50) | YES | | NULL | | | status | varchar(10) | YES | | NULL | | | accommodations | int(11) | YES | | NULL | | | used | int(11) | YES | | NULL | | | archived | int(11) | YES | | 0 | | +----------------+-------------+------+-----+---------+----------------+ sigevents +-------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_datetime | datetime | YES | | NULL | | | in_script | varchar(20) | YES | MUL | NULL | | | in_user | varchar(30) | YES | MUL | NULL | | | in_role | varchar(40) | YES | MUL | NULL | | | notes | varchar(850) | YES | | NULL | | | name | varchar(40) | YES | | NULL | | | archived | int(11) | YES | | 0 | | +-------------+--------------+------+-----+---------+----------------+


 

smresourcelog +----------+---------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +----------+---------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | name | varchar(50) | YES | | NULL | | | quantity | int(11) | YES | | NULL | | | cost | decimal(10,2) | YES | | NULL | | +----------+---------------+------+-----+---------+----------------+ spellcheck +--------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +--------------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | missspelling | varchar(100) | NO | | NULL | | | correction | varchar(100) | NO | | NULL | | +--------------+--------------+------+-----+---------+----------------+ tcl +-------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_datetime | datetime | YES | | NULL | | | in_script | varchar(20) | YES | MUL | NULL | | | in_user | varchar(30) | YES | MUL | NULL | | | area | varchar(200) | YES | | NULL | | | activity | varchar(200) | YES | | NULL | | | ref | varchar(20) | YES | | NULL | | | task | text | YES | | NULL | | | status | varchar(20) | YES | | NULL | | | archived | int(11) | YES | | 0 | | +-------------+--------------+------+-----+---------+----------------+ undefined_defaults +-------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +-------------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment |


 

| bot | int(11) | NO | | NULL | | | pattern | varchar(255) | NO | | NULL | | | replacement | varchar(255) | NO | | NULL | | +-------------+--------------+------+-----+---------+----------------+ unknown_inputs +-----------+-----------+------+-----+-------------------+----------------+ | Field | Type | Null | Key | Default | Extra | +-----------+-----------+------+-----+-------------------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | input | text | NO | | NULL | | | userid | int(11) | NO | | NULL | | | timestamp | timestamp | NO | | CURRENT_TIMESTAMP | | +-----------+-----------+------+-----+-------------------+----------------+ users +----------------+--------------+------+-----+---------------------+----------------+ | Field | Type | Null | Key | Default | Extra | +----------------+--------------+------+-----+---------------------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | session_id | varchar(255) | NO | | NULL | | | chatlines | int(11) | NO | | NULL | | | ip | varchar(100) | NO | | NULL | | | referer | text | NO | | NULL | | | browser | text | NO | | NULL | | | date_logged_on | timestamp | NO | | CURRENT_TIMESTAMP | | | last_update | timestamp | NO | | 0000-00-00 00:00:00 | | +----------------+--------------+------+-----+---------------------+----------------+ wps +----------------+--------------+------+-----+---------+----------------+ | Field | Type | Null | Key | Default | Extra | +----------------+--------------+------+-----+---------+----------------+ | id | int(11) | NO | PRI | NULL | auto_increment | | in_datetime | datetime | YES | | NULL | | | in_script | varchar(20) | YES | MUL | NULL | | | in_user | varchar(30) | YES | MUL | NULL | | | in_role | varchar(40) | YES | MUL | NULL | | | sheltername | varchar(250) | YES | | NULL | | | status | varchar(250) | YES | | NULL | |


 

| wateravailabel | decimal(4,2) | YES | | NULL | | | archived | int(11) | YES | | 0 | | +----------------+--------------+------+-----+---------+----------------+
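Many of the tables above share the bookkeeping columns in_datetime, in_script, in_user, in_role, and archived, which suggests that rows are scoped to a particular exercise script and soft-deleted by setting archived. The query below is a small illustration of reading under that convention from the sigevents table; the connection details and the script name are placeholders, and the query itself is not taken from the SimEOC source.

<?php
// Illustrative read using the shared in_script / archived convention; the DSN,
// credentials, and script name are placeholders, not SimEOC configuration.
$pdo = new PDO(
    'mysql:host=localhost;dbname=simeoc', 'user', 'password',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
);

$stmt = $pdo->prepare(
    'SELECT in_datetime, in_user, in_role, name, notes
       FROM sigevents
      WHERE in_script = :script AND archived = 0
      ORDER BY in_datetime DESC'
);
$stmt->execute([':script' => 'Hurricane Drill 1']);

foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $event) {
    echo "[{$event['in_datetime']}] {$event['in_role']} ({$event['in_user']}): "
       . "{$event['name']}: {$event['notes']}\n";
}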


APPENDIX H

DIRECTORY LISTING


./exercisedev/smpanel.php  

./exercisedev/resources/resourceschoose.php  

./exercisedev/resources/processresourceadd.php  

./exercisedev/resources/resourcesview.php  

./exercisedev/resources/processresourcedelete.php  

./exercisedev/authentication.php  

./exercisedev/targetcapdev/processtargetcapadd.php  

./exercisedev/targetcapdev/processmetricsadd.php  

./exercisedev/targetcapdev/exobjdeletemaster.php  

./exercisedev/targetcapdev/targetcapview.php  

./exercisedev/targetcapdev/targetcapchoose.php  

./exercisedev/targetcapdev/processexobjdelfrommaster.php  

./exercisedev/targetcapdev/processtargetcapdel.php  

./exercisedev/targetcapdev/processexobjadd.php  

./exercisedev/targetcapdev/exobjadd.php  

./exercisedev/targetcapdev/processexobjdel.php  

./exercisedev/targetcapdev/targetcapupd.php  

./exercisedev/targetcapdev/processtargetcapupd.php  

./exercisedev/targetcapdev/exobjdelete.php  

./exercisedev/targetcapdev/processmetricsdel.php  

./exercisedev/targetcapdev/metricsadd.php  

./exercisedev/targetcapdev/targetcapadd.php  

./exercisedev/initstatusdev/initstatusdev.php  

./exercisedev/initstatusdev/initstatusdevchoose.php  

./exercisedev/initstatusdev/form.php  

./exercisedev/initstatusdev/processinitstatus.php  

./exercisedev/exerciselog/inject_template.php  

./exercisedev/exerciselog/logchoose.php  

./exercisedev/exerciselog/logdl.php  

./exercisedev/exerciselog/logdisplay.php  

./exercisedev/handbookdev/handbookdevchoose.php  

./exercisedev/handbookdev/processhandbook.php  

./exercisedev/handbookdev/handbookdeveloper.php  

./exercisedev/respondev/respondevchoose.php  

./exercisedev/respondev/processrespon.php  

./exercisedev/dbconfig.php  

./exercisedev/log.php  

./exercisedev/databasectl/processdatabase.php  

./exercisedev/databasectl/databaserestore.php  

./exercisedev/databasectl/databasectl.php  

./exercisedev/databasectl/databasecreate.php  

./exercisedev/databasectl/databasereset.php  

./exercisedev/databasectl/databasearchive.php  

./exercisedev/disastermap/disastermapchoose.php  

./exercisedev/disastermap/disastermap_sav.php  

./exercisedev/disastermap/disastermap.php  


./exercisedev/evaluationmetrics/evaluationmetrics.php  

./exercisedev/evaluationmetrics/openevaluation.php  

./exercisedev/exercisecontroller/exercisecontrolpanel.php  

./exercisedev/exercisecontroller/communicationtools/chatbegin.php  

./exercisedev/exercisecontroller/communicationtools/dojochat.php  

./exercisedev/exercisecontroller/communicationtools/chatreceive.php  

./exercisedev/exercisecontroller/processexerciselogin.php  

./exercisedev/exercisecontroller/exerciselogin.php  

./exercisedev/exercisecontroller/dojoconnection.php  

./exercisedev/exercisecontroller/scriptloggedinas.php  

./exercisedev/scriptdev/injects/injects.php  

./exercisedev/scriptdev/injects/newForm.php  

./exercisedev/scriptdev/injects/form.php  

./exercisedev/scriptdev/injects/controller.php  

./exercisedev/scriptdev/injects/createFromDB.php  

./exercisedev/scriptdev/injects/newChoice.php  

./exercisedev/scriptdev/injects/editForm.php  

./exercisedev/scriptdev/injects/injectsActions.php  

./exercisedev/scriptdev/injects/masterinjectlist.php  

./exercisedev/scriptdev/closescript.php  

./exercisedev/scriptdev/processscriptadd.php  

./exercisedev/scriptdev/processscriptdelete.php  

./exercisedev/scriptdev/scripting.php  

./exercisedev/scriptdev/uploader/controllercsv.php  

./exercisedev/scriptdev/uploader/uploadercsv.php  

./exercisedev/scriptdev/uploader/controller.php  

./exercisedev/scriptdev/uploader/uploader.php  

./exercisedev/scriptdev/uploader/inject.class.php  

./exercisedev/playerreports/playerreport.php  

./exercisedev/playerreports/openplayerreport.php  

./exercisedev/exercisecontrol/framedcontroller4.php  

./exercise/authentication.php  

./exercise/exercisepanel.php  

./exercise/receivedinjects/receivedinjects.php  

./exercise/dbconfig.php  

./exercise/log.php  

./exercise/dojoconnection.php  

./exercise/programo/bot/default.php  

./exercise/programo/bot/getsetvars.php  

./exercise/programo/bot/response_handler.php  

./exercise/programo/bot/config.php  

./exercise/programo/bot/tag_functions.php  

./exercise/programo/bot/debugging.php  

./exercise/programo/bot/chat.php  

./exercise/programo/bot/check_aiml_part.php  

./exercise/programo/index.php  


./exercise/livesdash/livesdash.php  

./exercise/chat/chatbegin.php  

./exercise/chat/dojochat.php  

./exercise/chat/chatreceive.php  

./exercise/cost/cost_to_county.php  

./logoutAdmin.php  

./evaluator/authentication.php  

./evaluator/targetcap/targetcapview.php  

./evaluator/observercontroller/communicationtools/chatbegin.php  

./evaluator/observercontroller/communicationtools/dojochat.php  

./evaluator/observercontroller/communicationtools/chatreceive.php  

./evaluator/observercontroller/processexerciselogin.php  

./evaluator/observercontroller/observercontrolpanel.php  

./evaluator/observercontroller/exerciselogin.php  

./evaluator/observercontroller/dojoconnection.php  

./evaluator/observercontroller/scriptloggedinas.php  

./evaluator/dbconfig.php  

./evaluator/log.php  

./evaluator/disastermap/disastermap.php  

./evaluator/observerpanel.php  

./evaluator/responsibilities/respondev.php  

./evaluator/responsibilities/respondevchoose.php  

./evaluator/eeg/eeg.php  

./evaluator/eeg/statusupdatemetric.php  

./evaluator/eeg/statusupdate.php  

./evaluator/eeg/processtargetcapupdmetric.php  

./evaluator/eeg/eeganalysis.php  

./evaluator/eeg/processtargetcapupd.php  

./evaluator/eeg/processeeganalysis.php  

./main/missiontasks/processmissiontask.php  

./main/missiontasks/missiontaskadd.php  

./main/missiontasks/missiontaskview.php  

./main/missiontasks/processmissiontaskedit.php  

./main/missiontasks/processmissiontaskedititem.php  

./main/missiontasks/processmissiontaskdelete.php  

./main/resourcerequests/processrequestupdate.php  

./main/resourcerequests/resourcerequestedit.php  

./main/resourcerequests/resourcerequests.php  

./main/resourcerequests/processrequestdelete.php  

./main/authentication.php  

./main/livesdata/processlivesdata.php  

./main/livesdata/livesdata.php  

./main/livesdata/livesdataedit.php  

./main/livesdata/processlivesdata_delete.php  

./main/template.php  

./main/dbconfig.php  


./main/log.php  

./main/handbook/handbookdisplay.php  

./main/logistics/logistics_edit.php  

./main/logistics/processrequestedit.php  

./main/logistics/acquireitem.php  

./main/logistics/logistics.php  

./main/disastermap/disastermapchoose.php  

./main/disastermap/disastermap_sav.php  

./main/disastermap/disastermap.php  

./main/pods/pods.php  

./main/pods/processpods.php  

./main/pods/processpods_update.php  

./main/pods/processpods_add.php  

./main/pods/processpods_delete.php  

./main/pods/pods_edit.php  

./main/mainpanel.php  

./main/positionlog/position_log_edit.php  

./main/positionlog/position_log.php  

./main/positionlog/processpositionlogedit.php  

./main/positionlog/processpositionlog.php  

./main/positionlog/processpositionlogdelete.php  

./main/statusboards/statusboards.php  

./main/roadclosures/roadclosures.php  

./main/roadclosures/processroadclosures_add.php  

./main/roadclosures/roadclosures_edit.php  

./main/roadclosures/processroadclosures_delete.php  

./main/roadclosures/processroadclosures_update.php  

./main/significantevents/significantevents.php  

./main/significantevents/significantevents_edit.php  

./main/significantevents/processsignificanteventsedit.php  

./main/significantevents/processsignificantevents.php  

./main/significantevents/processsignificanteventsdelete.php  

./main/initialstatus/processlivesdata.php  

./main/initialstatus/initstatuspic.php  

./main/initialstatus/initialstatus.php  

./main/initialstatus/processinitstatus.php  

./main/resourcelog/processresourceadd.php  

./main/resourcelog/processresourceedit.php  

./main/resourcelog/resources_edit.php  

./main/resourcelog/resources_add.php  

./main/resourcelog/processresourcedelete.php  

./main/resourcelog/resources_view.php  

./main/resourcelog/resourcerequest.php  

./main/responsibilities/responsibilities.php  

./main/shelters/processshelters_delete.php  

./main/shelters/shelters.php  


./main/shelters/processshelters_add.php  

./main/shelters/processshelters_update.php  

./main/shelters/shelters_edit.php  

./main/hospitals/hospitals_edit.php  

./main/hospitals/processhospitals_add.php  

./main/hospitals/processhospitals_update.php  

./main/hospitals/hospitals.php  

./main/hospitals/processhospitals_delete.php  

./badlogin.php  

./RegularLoginObserver.php  

./respondev.php  

./authentication.php  

./logoutObserver.php  

./access-denied.php

./RegularLogin.php  

./template.php  

./dbconfig.php  

./log.php  

./observerindex.php  

./chosenobserver.php  

./RegularLoginAdmin.php  

./logout.php  

./pickplayer.php  

./logjs.php  

./chosen.php  

./research/authentication.php  

./research/exerciselog/inject_template.php  

./research/exerciselog/logchoose.php  

./research/exerciselog/logdl.php  

./research/exerciselog/logdisplay.php  

./research/dbconfig.php  

./research/rcpanel.php  

./research/log.php  

./research/viewscripts/viewscript.php  

./research/viewscripts/viewscriptchoose.php  

./research/disastermap/disastermapchoose.php  

./research/disastermap/disastermap_sav.php  

./research/disastermap/disastermap.php  

./research/evaluationmetrics/evaluationmetrics.php  

./research/evaluationmetrics/openevaluation.php  

./research/positionlog/position_log_edit.php  

./research/positionlog/position_log.php  

./research/positionlog/processpositionlogedit.php  

./research/positionlog/openpositionlog.php  

./research/positionlog/processpositionlog.php  

./research/positionlog/processpositionlogdelete.php  


./research/playerreports/playerreport.php  

./research/playerreports/openplayerreport.php  

./logoutRC.php  

./pickobserver.php  

./logoutSM.php  

./testing.php  

./tutorial.php  

./common/authentication.php  

./common/exerciselog/inject_template.php  

./common/exerciselog/logchoose.php  

./common/exerciselog/logdl.php  

./common/exerciselog/logdisplay.php  

./common/dbconfig.php  

./common/log.php  

./common/disastermap/disastermapchoose.php  

./common/disastermap/disastermap_sav.php  

./common/disastermap/disastermap.php  

./common/evaluationmetrics/evaluationmetrics.php  

./common/evaluationmetrics/openevaluation.php  

./common/positionlog/position_log_edit.php  

./common/positionlog/position_log.php  

./common/positionlog/processpositionlogedit.php  

./common/positionlog/processpositionlog.php  

./common/positionlog/processpositionlogdelete.php  

./common/playerreports/playerreport.php  

./common/playerreports/openplayerreport.php  

./index.php  

./RegularLoginSM.php  

./admin/injects.php  

./admin/objectivesActions.php  

./admin/authentication.php  

./admin/adminpanel.php  

./admin/objectives.php  

./admin/userActions.php  

./admin/injectsActions.php  

./admin/customroles/controller.php  

./admin/customroles/customrolesowner.php  

./admin/customroles/deletedepartment.php  

./admin/customroles/customrolesdepartment.php  

./admin/customroles/deleteowner.php  

./admin/index.php  

./admin/users.php  

./RegularLoginRC.php  


APPENDIX I

MASTER TEST PLAN

vEOC MASTER TEST PLAN
Version 1
May 2010

1. Functionality Testing
Functionality testing covers all of the links in the web pages, the database connections, the forms used to submit or retrieve information from the user, and cookie handling.

1.1. Testing all the links:
• Test the outgoing links from all pages of the specific domain under test.
• Test all internal links.
• Test links that jump within the same page.
• Test links used to send email to the admin or other users from web pages.
• Test to check whether there are any orphan pages.
• Lastly, check for broken links in all of the above-mentioned links.

1.2. Testing the forms on the web pages:
Forms are an essential and integral part of any web site. They are used to get information from users and to interact with them. The following should be checked on the forms (a test sketch follows this list):
• Check all the validations on each field.
• Check the default values of fields.
• Check the behavior on wrong inputs to the fields in the forms.
• Check the options to create, delete, view, or modify forms, if any.
• Check that no empty forms are created.
• There are different field validations, such as email IDs, user financial information, dates, etc. All of the above validations should be checked manually or in an automated way.
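One way to automate the wrong-input checks above is to post deliberately invalid data to a form and confirm that the application rejects it rather than accepting it silently. The sketch below is illustrative only: the local URL, the login form target, and the field names are assumptions, not the actual vEOC form definitions.

<?php
// Sketch: POST deliberately invalid input to a form and check that it is rejected.
// The URL and field names below are assumptions made for illustration only.
$url = 'http://localhost/veoc/RegularLogin.php';

$badInputs = array(
    array('username' => '', 'password' => ''),                    // empty submission
    array('username' => 'not an email', 'password' => 'x'),       // malformed value
    array('username' => str_repeat('A', 500), 'password' => 'x'), // oversized value
);

foreach ($badInputs as $i => $fields) {
    $ch = curl_init($url);
    curl_setopt_array($ch, array(
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query($fields),
        CURLOPT_RETURNTRANSFER => true,
    ));
    $body = curl_exec($ch) ?: '';
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    // Pass if the application rejects the input (error status or an error message
    // on the redisplayed form); fail if it is accepted silently.
    $rejected = ($code >= 400) || (stripos($body, 'error') !== false);
    printf("case %d: HTTP %d -> %s\n", $i, $code, $rejected ? 'rejected (pass)' : 'accepted (FAIL)');
}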

1.3. Cookie Testing:
Cookies are small files stored on the user's machine that are used primarily to maintain sessions, such as login sessions. A test sketch follows this list.
• Test the application with cookies enabled and with cookies disabled in the browser options.
• Test whether the cookies are encrypted before being written to the user's machine.
• For session cookies (i.e., cookies that expire after the session ends), check login sessions and user state after the session ends.
• Check the effect on application security of deleting the cookies.

1.4. Validation (HTML/CSS/PHP):
HTML/CSS validation is also important for optimizing the website for search engines.
• The site has a full and correct Doctype.
• The site declares a character set.
• The site uses valid XHTML.
• The site uses valid CSS.
• The site has no unnecessary ids or classes.
• The site uses well-structured code.
• The site has no broken links.
• The site has no JavaScript errors.

1.5. Validation Checklist Tables:
Each of the following checklists is scored pass/fail against the items listed.

HTML Validation
• Any exceptions to the W3C HTML 4.0 standard have been approved and documented.
• The HTML code is W3C HTML 4.0 compliant (barring any approved exceptions).
• The web page renders correctly when viewed with the Opera 5.0 browser.
• Comments and change-control logs are not included in the HTML sent to the client.

Image Validation
• The image adds value to the website.
• If the image is animated, it links to the appropriate page.
• The image is stored in the most appropriate format (e.g., .GIF files for buttons and .JPG files for photos).
• If a GIF file, the image size is a multiple of 8 pixels.
• The visual size of the image is appropriate for the size of the viewable screen (it does not occupy too much or too little of the screen real estate).
• The physical file size of the image is as small as possible without compromising image quality, i.e., the file was saved using the optimum compression ratio.
• An appropriate ALT tag is included with the image.
• The WIDTH and HEIGHT attributes (expressed as a page percentage, not absolute pixel sizes) have been specified for the image.
• The image is not copyrighted or trademarked by someone else.
• The total size of the images on the page does not exceed 50 KB.
• There is no more than one animated image on the page.
• Photographic images aside (e.g., JPG images), no more than 256 colors are used on the web page.
• Any image maps used are client side (as opposed to server side).

Font Validation
• The font is proportional.
• The primary font is Verdana, with Arial and sans-serif specified as alternates.
• The browser's base font size is not altered.
• Only relative font sizes are used (e.g., small, medium, and large) rather than specific point sizes.
• No more than 3 font sizes are used on the web page.
• Symbol fonts are used only when absolutely necessary.
• If symbol fonts are used, they are properly mapped to the Unicode private use area.
• Browser default colors are not overridden.

Printer-Friendly Validation
• The text on the web page is formatted correctly when printed on a 72 dpi printer using letter and A4 paper sizes.
• The content of the web page is clearly readable when printed on a black-and-white printer.
• The content of the web page is clearly readable when printed on a color printer.
• The background of the web page is white.
• Only dark colors are used for the text on the web page.

Style Sheet Validation
• The style sheet is W3C CSS level 1 compliant.
• The style sheet is correctly interpreted by the 4.x generation of web browsers.
• The style sheet complies with the printer-friendly standards.
• The style sheet complies with the font standards.
• The style sheet is defined as an external CSS file.
• Web pages do not modify the style sheet dynamically.
• Web pages that use the style sheet render acceptably in browsers that do not support CSS or that have CSS turned off by the client.

Table Validation
• There are no unwanted spaces or carriage returns in the table.
• No cell is overpopulated with too much verbiage.
• Every cell in the table is populated (i.e., no null values), as some browsers collapse empty cells. Extra scrutiny should be applied if the information is imported from a database dynamically.
• The WIDTH and HEIGHT attributes were specified for all cells using a screen percentage instead of absolute pixels wherever possible.

Style Guide and Template Adherence
• The web page follows the documented style guidelines (except where deviations are documented and approved).
• The web page was based on the most appropriate web page template.

Plug-in Validation
• The website (after requesting the client's permission) lists the plug-ins and versions required to view all of the content on the site.
• The web site is able to detect whether or not the required plug-ins are installed on the client side.
• If the web site is unable to accurately determine whether a plug-in is installed, the website contains an area that tells the user how to proceed.

Test Station Validations
• Different versions of the same brand of browser are installed in different instances of an operating system.
• Only general-release software is used. No OEM, service pack, or beta versions are used, with the exception of any required Y2K patches necessary for the software to work post-Y2K.
• All of the installations use the installation defaults for directory names, cache sizes, fonts, plug-ins, etc.

Packaged Application Validation
• Product documentation specifies the exact order in which the components should be installed and the configuration settings that are required or recommended.
• Product documentation explains how to uninstall the product cleanly.
• Product documentation adequately describes when and how the data files or database should be reorganized.
• Automatic updates install and operate correctly on all of the supported platforms.
• Automatic updates install and operate correctly when other applications have been added or removed before and after the update is performed.

Links and URL Validation
• The link is not broken and goes to the most appropriate location.
• If the link is an internal link, it uses all lower-case characters.
• If the link is an internal link, it uses relative addressing (i.e., it does not use an absolute address).
• If the link is an internal link, it does not launch a new browser window unless it is a help page.
• If the link is an external link, it does launch a new browser window.
• The link adds value to the website. Links with little value add to the maintenance load (especially external links) and can potentially make a web page less usable.
• The browser's GO/HISTORY list is updated correctly after using the link. Some developers manipulate the browser's history and thereby degrade the website's usability.
• When using the BACK button, previously entered data is not lost.
• The link text does not wrap to two lines; this may confuse visitors into thinking there are two links instead of one.

Redirect Validation
• The default 400, 401, 402, 403, and 404 error pages have been developed and properly configured on the production web server(s).
• If the link is redirected, it goes to the correct final destination and is not redirected further.
• If a link points to a directory (instead of a specific web page), the link ends with a slash.

Bookmark/Favorite Validation
• Every web page has a bookmark title that accurately reflects the contents of the web page.
• No bookmark is longer than 32 characters, since browsers typically truncate the display of verbose descriptions.
• Each bookmark starts with "VEOC-".

Using the Browsers That Most of the Clients Have
• Pages using framesets are displayed correctly.
• Frames are not resizable.
• Pages within the framesets can be bookmarked.
• The back button recalls the URL of the last frame viewed.
• The initial frameset downloads in an acceptable period of time.
• Pages using framesets can be printed correctly, or an alternate page is available for printing.
• Nested framesets (if used) have sufficient screen real estate assigned to each frame.
• All external links launch new browser windows (i.e., third-party web sites are not embedded inside the vEOC frameset).
• Search engines can find all of the content within the framesets.

Website Organization Testing Checklist
• "Core" web pages can be located within 4 clicks.
• All of the web pages in the website can be found by casually browsing the website (i.e., without resorting to a site map or a search engine).
• Information on the site can be found using the search strategies that a visitor might consider.
• The web site does not contain any orphaned files (i.e., files that cannot be reached by following any path from the home page).

Web Site Map Validation
• All "core" web pages can be found using the site map.
• Only "core" web pages are located on the site map.
• Web pages are listed in an appropriate hierarchy.
• Links are all functional and go to the correct pages.

Search Engine Testing - Validate Accuracy and Performance Under Normal and Stress Loads
• The first set of results is returned within 5 seconds (excluding internet transmission times).
• The results are sorted appropriately (e.g., alphabetically or by percent likelihood).
• The search engine functions correctly when a user enters common words that are likely to generate a huge number of hits, such as "a", "the", or "VEOC".
• The search engine functions correctly when the user enters non-existent words that are unlikely to generate any valid answers, such as "hggfkh", "hjjgj", or null requests.
• The search engine ignores the source code used to build a web page and only indexes the content of the web page (e.g., requesting information on "JavaScript" will only return documents that reference JavaScript, not all of the web pages that use JavaScript in their source code).
• The search engine does not index sensitive words such as "secret" or "fraud".
• The search engine functions correctly when you enter a search string with the maximum number of characters plus one.
• The search engine functions correctly when you enter multiple-word requests with or without the Boolean operators "and", "or", "not", "+", or "-".
• The search engine functions correctly when you enter one or more wildcards.
• If fuzzy logic is enabled, the search engine offers alternate suggestions for zero-hit requests based on searches using a spell-checked version of the initial search string.

Link-Checking Tools Testing Checklist
• External links can be checked but (optionally) are not scanned any further.
• When encountering a recursive loop, the tool does not go into a death spiral.
• The tool does not ignore duplicate links.
• The tool is able to handle dynamic links.
• The tool is able to handle framesets.
• The tool is able to handle cookies (session/persistent).
• The tool can handle pages that require user input (e.g., forms).
• The tool facilitates identifying suspiciously large or small pages.
• The tool facilitates identifying absolute links.
• The tool facilitates identifying pages that are too deep.

Validating Forms on a Web Site
• All data entry fields have the HTML SIZE attribute set correctly (SIZE specifies the width of the field).
• All data entry fields have the HTML MAXLENGTH attribute set correctly (the maximum number of characters a user can enter).
• If radio controls are used, a default is always selected.
• All required fields use a visual cue to indicate to the user that the field is mandatory.
• If a form uses a drop-down data entry field (control), the options are sorted appropriately and the field is wide enough to display all of the options.
• Data is not lost when the user clicks the browser's back button (and subsequently the forward button) through a series of forms.
• Data is not lost when the user clicks the browser's forward button (and subsequently the back button) midway through a series of forms.
• Data is not lost when the user clicks the GO/HISTORY buttons to revisit previous forms.
• Data is not lost when the user clicks a bookmark or favorite midway through a series of forms.
• Data is not lost when the user clicks the browser reload button midway through a series of forms.
• Data is not lost when the user resizes the browser window.
• Duplicate data is not added to the database when a user presses any combination of the forward, back, go/history, bookmark/favorite, reload, and resize buttons midway through a series of forms.
• The browser places the cursor on the most appropriate field/control when the form is first viewed.
• Using the browser's tab key allows the client to tab through the input fields on the form in a top-to-bottom, left-to-right order.
• If the form data is sent back to the web server using the HTTP GET command, the data is not truncated.

1.5.1. Client vs. Server Side Validation:

Validating Data on a Form (a server-side sketch follows this checklist)
• All data entry fields are checked for invalid data. An appropriate error message is displayed if the data is found to be invalid.
• All validations are performed (and error messages displayed) in a top-down, left-to-right fashion.
• All required fields are checked on the client side.
• Whenever possible, all field co-dependencies are checked on the client side.
• All basic data checks are performed on the client side.
• All client-side checks are rechecked on the server side.
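The last item above is the one most easily missed. The sketch below shows what server-side re-checking might look like in a PHP form handler: it repeats the required-field, numeric, and email checks that client-side JavaScript would normally perform. The field names and rules are assumptions for illustration, not the actual vEOC form definitions.

<?php
// Sketch: server-side revalidation of fields already checked on the client side.
// Field names and rules are assumptions made for illustration only.
function validate_request(array $post)
{
    $errors = array();

    // Required-field check (mirrors the client-side visual cue for mandatory fields).
    foreach (array('title', 'quantity', 'email') as $field) {
        if (!isset($post[$field]) || trim($post[$field]) === '') {
            $errors[$field] = 'This field is required.';
        }
    }

    // Basic data checks, repeated on the server even if JavaScript accepted them.
    if (isset($post['quantity']) && !ctype_digit((string) $post['quantity'])) {
        $errors['quantity'] = 'Quantity must be a whole number.';
    }
    if (isset($post['email']) && filter_var($post['email'], FILTER_VALIDATE_EMAIL) === false) {
        $errors['email'] = 'Enter a valid e-mail address.';
    }

    return $errors;   // an empty array means the submission is acceptable
}

// Example: a submission that a careless client-side check might let through.
print_r(validate_request(array('title' => 'Cots', 'quantity' => '12x', 'email' => 'not-an-email')));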

   

Validating DHTML Pages
• DHTML is appropriate for most of the users' browsers.
• All of the DHTML code conforms to the W3C DHTML standard.
• The pages are displayed and viewed correctly in different browsers.

Validating Pop-ups
• The website is able to detect a browser that has disabled (or does not support) JavaScript/Java/ActiveX and provides the user with an appropriate message.
• The pop-up follows the web GUI standard.
• The pop-up is not too large for the parent window, and its initial screen positioning is appropriate.

Streaming Content Checklist
• The streaming server and the network are able to handle the expected demand for this service.
• Clients are able to suspend/restart this service without needing to unsubscribe and re-subscribe.
• Clients are able to adjust the frequency of updates to cater to different client-side bandwidths.

Common Gateway Interface (CGI) Script Validation
• The CGI script is able to parse input parameters containing quotation marks, carriage returns, ampersand symbols, dollar signs, question marks, and other control characters.
• The CGI script is robust enough to handle missing and out-of-range parameters.
• The CGI script is robust enough to handle null values being returned from the database.
• The CGI script is robust enough to handle a "no record found" code being returned by the database.
• The CGI script is robust enough to handle a "duplicate record inserted" code being returned by the database.
• The CGI script is robust enough to handle multiple records being returned by the database.
• The CGI script is robust enough to handle a database timeout code being returned by the database.
• The web server has sufficient resources to handle the expected number of CGI scripts that are likely to be initiated.
A handler sketch illustrating these robustness items follows.
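The robustness items above translate directly into defensive coding in whatever server-side technology implements the script (in vEOC's case, PHP). The sketch below shows one possible shape of such a handler; the parameter name, table name, and connection settings are invented for illustration.

<?php
// Sketch: defensive handling of request parameters and database error conditions.
// Parameter, table, and connection details are assumptions for illustration only.
$id = filter_input(INPUT_GET, 'id', FILTER_VALIDATE_INT, array('options' => array('min_range' => 1)));
if ($id === false || $id === null) {
    http_response_code(400);
    exit('Missing or out-of-range "id" parameter.');
}

try {
    $db = new PDO('mysql:host=localhost;dbname=veoc', 'dbuser', 'dbpass',
                  array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION));
    $stmt = $db->prepare('SELECT * FROM resource_requests WHERE id = ?');
    $stmt->execute(array($id));
    $row = $stmt->fetch(PDO::FETCH_ASSOC);

    if ($row === false) {
        http_response_code(404);      // "no record found" is handled explicitly
        exit('No record found.');
    }
    echo json_encode($row);
} catch (PDOException $e) {
    http_response_code(500);          // timeouts, duplicate keys, and other DB errors
    error_log($e->getMessage());      // log the details; never echo them to the client
    exit('Database error.');
}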

     

Data Integrity Validation
• A new record is inserted into the database.
• A new record can be accurately read from the database.
• The record is accurately updated in the database.
• A record is completely deleted from the database.

1.5.2. Server Side Validation:

Server-Side Includes (SSI) Validation
• All SSI and XSSI selection criteria are accurately documented, and each include file contains a "start of file" and "end of file" comment.
• No JSSI files are used.
• The appropriate content is displayed and formatted correctly for each of the possible selection criteria.
• No "include" file references another "include" file. While technically possible, this programming style can be difficult to debug and can also impact performance.

Dynamic Server Page Validation
• The dynamically generated page is not a candidate for being replaced by one or more static pages.
• Developers used a single language for all scripts within all dynamically generated web pages.
• No "template" file references another "template" file. While technically possible, this programming style can be difficult to debug and can also impact performance.
• All dynamic server page templates have been inspected by at least one senior developer who was not the author of the template.
• All high-frequency pages have been generated and manually tested.
• All high-risk pages have been generated and manually tested.

Cookie Validation
When cookies are:
• Disabled before accessing the site, one of two things happens: either the site works correctly, or the site issues a warning message telling the visitor that cookies must be turned on to access the site.
• Disabled midway through a transaction, the site is able to detect the situation and handle it gracefully.
• Deleted midway through a transaction, the site is able to detect the situation and handle it gracefully.
When a cookie is edited and some parameters are:
• Added, the site detects the situation and handles it gracefully.
• Deleted, the site detects the situation and handles it gracefully.
• Swapped, the site detects the situation and handles it gracefully.
• Set to null, the site detects the situation and handles it gracefully.
• Edited and set to invalid values, the site detects the situation and handles it gracefully.
Other validation tests include the following:
• When the client's PC memory or disk cache is cleared midway through the transaction, the site detects the situation and handles it gracefully. Session cookies are stored in memory and typically do not get saved to the hard disk; persistent cookies may need to be deleted manually.
• When control characters or special operating system commands are added to a cookie, the site detects the situation and handles it gracefully.
• When multiple entries for a website are added to the browser's cookies.txt file, the site detects the situation and handles it gracefully.
• When the user identification field in the cookie is changed midway through a transaction, the site detects the situation and handles it gracefully. Consider replacing the regular user-id account with values such as admin, test, superuser, or guest.

Maintaining a Session
• The web application is capable of maintaining a single session through multiple browsers running on the same client.
• The web application is capable of simultaneously accessing the same account through multiple clients.
• Adequate database locking capabilities have been documented in the specification and have been properly implemented.
• The web application time/date stamps transactions using the clock on the web server, not the clock on the client.
• The web application is able to handle a user disabling cookies (session and/or persistent) midway through a transaction.
• The web application is able to handle a user clearing the cache (disk or memory) midway through a session.
• The web application is able to handle a user disabling JavaScript and/or VBScript midway through a transaction.
• The web application is able to handle a user disabling Java applets and/or ActiveX controls midway through a session.
• The web application is able to handle a user deleting the query portion of the website's URL midway through a session.
• The load balancer (if used) is able to maintain a session.
A scripted check of the multiple-client item follows.
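The second item in this checklist (the same account reached simultaneously from multiple clients) lends itself to a simple scripted check: log in twice with independent cookie jars and confirm that both sessions are established and neither invalidates the other. The login URL, field names, and credentials below are placeholders, not the real vEOC values.

<?php
// Sketch: log in as the same account from two simulated clients and compare results.
// URL, field names, and credentials are placeholders for illustration only.
function login($cookieJar)
{
    $ch = curl_init('http://localhost/veoc/RegularLogin.php');
    curl_setopt_array($ch, array(
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query(array('username' => 'tester', 'password' => 'secret')),
        CURLOPT_COOKIEJAR      => $cookieJar,   // each simulated client keeps its own session cookie
        CURLOPT_COOKIEFILE     => $cookieJar,
        CURLOPT_RETURNTRANSFER => true,
    ));
    curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return $status;
}

$clientA = login('/tmp/client_a_cookies.txt');
$clientB = login('/tmp/client_b_cookies.txt');

// Both logins should succeed, and logging in as client B should not drop client A.
printf("client A: HTTP %d, client B: HTTP %d\n", $clientA, $clientB);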

     

Managing Concurrent Users
• Server memory is freed when a user completes a session or transaction.
• Network connections are closed when a user completes a session or transaction.
• Disk space is freed when a user completes a session or transaction.

Site-Level Usability Validation
• There are no framesets in the website. Framesets can be difficult to navigate, take too long to download, and cause printing problems.
• Content is structured in terms of simple hierarchies.
• The user's mental model is consistent across the entire website: web page controls, behavior, and aesthetics remain consistent.
• The amount of time (based on the number of pages) needed to complete a multi-page task is apparent to the user.

Page-Level Usability Validation
• Graphics and other bandwidth-intensive elements are kept to a minimum.
• Key functions such as the search and help buttons are easy to find.
• There are no competing options that might confuse the user or cause him or her to make an error.
• The content is current, and previous content is available via an archive.
• Related information on the same page has been grouped, thereby minimizing eye movement.
• Critical information has not been placed on the lower portion of the web page. If the position of this information requires the user to scroll down, most visitors are unlikely to ever read it.
• Content makes up 50% to 80% of the screen real estate.
• Vertical scrolling has been kept to a minimum, especially on navigational pages.
• When viewed via the anticipated clients' hardware/software, the page fits without the need for a horizontal scrollbar.
• When printed by the anticipated clients, the page prints without being truncated.
• The name and logo of the emergency operations center are visible on the page.
• Browser features (e.g., HTML, JavaScript, etc.) that have been available for less than 1 year have not been used, since a significant number of users run browsers that are more than 1 year old.
• No pop-ups that open new browser windows are launched.
• All links and graphics have a TITLE or ALT tag defined. Decorative images (e.g., white space or formatting borders) should have a blank tag defined.
• URLs are all lower case.
• There are no areas of large, bright colors.
• No more than 4 colors (ignoring graphics) have been used on the page.
• The page background color is not dark.
• All controls have been outlined in black for clarity, unless they are exceptionally small (i.e., less than 16x16 pixels).
• The browser's default link colors have not been overridden or altered.
• Page object sizes have been specified as a percentage of the available screen, rather than as fixed pixel sizes.
• Text has not been placed inside graphic files. This approach takes longer to download, can be more work to translate for multilingual websites, and may have quality issues with low-resolution displays such as WebTV.
• If CSS is used, the web page still presents acceptably when style sheets are turned off or not available.
• Three (3) alternative fonts (in the same order) have been specified for all text.
• Font sizes have been specified as relative sizes (e.g., heading 1) rather than as absolute sizes.

Readability Validation
• A random selection of passages all scored 16 when measured using the Fry readability algorithm.
• A random selection of passages all scored less than 25 when measured using the Fry readability algorithm.

Language Validation
• No presentation problems occur when the page is displayed in English.
• No local slang is used anywhere on the page.
• No offensive terms (when translated) are used.
• Character sets for foreign languages are displayed correctly.
• Foreign currencies are displayed correctly and converted if necessary.
• Date and time formats are displayed correctly for the target countries (e.g., 20.01.00 for Europe versus 01/20/00 for the U.S.).
• Address formats are displayed correctly.
• Translated words have been placed in the correct order on each web page; unlike American and European languages that read from left to right, some languages read from top to bottom and others read from right to left.
• Each web page can be viewed using a browser without any special modifications (e.g., the user does not have to install any non-standard fonts).
• Alphabetic lists are sorted correctly for each language.
• Supporting documentation has been translated into English (e.g., help systems, error messages, order manuals, audio, and video clips).
• The colors and symbols used on the website have a consistent meaning across all of the required languages (e.g., red implies danger in North America and happiness in China).
• Databases are set up to allow non-standard alphabets (e.g., double-byte characters).

Color Validation
• The colors used on the website are friendly to color-blind users.
• The colors used on the website are accurately displayed when using the minimum expected number of colors on a client.
• All colors used on the website are browser-safe.

Screen Size and Pixel Resolution Validation
• The website has been designed to fit the requirements of the lowest likely screen size and pixel resolution used by a client. If the clients' capabilities vary significantly, then multiple versions of the website have been developed to accommodate each client.
• The appearance of each web page has been tested using different browsers/versions to ensure that the page is displayed as intended (e.g., no horizontal scroll bars).

Accessibility Validation
• ALT tags are included with all images, and TITLE tags are included with all links.
• Color is not used as the sole means of conveying information.
• Web pages that make use of style sheets are still usable in browsers that do not support, or have turned off, this functionality.
• Techniques that cause screen flicker are not used.
• If image maps are used, an alternate list of the corresponding links is provided.
• If applets or scripts are used, the web page is still usable if the functionality is turned off or not supported by the browser.
• If video files are used, subtitles are also available.
• If audio files are used, transcripts are also provided. In addition to viewers with hearing difficulties, some viewers may not have speakers installed.
• The web page is understandable when heard through an audio-only browser.
• The web page is understandable when viewed through a text-only browser.
• Multiple key combinations can be entered sequentially or are mapped to a single key.

Privacy Validation
• The website has a legally valid privacy statement posted.
• The website is approved by at least one external auditor.
• The third-party seal of approval is accurately displayed alongside the privacy statement.

Acceptable Response Times
• Less than 0.1 seconds: the limit for having the user feel that the system is reacting instantaneously; no feedback regarding the time delay is necessary other than displaying the results. Example actions include button clicks or client-side drop-down menus.
• Less than 1.0 second: the limit for the user's flow of thought to remain uninterrupted, even though the user will notice a slight delay. Normally no feedback is necessary for delays between 0.1 and 1.0 seconds, although the user may lose the feeling of operating directly on the data. Example actions include the initiation of page navigation or Java applet execution.
• Less than 10 seconds: the maximum amount of time that can elapse while keeping the user's attention focused on the dialogue. Example actions include the completion of page navigation.

Average Response Time Under Normal Conditions
• 95% of the web pages download in less than 10 seconds when using a 28 kbps modem from any location within the continental US.
• Orders are completed within 2 minutes of the user's request.
• Confirmations of the requested actions are made within 30 seconds.

Determining Stress Points
• Determine the maximum number of requests/actions per second the website can handle.
• Determine the maximum number of session initiations per hour that the website can be expected to support.
• Determine the maximum number of concurrent users that the website can be expected to support.

System Approaches Maximum Capacity
At 80% capacity:
• Until the system returns to normal operating conditions, new clients who try to log on will be given a message to try again later.
• Inactive clients will be given a warning message that they may be dropped from the system and not permitted to log on again until conditions return to normal.
• Non-critical services will be shut down in order from least to most important.
• Pager or email notification of potential gridlock is sent to technical support personnel.
At 90% capacity:
• Inactive clients will be logged off.
• Backup websites will be activated.
• Pager or email notifications of potential gridlock are resent to technical support personnel.
At 100+% capacity:
• The system does not allow any new requests to be initiated.
• The system does not reboot itself.
• The system does not shut down security services.
• The system does not suspend any transaction logging.
• The system does not gridlock.
• Hardware components do not fuse or melt down.
• Pager or email notifications of impending gridlock are sent to technical support.
At any capacity:
• The system maintains its functional integrity.

Performance During Spikes
• No users who were logged in to the website prior to the spike are dropped.
• Transactions/requests/actions that were started before the spike are still in progress and successfully completed after the spike.
• New users are able to log in to the website during and after the spike.
• Security services remain active during the spike.

1.6. Database Testing:
• Check for data integrity and errors while editing, deleting, or modifying forms, or while performing any database-related functionality.
• Check that all database queries execute correctly and that data is retrieved and updated correctly. A sketch of such a check follows.
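A minimal, self-contained sketch of the insert/read/update/delete integrity cycle is shown below. It uses an in-memory SQLite database and an invented shelters table so that it runs anywhere; in practice the same checks would be pointed at the vEOC MySQL schema.

<?php
// Sketch: data-integrity check of the full CRUD cycle against a throwaway database.
// The table layout is invented for illustration only.
$db = new PDO('sqlite::memory:', null, null, array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION));
$db->exec('CREATE TABLE shelters (id INTEGER PRIMARY KEY, name TEXT, capacity INTEGER)');

// 1. A new record is inserted into the database.
$db->prepare('INSERT INTO shelters (name, capacity) VALUES (?, ?)')->execute(array('North High School', 250));
$id = $db->lastInsertId();

// 2. The new record can be accurately read back.
$row = $db->query("SELECT * FROM shelters WHERE id = $id")->fetch(PDO::FETCH_ASSOC);
assert($row['name'] === 'North High School' && (int) $row['capacity'] === 250);

// 3. The record is accurately updated.
$db->prepare('UPDATE shelters SET capacity = ? WHERE id = ?')->execute(array(300, $id));
assert((int) $db->query("SELECT capacity FROM shelters WHERE id = $id")->fetchColumn() === 300);

// 4. The record is completely deleted.
$db->prepare('DELETE FROM shelters WHERE id = ?')->execute(array($id));
assert((int) $db->query("SELECT COUNT(*) FROM shelters WHERE id = $id")->fetchColumn() === 0);

echo "CRUD integrity checks passed\n";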

     

2. Usability Testing

2.1. Objective:
The goals of usability testing include establishing a baseline of user performance, establishing and validating user performance measures, and identifying potential design concerns to be addressed in order to improve efficiency, productivity, and end-user satisfaction. The usability test objectives are:
• To determine design inconsistencies and usability problem areas within the user interface and content areas. Potential sources of error may include:
  o Navigation errors – failure to locate functions, excessive keystrokes to complete a function, failure to follow the recommended screen flow.
  o Presentation errors – failure to locate and properly act upon desired information in screens, selection errors due to labeling ambiguities.
  o Control usage problems – improper toolbar or entry field usage.
• To exercise the application or web site under controlled test conditions with representative users. The data will be used to assess whether usability goals regarding an effective, efficient, and well-received user interface have been achieved.
• To establish baseline user performance and user-satisfaction levels of the user interface for future usability evaluations.

2.2. Basic Usability:
• The site has a clear hierarchy.
• Headings clearly indicate the structure of the document.
• Navigation is easy to understand.
• Navigation is consistent throughout the site.
• The site uses underlined links.
• The site uses consistent and appropriate language.
• The site has an easy-to-find sitemap and contact page.
• The site has a search tool.
• The site has a link to the home page on every page.
• The site has clearly defined visited links.

2.3. Methodology:
• Describe briefly the number of participants.
• Describe the setting of the usability test sessions.
• Describe the tools used to facilitate the participant's interaction with the application (e.g., the browser).
• Describe the measures to be collected, such as demographic information, satisfaction assessment, and suggestions for improvement.

2.3.1. Participants:
• Thoroughly describe the number of participants expected, how they will be recruited, their eligibility characteristics, and their expected skills/knowledge.
• The participants' responsibilities will be to attempt to complete a set of representative task scenarios presented to them in as efficient and timely a manner as possible, and to provide feedback regarding the usability and acceptability of the user interface. The participants will be directed to provide honest opinions regarding the usability of the application, and to participate in post-session subjective questionnaires and debriefing.
• Describe how the team will select test participants to meet the stated requirements. Explain whether participants will have certain skills and/or background requirements, whether they will be familiar with the evaluation tasks, or whether they will have experience performing certain tasks.

2.3.2. Training:
• The participants will receive an overview of the usability test procedure, equipment, and software.
• The participants will be told which parts of the test environment or testing situation may be nonfunctional.

2.3.3. Procedure: [Lab Testing]

Participants will take part in the usability test at [Florida International University] in the [Emergency Operations Center]. A [type of computer] with the Web site/Web application and supporting software will be used in a typical office environment. The facilitator, seated in the same office, will monitor the participant's interaction with the Web site/Web application. Note takers and data logger(s) will monitor the sessions in an observation room connected by a video camera feed. The test sessions will be videotaped.

The facilitator will brief the participants on the Web site/Web application and instruct the participants that they are evaluating the application, rather than the facilitator evaluating the participants. Participants will sign an informed consent form that acknowledges that participation is voluntary, that participation can cease at any time, and that the session will be videotaped but the privacy of their identities will be safeguarded. The facilitator will ask the participants whether they have any questions.

Participants will complete a pretest demographic and background information questionnaire. The facilitator will explain that the amount of time taken to complete the test task will be measured and that exploratory behavior outside the task flow should not occur until after task completion. At the start of each task, the participant will read aloud the task description from the printed copy and begin the task. Time-on-task measurement begins when the participant starts the task.

The facilitator will instruct the participant to "think aloud" so that a verbal record exists of their interaction with the Web site/Web application. The facilitator will observe and enter user behavior, user comments, and system actions in the data logging application [describe how these metrics will be recorded if a data logging application is not used].

After each task, the participant will complete the post-task questionnaire and elaborate on the task session with the facilitator. After all task scenarios are attempted, the participant will complete the post-test satisfaction questionnaire.

[For Remote Testing] Participants will take part in the usability test via remote screen-sharing technology. The participant will be seated at their workstation in their work environment. Verbal communication will be supported via telephone.

The facilitator will brief the participant and instruct that he or she is evaluating the Web site/Web application, rather than the facilitator evaluating the participant. Participants will complete a pretest demographic and background information questionnaire. Sessions will begin when the facilitator has answered all participant questions. The facilitator will inform the participant that time-on-task will be measured and that exploratory behavior outside the task flow should not occur until after task completion.

The facilitator will instruct the participant to read aloud the task description from the printed copy and begin the task. Time-on-task measurement will begin. The facilitator will encourage the participant to "think aloud" so that a verbal record will exist of the task-system interaction. The facilitator will observe and enter user behavior, comments, and system interaction in a data logging application.

After each task, the participant will complete the post-task questionnaire and elaborate on the task session. After all tasks have been attempted, the participant will complete a post-test satisfaction questionnaire.

2.3.4. Roles:
The roles involved in a usability test are as follows. An individual may play multiple roles, and tests may not require all roles.

2.3.4.1. Trainer:
• Provides a training overview prior to usability testing

2.3.4.2. Facilitator:
• Provides an overview of the study to participants
• Defines usability and the purpose of usability testing to participants
• Assists in the conduct of participant and observer debriefing sessions
• Responds to participants' requests for assistance

2.3.4.3. Data Logger:
• Records participants' actions and comments

2.3.4.4. Test Observers:
• Serve as silent observers
• Assist the data logger in identifying problems, concerns, coding bugs, and procedural errors
• Serve as note takers

2.3.4.5. Test Participants:
• Attempt to complete the representative task scenarios presented to them
• Provide feedback regarding the usability and acceptability of the user interface

2.4. Usability Metrics
Usability metrics refer to user performance measured against specific performance goals necessary to satisfy usability requirements. Scenario completion success rates, adherence to dialog scripts, error rates, and subjective evaluations will be used. Time to completion of scenarios will also be collected.

2.4.1. Scenario Completion
Each scenario will require, or request, that the participant obtain or input specific data that would be used in the course of a typical task. The scenario is completed when the participant indicates that the scenario's goal has been obtained (whether successfully or unsuccessfully) or when the participant requests and receives sufficient guidance to warrant scoring the scenario as a critical error.

2.4.2. Critical Errors
Critical errors are deviations at completion from the targets of the scenario. Obtaining or otherwise reporting the wrong data value due to the participant's workflow is a critical error. Participants may or may not be aware that the task goal is incorrect or incomplete. Independent completion of the scenario is a universal goal; help obtained from the other usability test roles is cause to score the scenario as a critical error. Critical errors can also be assigned when the participant initiates (or attempts to initiate) an action that will result in the goal state becoming unobtainable. In general, critical errors are unresolved errors during the process of completing the task, or errors that produce an incorrect outcome.

2.4.3. Non-critical Errors
Non-critical errors are errors that are recovered from by the participant or that, if not detected, do not result in processing problems or unexpected results. Although non-critical errors can go undetected by the participant, when they are detected they are generally frustrating to the participant.

These errors may be procedural, in which the participant does not complete a scenario in the most optimal way (e.g., excessive steps and keystrokes). These errors may also be errors of confusion (e.g., initially selecting the wrong function, or using a user-interface control incorrectly, such as attempting to edit an un-editable field).

274

 

Noncritical  errors  can  always  be  recovered  from  during  the  process  of  completing  the  scenario.    Exploratory  behavior,  such  as  opening  the  wrong  menu  while  searching  for  a  function,  [will,  will  not  (edit  Procedure)]  is  coded  as  a  non-­‐critical  error.    2.4.4. Subjective  Evaluations  Subjective  evaluations  regarding  ease  of  use  and  satisfaction  will  be  collected  via  questionnaires,  and  during  debriefing  at  the  conclusion  of  the  session.  The  questionnaires  will  utilize  free-­‐form  responses  and  rating  scales.  

 2.4.5. Scenario  Completion  Time  (time  on  task)  The  time  to  complete  each  scenario,  not  including  subjective  evaluation  durations,  will  be  recorded.    2.5. Usability  Goals  The  usability  goals  are  as  follows:    2.6. Completion  Rate  Completion  rate  is  the  percentage  of  test  participants  who  successfully  complete  the  task  without  critical  errors.    A  critical  error  is  defined  as  an  error  that  results  in  an  incorrect  or  incomplete  outcome.    In  other  words,  the  completion  rate  represents  the  percentage  of  participants  who,  when  they  are  finished  with  the  specified  task,  have  an  "output"  that  is  correct.    Note:  If  a  participant  requires  assistance  in  order  to  achieve  a  correct  output  then  the  task  will  be  scored  as  a  critical  error  and  the  overall  completion  rate  for  the  task  will  be  affected.  

A completion rate of [100%/enter completion rate] is the goal for each task in this usability test.

2.7. Error-free Rate

Error-free rate is the percentage of test participants who complete the task without any errors (critical or non-critical). A non-critical error is an error that does not affect the final output of the task but results in the task being completed less efficiently.

An error-free rate of [80%/enter error-free rate] is the goal for each task in this usability test.

2.8. Time on Task (TOT)

The time to complete a scenario is referred to as "time on task." It is measured from the time the participant begins the scenario to the time he or she signals completion.
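The three quantitative measures above can be computed directly from per-participant scoring sheets. The sketch below is a minimal illustration of that arithmetic, assuming a hypothetical record format (participant, critical-error flag, non-critical-error count, seconds on task); it is not part of the test procedure itself.

# Minimal sketch: completion rate, error-free rate, and mean time on task
# from hypothetical per-participant scoring records.
from statistics import mean

# Each record: (participant_id, had_critical_error, noncritical_error_count, seconds_on_task)
records = [
    ("P1", False, 0, 412.0),
    ("P2", False, 2, 530.5),
    ("P3", True,  1, 655.0),
]

n = len(records)
completed = sum(1 for _, critical, _, _ in records if not critical)
error_free = sum(1 for _, critical, noncrit, _ in records if not critical and noncrit == 0)

completion_rate = 100.0 * completed / n       # % of participants with no critical error
error_free_rate = 100.0 * error_free / n      # % of participants with no errors at all
mean_tot = mean(t for _, _, _, t in records)  # average time on task, in seconds

print(f"Completion rate: {completion_rate:.0f}%")
print(f"Error-free rate: {error_free_rate:.0f}%")
print(f"Mean time on task: {mean_tot:.1f} s")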

 


 

2.9. Subjective Measures

Subjective opinions about specific tasks, the time to perform each task, features, and functionality will be surveyed. At the end of the test, participants will rate their satisfaction with the overall system. Combined with the interview/debriefing session, these data are used to assess participants' attitudes.

2.9.1. Problem Severity

To prioritize recommendations, a method of problem severity classification will be used in the analysis of the data collected during evaluation activities. The approach treats problem severity as a combination of two factors: the impact of the problem and the frequency with which users experience the problem during the evaluation.

2.9.2. Impact

Impact is a ranking of the consequences of a problem, defined by the level of impact the problem has on successful task completion. There are three levels of impact:

• High - prevents the user from completing the task (critical error)
• Moderate - causes the user difficulty, but the task can be completed (non-critical error)
• Low - minor problems that do not significantly affect task completion (non-critical error)

2.9.3. Frequency

Frequency is the percentage of participants who experience the problem when working on a task.

• High: 30% or more of the participants experience the problem
• Moderate: 11%-29% of the participants experience the problem
• Low: 10% or fewer of the participants experience the problem

2.9.4. Problem Severity Classification

The identified severity for each problem implies a general reward for resolving it, and a general risk for not addressing it, in the current release.

Severity 1 - High-impact problems that often prevent a user from correctly completing a task. They occur with varying frequency and are characteristic of calls to the help desk. The reward for resolution is typically exhibited in fewer help desk calls and reduced redevelopment costs.

Severity 2 - Moderate- to high-frequency problems with moderate to low impact; these are typical of erroneous actions that the participant recognizes need to be undone. The reward for resolution is typically exhibited in reduced time on task and decreased training costs.

Severity 3 - Either moderate problems with low frequency or low-impact problems with moderate frequency; these are minor annoyances faced by a number of participants. The reward for resolution is typically exhibited in reduced time on task and increased data integrity.

Severity 4 - Low-impact problems faced by few participants; there is low risk in not resolving these problems. The reward for resolution is typically exhibited in increased user satisfaction.
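One way to make this classification repeatable is to encode the impact and frequency ratings as a small lookup. The sketch below is only an illustration; the specific impact-by-frequency matrix is an assumption chosen to be consistent with the severity descriptions above, not a definition taken from the test plan.

# Minimal sketch: combining impact and frequency into a severity level (1-4).
def frequency_level(pct_affected):
    """Map the percentage of affected participants to a frequency rating."""
    if pct_affected >= 30:
        return "high"
    if pct_affected >= 11:
        return "moderate"
    return "low"

# (impact, frequency) -> severity; illustrative mapping consistent with the text above.
SEVERITY = {
    ("high", "high"): 1, ("high", "moderate"): 1, ("high", "low"): 1,
    ("moderate", "high"): 2, ("moderate", "moderate"): 2, ("moderate", "low"): 3,
    ("low", "high"): 2, ("low", "moderate"): 3, ("low", "low"): 4,
}

def severity(impact, pct_affected):
    return SEVERITY[(impact, frequency_level(pct_affected))]

print(severity("high", 12))      # 1: high-impact problems are always Severity 1
print(severity("moderate", 45))  # 2: frequent problem with moderate impact
print(severity("low", 8))        # 4: rare, low-impact annoyance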


 

3. Compatibility Testing

3.1. Browser Compatibility

Some applications are heavily dependent on the browser. Different browsers have different configurations and settings with which the web pages must be compatible, so the site's code should work across browser platforms. If the site uses JavaScript or AJAX calls for UI functionality, security checks, or validation, browser compatibility testing deserves extra emphasis. The web application should be tested on different browsers, such as Internet Explorer, Firefox, Netscape Navigator, AOL, Safari, and Opera, across different versions.
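Part of this matrix can be automated by driving the same smoke check through several browsers. The sketch below assumes Selenium WebDriver with locally installed browser drivers; the URL is a placeholder, and the check (simply confirming the login page renders a title) is illustrative rather than the project's actual test.

# Minimal sketch: load the same page in multiple browsers and report whether it rendered.
from selenium import webdriver

LOGIN_URL = "http://example.local/veoc/RegularLogin2.php"  # placeholder URL

# Browser factories to exercise; each needs its driver installed locally.
browsers = {
    "firefox": webdriver.Firefox,
    "chrome": webdriver.Chrome,
}

for name, factory in browsers.items():
    driver = factory()
    try:
        driver.get(LOGIN_URL)
        ok = bool(driver.title)  # minimal check: the page produced a <title>
        print(f"{name}: loaded={ok}, title={driver.title!r}")
    finally:
        driver.quit()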

3.2. OS Compatibility

Some functionality in the web application may not be compatible with every operating system. New technologies used in web development, such as graphics libraries and API calls, may not be available on all operating systems, so the web application should be tested on different operating systems (e.g., Windows, Unix, Mac OS, Linux, and Solaris) and their variants.

3.3. Mobile Browsing

Mobile browsing is an increasingly important access path, and testing web pages on mobile browsers can reveal compatibility issues of their own. Currently, the system is not designed for mobile browsing, although this is an area that can be added in the future.

3.4. Printing Options

For website page printing, make sure fonts, page alignment, and page graphics print properly, and that pages fit the paper size (or the size specified in the print options).

4. Performance Testing

The web application should sustain heavy load. Web performance testing includes web load testing and web stress testing.

4.1. Web Load Testing

Test application performance at different Internet connection speeds and with many users accessing or requesting the same page. Can the system sustain peak load? The site should handle many simultaneous user requests, large inputs from users, simultaneous connections to the database, and heavy load on specific pages.


 

4.2. Stress Testing

Stress testing means stretching the system beyond its specified limits. Web stress testing attempts to break the site by applying stress and then observes how the system reacts and how it recovers from crashes. Stress is typically applied to input fields and to the login and sign-up areas.

Web performance testing also exercises site functionality on different operating systems and hardware platforms and checks for software and hardware memory-leak errors.
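As a starting point for the load portion, a simple script can issue many concurrent requests against a single page and report response times; dedicated tools would normally replace this. The sketch below uses only the Python standard library, and the target URL and request counts are placeholders.

# Minimal sketch: concurrent requests against one page, reporting errors and timings.
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "http://example.local/veoc/RegularLogin2.php"  # placeholder target page
REQUESTS = 50       # total requests to issue
CONCURRENCY = 10    # simultaneous workers

def fetch(_):
    """Fetch the page once and return (status_code, elapsed_seconds)."""
    start = time.perf_counter()
    with urlopen(URL, timeout=30) as resp:
        resp.read()
        return resp.status, time.perf_counter() - start

with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    results = list(pool.map(fetch, range(REQUESTS)))

times = [t for _, t in results]
errors = sum(1 for status, _ in results if status != 200)
print(f"requests={len(results)} errors={errors} "
      f"avg={sum(times)/len(times):.3f}s max={max(times):.3f}s")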

5. Security Testing

• Paste an internal URL directly into the browser address bar without logging in; internal pages should not open.
• While logged in with a username and password and browsing internal pages, try changing URL parameters directly. For example, if you are viewing statistics for publisher site ID 123, change the site ID in the URL to one that does not belong to the logged-in user; access to other users' statistics should be denied.
• Try invalid inputs in input fields such as the login username, password, and other text boxes, and check how the system reacts to each invalid input.
• Web directories and files should not be directly accessible unless a download option is provided.
• Test the CAPTCHA against automated script logins.
• If SSL is used for security, a proper message should be displayed when the user switches between non-secure http:// pages and secure https:// pages, and vice versa.
• All transactions, error messages, and security breach attempts should be logged to log files on the web server.
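The first of these checks (internal pages must not be reachable without a session) is straightforward to automate. The sketch below assumes the third-party requests library and a placeholder internal URL; it only verifies that an unauthenticated request is redirected or rejected rather than served.

# Minimal sketch: an internal page requested without a session should be denied.
import requests

INTERNAL_URL = "http://example.local/veoc/TraineeConsole.php"  # placeholder internal page

# Request the page with no session cookie and without following redirects.
resp = requests.get(INTERNAL_URL, allow_redirects=False, timeout=30)

# Acceptable outcomes: a redirect to the login page or an explicit denial.
denied = resp.status_code in (301, 302, 303, 401, 403)
print(f"status={resp.status_code} access_denied={denied}")
assert denied, "Internal page was served without authentication"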


APPENDIX J

FUNCTIONAL TESTING CHECKLIST


 

 

Functional Testing Checklist

1. Trainee
   1.1. Login
      1.1.1. Main browser login message
      1.1.2. Invalid Login
   1.2. User Tutorial
      1.2.1. View user tutorial
   1.3. Exercise Login
      1.3.1. Position Login
      1.3.2. Select Role to Be
      1.3.3. Select Script to Use for Exercise
   1.4. Main Panel
      1.4.1. Collapsible Menus
      1.4.2. Exercise Background
         1.4.2.1. View EOC Layout
            1.4.2.1.1. View the interactive floor plan
         1.4.2.2. View Player Handbook
         1.4.2.3. Initial Status
            1.4.2.3.1. Check Starting Status
            1.4.2.3.2. Check multiple starting statuses
      1.4.3. Boards
         1.4.3.1. Position Log
            1.4.3.1.1. Post a Position Log
            1.4.3.1.2. Edit a Position Log
            1.4.3.1.3. Sort the Position Log
            1.4.3.1.4. Delete a Position Log
         1.4.3.2. Significant Events
            1.4.3.2.1. Post a Significant Event
            1.4.3.2.2. Edit a Significant Event
            1.4.3.2.3. Delete a Significant Event
         1.4.3.3. Status Board
            1.4.3.3.1. Infrastructure
            1.4.3.3.2. Human Services
            1.4.3.3.3. Public Safety
         1.4.3.4. Mission/Tasks
            1.4.3.4.1. Create a Mission/Task
            1.4.3.4.2. Edit/Update Mission/Task
            1.4.3.4.3. Sort Mission/Tasks
            1.4.3.4.4. Delete Mission/Task
         1.4.3.5. Resource Requests
            1.4.3.5.1. Submit a Resource Request
            1.4.3.5.2. Edit/Update a Resource Request
            1.4.3.5.3. Delete a Resource Request
            1.4.3.5.4. Check the status of a Resource Request
      1.4.4. Links
         1.4.4.1. View Links
      1.4.5. References
         1.4.5.1. Incident Action Plan
            1.4.5.1.1. View Incident Action Plan
         1.4.5.2. Disaster Map
            1.4.5.2.1. View the Disaster Map
            1.4.5.2.2. Edit/Update Disaster Map
            1.4.5.2.3. Clear Disaster Map
            1.4.5.2.4. Save Disaster Map
         1.4.5.3. Position Responsibilities
            1.4.5.3.1. View position responsibilities
      1.4.6. Reports
         1.4.6.1. Road Closures
            1.4.6.1.1. Check Status of Road Closures
            1.4.6.1.2. Create Status of Road Closures
            1.4.6.1.3. Edit/Update status of Road Closures
            1.4.6.1.4. Delete status of Road Closures
            1.4.6.1.5. Dynamic status updates
         1.4.6.2. Shelters
            1.4.6.2.1. Check Status of Shelters
            1.4.6.2.2. Create Status of Shelters
            1.4.6.2.3. Edit/Update status of Shelters
            1.4.6.2.4. Delete status of Shelters
            1.4.6.2.5. Dynamic status updates
         1.4.6.3. Points of Distribution (PODs)
            1.4.6.3.1. Check Status of PODs
            1.4.6.3.2. Create Status of PODs
            1.4.6.3.3. Edit/Update status of PODs
            1.4.6.3.4. Delete status of PODs
            1.4.6.3.5. Dynamic status updates
         1.4.6.4. Hospitals
            1.4.6.4.1. Check Status of Hospitals
            1.4.6.4.2. Create Status of Hospitals
            1.4.6.4.3. Edit/Update status of Hospitals
            1.4.6.4.4. Delete status of Hospitals
            1.4.6.4.5. Dynamic status updates
         1.4.6.5. Lives Data
            1.4.6.5.1. View Lives Data
            1.4.6.5.2. Add Lives Data
            1.4.6.5.3. Edit/Update Lives Data
            1.4.6.5.4. Delete Lives Data
      1.4.7. Logistics
         1.4.7.1. Miami-Dade Resources
            1.4.7.1.1. View Miami-Dade Resources
            1.4.7.1.2. Add a Miami-Dade Resource
            1.4.7.1.3. Edit/Update Miami-Dade Resources
            1.4.7.1.4. Delete Miami-Dade Resources
         1.4.7.2. Resources
            1.4.7.2.1. Acquire a Contract Resource
               1.4.7.2.1.1. In-house
               1.4.7.2.1.2. Out-house
               1.4.7.2.1.3. Log to EM Constellation
               1.4.7.2.1.4. Dynamic status updates
               1.4.7.2.1.5. View Resource Requests
            1.4.7.2.2. Approve a Resource Request
               1.4.7.2.2.1. Update a Resource Request
               1.4.7.2.2.2. Change the status of a Resource Request
               1.4.7.2.2.3. Add logistical notes to a Resource Request

 

 

   1.5. Exercise Panel
      1.5.1. Collapsible Menus
      1.5.2. Disaster Assistant
         1.5.2.1. Ask a Question to the Disaster Assistant
      1.5.3. Dashboards
         1.5.3.1. Cost to County
            1.5.3.1.1. View dashboard data
         1.5.3.2. Casualties
            1.5.3.2.1. View dashboard data
         1.5.3.3. Received Injects
            1.5.3.3.1. View received injects
      1.5.4. Communication Tools
         1.5.4.1. Chat
            1.5.4.1.1. Face-to-Face
               1.5.4.1.1.1. Initiate Chat
               1.5.4.1.1.2. End Chat
               1.5.4.1.1.3. Receive Chat
               1.5.4.1.1.4. Accept Chat
               1.5.4.1.1.5. Reject Chat
            1.5.4.1.2. Telephone
               1.5.4.1.2.1. Initiate Chat
               1.5.4.1.2.2. End Chat
               1.5.4.1.2.3. Receive Chat
               1.5.4.1.2.4. Accept Chat
               1.5.4.1.2.5. Reject Chat
            1.5.4.1.3. Radio
               1.5.4.1.3.1. Initiate Chat
               1.5.4.1.3.2. End Chat
               1.5.4.1.3.3. Receive Chat
               1.5.4.1.3.4. Accept Chat
               1.5.4.1.3.5. Reject Chat
            1.5.4.1.4. Cellphone
               1.5.4.1.4.1. Initiate Chat
               1.5.4.1.4.2. End Chat
               1.5.4.1.4.3. Receive Chat
               1.5.4.1.4.4. Accept Chat
               1.5.4.1.4.5. Reject Chat
            1.5.4.1.5. PDA
               1.5.4.1.5.1. Initiate Chat
               1.5.4.1.5.2. End Chat
               1.5.4.1.5.3. Receive Chat
               1.5.4.1.5.4. Accept Chat
               1.5.4.1.5.5. Reject Chat
      1.5.5. Scripting
         1.5.5.1. The exercise begins (Play)
         1.5.5.2. The exercise is paused (Pause)
         1.5.5.3. The exercise is terminated (Stop)
         1.5.5.4. The exercise is in Normal Time (Normal Time)
         1.5.5.5. The exercise is in Fast Time (Fast Time/Next Block)
      1.5.6. Injects
         1.5.6.1. Acknowledge an inject
         1.5.6.2. Clarify an inject
      1.5.7. Global Updates
         1.5.7.1. Receive a global update
      1.5.8. Exercise Log (not seen in player console)
         1.5.8.1. Log chats
         1.5.8.2. Log significant events
         1.5.8.3. Log user actions during the exercise
            1.5.8.3.1. Log user responses to injects
         1.5.8.4. Log position logs
      1.5.9. Logout
         1.5.9.1. Regular logout
         1.5.9.2. Automatic logout if time expires
         1.5.9.3. Automatic logout if the user closes windows without logging out
      1.5.10. Other
         1.5.10.1. Customized Roles
            1.5.10.1.1. Add a customized role
            1.5.10.1.2. Delete a customized role
         1.5.10.2. Invalid Entries

2. Exercise Developer
   2.1. Login
      2.1.1. Main browser login message
      2.1.2. Invalid Login
   2.2. Collapsible Menus
   2.3. Exercise Development Tools
   2.4. Handbook Developer
      2.4.1. Update the handbook developer
   2.5. Player Responsibilities Developer
      2.5.1. Add player responsibilities
      2.5.2. Edit player responsibilities
      2.5.3. Delete player responsibilities
   2.6. Initial Status Developer
      2.6.1. Create starting status
         2.6.1.1. Update starting status
            2.6.1.1.1. Text
               2.6.1.1.1.1. Insert text
               2.6.1.1.1.2. Update text
            2.6.1.1.2. Figures
               2.6.1.1.2.1. Insert figure
               2.6.1.1.2.2. Change figure
               2.6.1.1.2.3. Delete figure
         2.6.1.2. Create multiple starting status reports
   2.7. Target Capabilities Developer
      2.7.1. Target Capabilities
         2.7.1.1. Add target capabilities to script
            2.7.1.1.1. Add new target capability
            2.7.1.1.2. Add target capability from database
         2.7.1.2. Edit target capabilities
         2.7.1.3. Delete target capabilities from script
      2.7.2. Target Capability Metrics
         2.7.2.1. Add target capability metrics to script
            2.7.2.1.1. Add new target capability metric
            2.7.2.1.2. Add target capability metric from database
         2.7.2.2. Edit target capability metrics
         2.7.2.3. Delete target capability metrics from script
      2.7.3. Exercise Objectives
         2.7.3.1. Create exercise objectives
         2.7.3.2. Edit exercise objectives
         2.7.3.3. Delete exercise objectives
   2.8. Script Developer
      2.8.1. Create a Script
      2.8.2. Edit Script
         2.8.2.1. Injects
            2.8.2.1.1. Add inject from database
            2.8.2.1.2. Add new inject
            2.8.2.1.3. Delete an inject from the script
            2.8.2.1.4. Edit an inject
            2.8.2.1.5. Move injects around ad hoc
      2.8.3. Delete Script
      2.8.4. Import/Upload Script
      2.8.5. Export Script
   2.9. Database Control Tools
      2.9.1. Reset the tables for the script
      2.9.2. Reset the tables for the player
      2.9.3. Archive the script (and all of its associated tables for later viewing, if desired)
   2.10. Exercise Control and Tracking
   2.11. Exercise Controller
      2.11.1. Control the Exercise
         2.11.1.1. Start Exercise
         2.11.1.2. Pause Exercise
         2.11.1.3. Terminate Exercise
         2.11.1.4. Next Block
         2.11.1.5. Fast Time
         2.11.1.6. Move injects around ad hoc
      2.11.2. Exercise Log
         2.11.2.1. View the exercise log
         2.11.2.2. Filter the exercise log
         2.11.2.3. Download as a CSV
   2.12. Exercise Reports and Evaluation
      2.12.1. Player Reports
         2.12.1.1. View Player Reports
         2.12.1.2. Filter Player Reports
      2.12.2. Evaluation Metrics
         2.12.2.1. View Evaluation Metrics
   2.13. External Links
      2.13.1. View external links
   2.14. References
      2.14.1. Disaster Map
         2.14.1.1. View the Disaster Map
         2.14.1.2. Edit/Update Disaster Map
         2.14.1.3. Clear Disaster Map
         2.14.1.4. Save Disaster Map
      2.14.2. Miami-Dade Resources
         2.14.2.1. Add a Miami-Dade Resource
         2.14.2.2. View Miami-Dade Resources
         2.14.2.3. Edit/Update Miami-Dade Resources
         2.14.2.4. Delete Miami-Dade Resources
   2.15. Logout
      2.15.1. Regular logout
      2.15.2. Automatic logout if time expires
      2.15.3. Automatic logout if the user closes windows without logging out
   2.16. Other
      2.16.1. Invalid entries

3. Researcher
   3.1. Login
      3.1.1. Main browser login message
      3.1.2. Invalid Login
   3.2. Collapsible Menus
   3.3. Researcher Tools
   3.4. Evaluation Metrics
      3.4.1. Exercise Reports
         3.4.1.1. View evaluation metrics
      3.4.2. Exercise Log
         3.4.2.1. View the exercise log
         3.4.2.2. Filter the exercise log
   3.5. Player Reports
      3.5.1. View player report
   3.6. Position Logs
      3.6.1. View position log
   3.7. References
      3.7.1. Disaster Map
         3.7.1.1. View the Disaster Map
         3.7.1.2. Edit/Update Disaster Map
         3.7.1.3. Clear Disaster Map
         3.7.1.4. Save Disaster Map
   3.8. Exercise Development Tools
   3.9. Script Developer
      3.9.1. Create a Script
      3.9.2. Edit Script
         3.9.2.1. Injects
            3.9.2.1.1. Add inject from database
            3.9.2.1.2. Add new inject
            3.9.2.1.3. Delete an inject from the script
            3.9.2.1.4. Edit an inject
            3.9.2.1.5. Move injects around ad hoc
      3.9.3. Delete Script
      3.9.4. Import/Upload Script
      3.9.5. Export Script
   3.10. Logout
      3.10.1. Regular logout
      3.10.2. Automatic logout if time expires
      3.10.3. Automatic logout if the user closes windows without logging out
   3.11. Other
      3.11.1. Invalid entries

4. Administrator
   4.1. Create Console
      4.1.1. Login
         4.1.1.1. Create User Logins
         4.1.1.2. Edit User Logins
         4.1.1.3. Delete User Logins
         4.1.1.4. Reset Locked Players
   4.2. Manual Database Access
      4.2.1. Modify tables and data in tables
   4.3. Logout
      4.3.1. Regular logout
      4.3.2. Automatic logout if time expires
      4.3.3. Automatic logout if the user closes windows without logging out
   4.4. Other
      4.4.1. Invalid entries

5. Database
   5.1. Input validation
      5.1.1. Input validation

6. System
   6.1. System scalability
   6.2. Security
   6.3. Web-browser compatibility
      6.3.1. Firefox 3.6.18 or greater
      6.3.2. Internet Explorer
      6.3.3. Safari
      6.3.4. Chrome
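Many of the items above (for example, the "Invalid Login" checks that appear under every role) lend themselves to automation. The sketch below is a minimal illustration using Selenium WebDriver; the URL, the form-field names, and the expected error text are placeholders rather than the actual identifiers used by the vEOC pages.

# Minimal sketch: automated check that an invalid login is rejected.
from selenium import webdriver
from selenium.webdriver.common.by import By

LOGIN_URL = "http://example.local/veoc/RegularLogin2.php"  # placeholder URL

driver = webdriver.Firefox()
try:
    driver.get(LOGIN_URL)
    # Field names and the expected error string are hypothetical.
    driver.find_element(By.NAME, "username").send_keys("not-a-user")
    driver.find_element(By.NAME, "password").send_keys("wrong-password")
    driver.find_element(By.NAME, "submit").click()
    assert "invalid" in driver.page_source.lower(), "No rejection message for a bad login"
    print("Invalid-login check passed")
finally:
    driver.quit()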

 


APPENDIX K

vEOC USABILITY TEST

Below is the usability test, completed several times throughout the development lifecycle.


Usability Test

Part I: The Trainee Console

Initial Status

Task I: Submit a resource request for 200 sheets of plywood needed by John Adams at the Build a House agency. Check the status of your resource request.

Task II: Ask a question to the disaster assistant. The disaster assistant is the automated help bot. An example question is “What is an inject?” or you can also type “menu” or “What is the weather today?”

Task III: Have a face-to-face conversation with the Salvation Army regarding the mission request you sent earlier.

Task IV: Post a position log of the actions you have taken including a summary of the shelter situation.

Task V: Log out of the trainee console.

Question: On a scale of 1 to 5, with 5 being the most difficult and 1 being the easiest, rate the difficulty you experienced completing these tasks.

Part II: The Exercise Developer Console

Setup: Open a new tab in Firefox and log into the vEOC as an exercise developer (see the menu at the top right-hand corner) (http://216.151.208.117:8080/veoc/RegularLogin2.php).

Task I: Take a few moments to go through the various options on the two panels and familiarize yourself with the system.


Task II: Again, note that you can click on a gray title bar to collapse or expand the menus.

Task III: Create target capabilities. Click on target capabilities developer. Choose the Hurricane2 script. Click on add target capability. Click on show common mission: communications. Add target capability 4.2.1.1.

Task IV: Create metrics for the exercise. Click on add metric. Search for metric 4.2.1. Add metric 4.2.1.

Task V: Add exercise objectives. Click on add exercise objectives. Click show all. Add any exercise objective to the script. Close the window.

Task VI: Update the handbook developer. Choose the Hurricane2 script. Update one element of the handbook and save the handbook.

 

Task VII: Create a new script using the script developer. NOTE: Use a 1-word name with no special characters.

Task VIII: Open the script Hurricane2 using the script developer. Add an inject that has already been created in the database. Change the time to any time between 0005 and 0010 seconds (e.g., 0005). Change the sending agency to the Salvation Army and the receiving agency to the Coast Guard. Also, ensure there are no apostrophes, quotation marks, or slashes in the message text. Add another inject that has already been created in the database. Again, change the time to any time between 0005 and 0010 seconds (e.g., 0007). Change the sending agency to the Salvation Army and the receiving agency to the Coast Guard. Again, ensure there are no apostrophes, quotation marks, or slashes in the message text. Also, use a different communication medium from the first inject.

Task IX: Create an entirely new inject and add it to the database. Set the time to any time between 0005 and 0010 seconds (e.g., 0005). Set the sending agency to the Salvation Army and the receiving agency to the Coast Guard.


Do not use any apostrophes, quotations, or slashes in the message text. Use a different communication medium than the other two injects you just created.

Task X: Create another entirely new inject and add it to the database. Set the time to any time between 0010 and 0015 seconds (e.g., 0012). Set the sending agency to the Salvation Army and the receiving agency to the Coast Guard. Do not use any apostrophes in the message text. Create an inject stating that "Radioactivity has been detected between Miami Gardens, Golden Glades, and North Miami." Use any communication medium you desire.

Task XI: Delete the first inject you added to the script. Close the script.

Task XII: Switch back to the trainee console tab. Log in again using the public safety group and the coast guard position.

Exercise Controller (part of the Exercise Developer Console)

Task XIII: Now you are going to use the exercise controller to test the script you just created. Open the Hurricane2 script; then, when you are ready, hit the Play button. At this time, you will receive an alert on the exercise developer console and on the trainee console that the exercise is about to begin. Minimize the exercise developer console.

Trainee Console (continued)

Task XIV: Switch to the trainee console. Note the color of the communication tools on the exercise panel. Areas highlighted in red indicate that you have received an inject over that communication medium. Click on any communication medium. Acknowledge this inject. Click on the remaining communication tool and clarify an aspect of this inject with the Salvation Army (by clicking on the Clarify button).

Task XV: Update the Disaster Map. When the last inject arrives, update the disaster map as appropriate. Save your changes to the map.

Logistics

Task XVI: Approve a logistics request. Acquire a contract resource. (There is nothing to do here, but make a "show all" request and pretend that you made a telephone call.)


Task XVII: Update a logistics request.

Task XVIII: Update lives data to indicate that 200 lives have been lost and 40 lives have been rescued so far in the storm. Add another lives data point with any data you wish.

Task XIX: View the Casualties dashboard (on the exercise control panel)

Task XX: View the Received Injects and note the injects received and your responses to the injects.

Common Operating Picture

Task XXI: Open the disaster map and view the common operating picture.

Task XXII: Log out of the Trainee Console

Exercise Controller (continued)

Task XXIII: Look at your Player Report. Select the Hurricane2 script and the Coast Guard player.

Task XXIV: Look at the Evaluation Metrics. Select the Hurricane2 script and the Coast Guard player.

Task XXV: Log out of the Exercise Developer Console

Question: On a scale of 1 to 5, with 5 being the most difficult and 1 being the easiest, rate the difficulty you experienced completing these tasks.

Part III: Researcher Console

Task I: Open a new tab in Firefox and log into the vEOC as a researcher (http://216.151.208.117:8080/veoc/RegularLogin2.php)


Task II: Choose the exercise metrics that you wish to evaluate during the exercise.

Task III: View the logs of the exercise.

Task IV: View the Coast Guard player report for the Hurricane2 exercise.

Task V: Log out of the Researcher Console

Question: On a scale of 1 to 5, with 5 being the most difficult and 1 being the easiest, rate the difficulty you experienced completing these tasks.

Part IV: Follow-up Questions

1. Identify three player evaluation metrics you would like to see used in this program, such as time to complete a task.

2. Identify three research metrics you would like to see used in this program, such as time to complete a task.

3. Identify three dashboard metrics you would like to see used in this program, such as time to complete a task.

4. Are there any parts of the system that you feel are unnecessary?

5. Are there any additional parts of the system that should be added?

6. Identify three items that would be helpful in making a critical decision, such as cost to the county or lives saved.


7. Rate your experience with the user interface (very pleasant, pleasant, fair, poor, awful)

8. Do you have any additional comments?

Thank you for your time and participation in helping to create a better virtual training center.

 


APPENDIX L

vEOC USER MANUAL


   

Virtual Emergency Operations Center

User Manual


 

Table of Contents

VEOC Summary
Accessing the VEOC Webpage
User Profiles
   Trainee
      User Permissions
      Logging On
         View Tutorial
         Selecting a Script, Role and Individual
      Main Panel
         Exercise Background
            EOC Layout
            Player Handbook
            Initial Status
         Boards
            Position Log
            Significant Events
            Status Board
            Mission/Tasks
            Resource Requests
         Links
            CNN
            National Weather Service
         References
            Incident Action Plan
            Disaster Map
            Position Responsibilities
         Reports
            Road Closures
            Shelters
            PODs
            Hospitals
            Lives Data
      Exercise Panel
         Interactive Assistance
            Disaster Assistant
         Dashboards
            Cost to County
            Casualties
         Received Injects
            Received Injects
         Communication Tools
            Face-to-Face
            Telephone
            Radio
            Cellphone
            PDA
            Global Updates
Advanced User Functionality (Exercise Developers and Researchers)
   Exercise Developer
      User Permissions
      Logging On
      Exercise Development Tools
         Handbook Developer
         Player Responsibilities Developer
         Initial Status Developer
         Target Capabilities Developer
         Script Developer
         Database Control Tools
      Exercise Control and Tracking
         Exercise Controller
         Exercise Log
      Exercise Reports and Evaluation
         Player Reports
         Evaluation Metrics
      External Links
         FEMA Training Center
      References
         Disaster Map
         Miami-Dade Resources
   Researcher
      User Permissions
      Logging On
      Researcher Tools
         Evaluation Metrics
      Exercise Reports
         Exercise Log
         Player Reports
         Position Logs
      References
         Disaster Map
         View Scripts
      Exercise Development Tools
         Script Developer
Scripts
   Creating Scripts
   Editing Scripts
   Deleting Scripts
Injects
   Creating Injects
   Editing Injects
   Deleting Injects
Glossary


 

VEOC Summary

The Virtual Emergency Operations Center (VEOC) is a web-based research tool designed to allow researchers to evaluate and manipulate certain tasks and metrics based upon a user's actions.

The current version allows a user to assume one of three roles: a trainee (or user), an exercise developer, or a researcher. Each user is assigned different permissions and is expected to perform certain tasks. A trainee, for example, can assume the role of any employee or person interacting with the Emergency Operations Center. An exercise developer exists for the purpose of facilitating the simulation exercises. The researcher role is assumed by someone researching the interactions of the VEOC users, who is able to manipulate the data collected from the simulation to suit his or her needs.

Although some functionality is shared across the different types of user, it is important to select the proper profile when interacting with the VEOC.


 

Accessing the VEOC Webpage

You access the VEOC like any other website. Open an Internet browser such as Chrome, Firefox, or IE (Internet Explorer) and point it to the following address: 129.74.246.242

The VEOC webpage should load into your browser, and you should now have the option to log on to the VEOC. Once the webpage is loaded, your screen should look similar to the image below.

Note: You may need to update your web browser if the page has trouble displaying.


 

User Profiles

As discussed earlier in the VEOC Summary, three types of user profiles are available:

• User (Trainee)
• Exercise Developer
• Researcher

Below we outline each logon type and briefly discuss the functionality and permissions provided to each one.

User (Trainee)

Most people will log on as a User (Trainee). With this type of logon, the user can assume a role (e.g., City Manager) and interact with other users. As a user, all of your actions are tracked and stored in a database accessible by the exercise developer and researcher.

Exercise Developer

The exercise developer's role is unique to the VEOC; the developer's job is to create scripts and injects, control the exercise session, and make adjustments to research metrics as necessary.

Researcher

You can think of the researcher role in the literal meaning of the term: a person conducting research. The researcher can create scripts, manipulate metrics, and view exercise data, but does not interact with the trainees (users logged on as User) in any way during the exercise.


 

User (Trainee)

User Permissions

A User (Trainee) has two control panels: the trainee console and the exercise panel. Most functionality occurs through the trainee console, but the receipt of chat requests and injects, as well as other functions, occurs through the exercise panel. A User (Trainee) will receive injects created by the exercise developer or researcher and needs to respond accordingly using the functionality available to him or her.

Logging On

Once you have successfully accessed the vEOC webpage, the next step is to log on. Select User Login at the top right-hand corner of the screen to log on as a User (Trainee).


 

After logging on, you will be presented with the option either to view a tutorial before proceeding to the VEOC or to continue without viewing the tutorial.

If you choose to view the tutorial, you will be directed to a template page containing the available tutorials. Note: if no tutorials are available, this page will be blank.


 

After either viewing the tutorial or skipping ahead to the VEOC, you will be prompted to choose a script, a role, and an individual (if applicable).

Note: For the next step it is important that you disable your popup blocker.

After selecting a script, role, and individual (if applicable), two new screens will launch:

• VEOC 2.0 TRAINEE CONSOLE
• VEOC 2.0 EXERCISE PANEL


 

Your screen should look similar to the image below.

Note: To exit these windows, click Logout. If you close a window without logging out, you will have to reload the VEOC and log on again to make them reappear. You can minimize the windows if you want, but you must leave all three running.

Trainee Console

The trainee console is the window that pops up on the left after logging on from the main screen. The main tab headings (the gray header bars in the window) can be minimized and maximized by clicking on them.

Exercise Background

In this tab, you will find information pertaining to the EOC and your exercise. The following links are available under this tab:


 

• EOC Layout
   o Link to a PDF document containing a map of the EOC.
• Player Handbook
   o Browser-based handbook created by the exercise developer covering the purpose and scope of the exercise.
• Initial Status
   o The status of the simulation upon commencing the exercise. This is moderated by the exercise developer.

Boards

Boards allow the User (Trainee) to publish updates that will be viewable to all other users. In this menu you will find the following links:

• Position Log
   o Needs development*
• Significant Events
   o Add events and view the significant events log.
• Status Board
   o Overview of data entered in the Reports tab. Click on the top menu bar to isolate data from a specific report.
• Mission/Tasks
   o This tab is used to send requests to other agencies. Here the user can enter a request by specifying the agency, mission/task, and priority. Any user can view, edit, or delete these requests.
• Resource Requests
   o A resource request is similar to entering a mission/task. It is visible to all users and can be edited by anyone. To add a new resource request you must define the resource requested, resource type, quantity, unit, priority, reason needed, special instructions, address, requesting organization, requestor's name, and requestor's phone number.

Links

The Links section provides links to external websites. If you click a link, a new browser window will open; your existing windows will not be affected.

• CNN
   o Link to CNN.COM
• National Weather Service
   o Link to WEATHER.GOV

   


 

References

The References section contains information related to the exercise. A User (Trainee) only has permission to view the content in this section, with the exception of the disaster map. The disaster map can be edited and viewed by any type of user. The References section contains the following links:

• Incident Action Plan
   o Link to a PDF file (uploaded by the exercise developer).
• Disaster Map
   o Interactive map that allows any user to plot flood, fire, radioactivity, and first aid locations. The map can be saved and viewed by any user. To add a disaster or share point, simply click one of the links and then click the area on the map where you would like it to be added. You must save changes for them to be visible to other users.
• Position Responsibilities
   o The exercise developer defines position responsibilities. Your only responsibility as a user/trainee is to understand your position responsibility.

Reports

In the Reports section the User (Trainee) can create, edit, update, and delete informational reports related to the exercise. Any changes to the reports (including creating a new report) will be displayed in the Status Board. The Reports section contains the following links:

• Road Closures
   o This section allows any user to add, view, edit, and delete road closures. You will need to define the road name, the date it was closed, and the starting and ending points of the closure. Any reports you save here will appear in the status board.
• Shelters
   o This section allows any user to add, view, edit, and delete shelter reports. You will need to define the following fields: shelter, status, accommodates, and used. Any reports you save here will appear in the status board.
• PODs
   o This section allows any user to add, view, edit, and delete POD (Points of Distribution) reports. You will need to define the following fields: POD, Status, and Notes. Any reports you save here will appear in the status board.
• Hospitals
   o This section allows any user to add, view, edit, and delete hospital reports. You will need to define the following fields: Hospital, Status, Accommodations, and Used. Any reports you save here will appear in the status board.
• Lives Data
   o This section allows any user to add, view, edit, and delete lives data. You will need to define the following fields: Date, Location, Saved, Injured, and Lost. Any reports you save here will appear in the status board.

                                                                         


 

Exercise Panel

The exercise panel is where you will view injects and find your communication tools, as well as the disaster assistant and other useful dashboards. The exercise panel also tells you the status of the exercise. Notice that under the title "VEOC 2.0 Exercise Panel" you see the status of the exercise, "EXERCISE IS NOT STARTED". Once the exercise controller activates the exercise, this status should change to "EXERCISE STARTED".

Interactive Assistance
• Disaster Assistant
   o Needs development*

Dashboards
• Cost to County
   o Dashboard with details of costs to the county.
• Casualties
   o Dashboard with details of casualties.

Received Injects
• Received Injects
   o Dashboard with a list of injects received. Injects are created and sent by the exercise developer.

Communication Tools
• Face-to-Face
   o Initiate a face-to-face chat request. Select the person you would like to speak to and press Talk.
• Telephone
   o Initiate a telephone chat request. Select the person you would like to speak to and press Talk.
• Radio
   o Initiate a radio chat request. Select the person you would like to speak to and press Talk.
• Cellphone
   o Initiate a cellphone chat request. Select the person you would like to speak to and press Talk.
• PDA
   o Initiate a PDA chat request. Select the person you would like to speak to and press Talk.
• Global Updates
   o Needs development*

Once a communication request has been sent, the user will receive the request and have the option to accept or decline. *Not currently working*


 

Exercise Developer

User Permissions

Logging On

Once you have successfully accessed the VEOC webpage, the next step is to log on. Select Exercise Developer at the top right-hand corner of the screen to log on as an Exercise Developer.

Please be sure that you see "Exercise Developer Log in" displayed above the table where you enter your login credentials.


 

Exercise Developer Console

The user creating the exercise uses the exercise developer console. This type of user has an array of exercise development tools available, as well as tools to measure and evaluate metrics used during the operation.

Exercise Development Tools
• Handbook Developer
   o The handbook developer option allows the exercise developer to create a digital handbook, which will be visible to all users logged in to a certain script. To create or edit a handbook, click on the handbook developer link, choose the desired script, and fill in the fields as necessary. You must save the handbook (click "Save handbook") when finished entering your information.
• Player Responsibilities Developer
   o With the player responsibilities developer option the exercise developer can define responsibilities for the trainee users. The trainee can view his or her responsibilities after logging in. *need more info*
• Initial Status Developer
   o The initial status developer option allows the exercise developer to set an initial status for the trainee users to see upon logging in. You can file additional reports if necessary, and each report can be accompanied by an image. The image is optional; if using an image, the format must be .jpg and less than 14 MB.
• Target Capabilities Developer
   o Needs development*
• Script Developer
   o A script is one of the principal elements of the VEOC. All of the data and settings are stored in scripts, and therefore it is necessary for trainee users to select a script during the login process. Scripts are created by the exercise developer and/or researcher. The script developer option allows you to open, edit, delete, and create new scripts.
• Database Control Tools
   o The database control tools option allows the script developer to select a script and disable, reactivate, delete, or archive it.

Exercise Control and Tracking
• Exercise Controller
   o The exercise controller desktop is where the exercise developer can select a script and start, stop (reset), fast-forward, or pause the exercise. The injects defined in a script will appear in the table below the control panel, and the last column of each row will change from red to green once an inject is sent.
• Exercise Log
   o The exercise log tracks ALL events and actions (including pages viewed) for all users while logged in under a certain script. Opening the exercise log will allow you to filter the data by time, user, page, action, and/or role. Any combination of these options may be used. The data can then be downloaded as a .csv file and analyzed with Microsoft Excel (see the sketch after this list).

Exercise Reports and Evaluation
• Player Reports
   o Needs development*
• Evaluation Metrics
   o Needs development*

External Links
• FEMA Training Center
   o Link to TRAINING.FEMA.GOV

References
• Disaster Map
   o Link to the disaster map. The exercise developer can edit this with the same abilities as a trainee user.
• Miami-Dade Resources
   o Resources available to the county. Exercise developers can add resources by defining the following fields: Item, Quantity, and Cost.
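Because the exercise log exports as a .csv file, it can also be analyzed outside of Excel. The sketch below, which assumes the Python standard library and hypothetical column names (time, user, role, page, action), filters a downloaded log to one player's actions, mirroring the filtering options described above.

# Minimal sketch: summarize one role's actions from an exported exercise log.
import csv
from collections import Counter

LOG_FILE = "exercise_log.csv"   # a log exported from the exercise controller
TARGET_ROLE = "Coast Guard"     # hypothetical role name

actions = Counter()
with open(LOG_FILE, newline="") as f:
    for row in csv.DictReader(f):            # columns assumed: time,user,role,page,action
        if row.get("role") == TARGET_ROLE:
            actions[row.get("action", "")] += 1

for action, count in actions.most_common():
    print(f"{count:5d}  {action}")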

               


 

     

Researcher

User Permissions

Logging On

Once you have successfully accessed the VEOC webpage, the next step is to log on. Select Researcher Login at the top right-hand corner of the screen to log on as a Researcher.

Please be sure that you see "Researcher Log in" displayed above the table where you enter your login credentials.


 

Researcher Console

Researcher Tools
• Evaluation Metrics
   o Needs development*

Exercise Reports
• Exercise Log
   o The exercise log tracks ALL events and actions (including pages viewed) for all users while logged in under a certain script. Opening the exercise log will allow you to filter the data by time, user, page, action, and/or role. Any combination of these options may be used. The data can then be downloaded as a .csv file and analyzed with Microsoft Excel.
• Player Reports
   o Needs development*
• Position Logs
   o Needs development*

References
• Disaster Map
   o Link to the disaster map. The researcher can edit this with the same abilities as a trainee user.
• View Scripts
   o Clicking this option will allow the researcher to select a script, and the injects related to that script will be displayed in the table below.

Exercise Development Tools
• Script Developer
   o A script is one of the principal elements of the VEOC. All of the data and settings are stored in scripts, and therefore it is necessary for trainee users to select a script during the login process. Scripts are created by the exercise developer and/or researcher. The script developer option allows you to open, edit, delete, and create new scripts.

     


 

       

Point of Distribution (POD) Inject Script

 

Glossary  


BIBLIOGRAPHY

1. Advanced Disaster Management Simulator (ADMS). http://www.trainingfordisastermanagement.com, Accessed October 2014.

2. R. Agrait, A. English, D. Evans, T. Hammell, J. Loughran, and M. Stahl. Review of models, simulations, and games for domestic preparedness training and exercising: Volume III. Office for Domestic Preparedness, Department of Homeland Security, 2004.

3. S. Ajabshir. Personal communication, November 2009.

4. S. Altschuller and R. Benbunan-Fich. Potential antecedents to trust in ad hoc emergency response virtual teams. Proceedings of the 5th International ISCRAM Conference, pages 254–264, 2008.

5. R. Y. Ameen and A. Y. Hamo. Survey of server virtualization. International Journal of Computer Science and Information Security (IJCSIS), 11(3), 2013.

6. American Red Cross. Disaster online newsroom. Disaster alert: Earthquake in Haiti. http://newsroom.redcross.org/2010/01/12/disaster-alert-earthquake-in-haiti/, Accessed October 2014.

7. B. J. Avolio, S. S. Kahai, and G. E. Dodge. E-leadership: Implications for theory, research, and practice. Leadership Quarterly, 11(4):615–668, 2001.

8. S. Aycock. Browserbot: An online browser code conversion tool. REU Project Paper, Notre Dame, August 2010.

9. I. Becerra-Fernandez, M. Prietula, G. Madey, and D. Rodriguez. Project Ensayo: A virtual emergency operations center for disaster management research, training, and discovery. The First International Conference on Global Defense and Business Continuity (ICGD&BC), 2007.

10. I. Becerra-Fernandez, M. Prietula, G. Madey, D. Rodriguez, A. Gudi, and J. Rocha. Project Ensayo: A virtual emergency operations center. Sixteenth International Conference on Management of Technology (IAMOT'07), May 2007.

11. I. Becerra-Fernandez, G. Madey, M. Prietula, D. Rodriguez, R. Valerdi, and T. Wright. Design and development of a virtual emergency operations center for disaster management research, training, and discovery. Hawaii International Conference on System Sciences (HICSS)-41, 2008.

12. I. Becerra-Fernandez, W. Xia, A. Gud, and J. Rocha. Emergency management task characteristics, knowledge sharing and integration and task performance: Research agenda and challenges. Proceedings of the 5th International ISCRAM Conference, May 2008.

13. B. Bell and S. W. Kozlowski. A typology of virtual teams, implications foreffective leadership. Group and Organization Management, 27(1):14–49, 2002.

14. Bing. http://www.bing.com, Accessed October 2014.

15. E. Blake, T. Kimberlain, R. Berg, J. Cangialosi, and J. Beven II. Tropicalcyclone report hurricane sandy (al182012). Technical report, National HurricaneCenter, 2013.

16. B. Blanchard. Guide to emergency management and related terms, defini-tions, concepts, acronyms, organizations, programs, guidance, executive or-ders & legislation. a tutorial on emergency management, broadly defined, pastand present. http://training.fema.gov/EMIWeb/edu/docs/terms%20and%

20definitions/Terms%20and%20Definitions.pdf, Accessed October 2014.

17. M. Buscher, P. Mogensen, and M. Kristensen. When and how (not) to trust it?Supporting virtual emergency teamwork. Proceedings of the 5th InternationalISCRAM Conference, pages 167–176, 2008.

18. B. Campbell and C. Weaver. Rimsim response hospital evacuation: Improvingsituation awareness and insight through serious games play and analysis. Inter-national Journal of Information Systems for Crisis Response and Management,3(3):1–15, 2011.

19. L. Carver and M. Turoff. Human-computer interaction: The human and com-puter as a team in emergency management information systems. Communica-tions of the ACM, 50(3):33–38, March 2007.

20. W. F. Cascio and S. Shurygailo. E-leadership and virtual teams. OrganizationalDynamics, 31(4):362–376, 2003.

21. T. Catarci, M. de Leoni, A. Marrella, M. Mecella, A. Russo, R. Steinmann,and M. Bortenschlager. Workpad: Process management and geo-collaborationhelp disaster response. International Journal of Information Systems for CrisisResponse and Management, 3(1):32–49, 2011.

22. Centers for Disease Control and Prevention (CDC). Updated cdc estimates of2009 h1n1 influenza cases, hospitalizations and deaths in the united states, april2009 - april 10, 2010. http://www.cdc.gov/h1n1flu/estimates_2009_h1n1.

htm, Accessed October 2014.

23. K. Chang. Introduction to Geographic Information Systems. McGraw-Hill Com-panies, Inc, 2010.


24. A. Chen, F. Pea-Mora, S. Mehta, S. Foltz, A. Plans, B. Brauer, and S. Nache-man. Equipment distribution for structural stabilization and civilian rescue.International Journal of Information Systems for Crisis Response and Manage-ment, 3(1):19–31, 2011.

25. Civil Emergency Reaction and Responder Training System (CERRTS). Civilemergency reaction and responder training system (cerrts). http://www.

raytheon.com/capabilities/products/cerrts/, Accessed April 2009.

26. C. Clark. Capp 30523 lecture notes, 2008.

27. ComCARE Alliance ACN Data Set Working Group. Vehicular emer-gency data set recommendation version 2.0. http://xml.coverpages.org/

ComCARE-VEDSv20-2004.pdf, Accessed November 2014.

28. Cover Pages. Technology reports. xml and emergency management. http://

xml.coverpages.org/emergencyManagement.html, Accessed November 2014.

29. D. Crane and P. McCarthy. Comet and Reverse Ajax. Springer-Verlag NewYork, Inc, 2008.

30. S. Davis. Virtual emergency operations centers. Risk Management, 49(7):46,2002.

31. F. Dawood, A. D. Iuliano, C. Reed, M. I. Meltzer, D. K. Shay, P.-Y. Cheng,D. Bandaranayake, R. F. Breiman, A. Brooks, P. Buchy, D. R. Feikin, K. B.Fowler, A. Gordon, N. T. Hien, P. Horby, Q. S. Huang, M. A. Katz, A. Krishnan,R. Lal, J. M. Montgomery, K. Mlbak, R. Pebody, A. M. Presanis, H. Razuri,A. Steens, Y. O. Tinoco, J. Wallinga, H. Yu, S. Vong, J. Bresee, and M. A.Widdowson. Estimated global mortality associated with the first 12 monthsof 2009 pandemic influenza a h1n1 virus circulation: a modelling study. TheLancet Infectious Diseases, 12:687–695, September 2012.

32. E. A. der Heide. Disaster Response: Principles of Preparation and Coordination.CV Mosby, 1989.

33. A. Dix, J. Finlay, G. Abowd, and R. Beale. Human-computer interaction (3rded.). London, UK: Prentice Hall, 2004.

34. Dojo Toolkit. http://dojotoolkit.org, Accessed November 2014.

35. M. Dorasamy and M. Raman. Information systems to support disaster planningand response: Problem diagnosis and research gap analysis. Proceedings of the8th International ISCRAM Conference, 2011.

36. J. DuVal. Webeoc resource manager a collaborative framework: Developingstandard resource management processes for disaster relief. Proceedings of the5th International ISCRAM Conference, 2008.


37. Emergency Interoperability Consortium (EIC). Creating an emergency dataexchange language. http://xml.coverpages.org/EDXL-OverviewEIC200503.pdf, Accessed November 2014.

38. Emergency Management. All-hazards, all-stakeholders summit, 2009.

39. Emergency Management Institute. Is-1: Emergency program manager an ori-entation to the position, 2002.

40. Emergency Management Institute. Is-139: Exercise design, 2003.

41. Emergency Management Institute. Is-100: Introduction to incident commandsystem, 2008.

42. Emergency Management Institute. Is-120a: An introduction to exercises, 2008.

43. Emergency Management Institute. Is-702.a: National incident managementsystem (nims) public information systems, 2013.

44. Emergency Management Institute. Is-700.a: National incident managementsystem (nims) an introduction, 2013.

45. Emergency Management Institute. Is-775: Eoc management and operations,2008.

46. Emergency Management Staff Trainer (EMST). http://www.ecsorl.com/

solutions/emergency-management-staff-trainer-emst, Accessed Novem-ber 2014.

47. M. Endsley. Design and evaluation for situation awareness enhancement. Pro-ceedings of the Human Factors Society 32nd Annual Meeting ,Human FactorsSociety, pages 97–101, 1988.

48. K. Ericsson, M. Prietula, and E. Cokely. The making of an expert. HarvardBusiness Review, pages 114–121, Jul-Aug 2007.

49. ESi Acquisition, Inc. Webeoc administrator manual, 2009.

50. ESi Acquisition Inc. Webeoc 7 professional. http://www.esi911.com/esi,Accessed December 2013.

51. ESi Acquisition, Inc. Webfusion regional meeting, 2010.

52. ESi Acquisition, Inc. Certified webeoc administrators. http://www.esi911.

com/esi/index.php?option=com_content&task=view&id=192, 2010.

53. Facebook. http://www.facebook.com, Accessed November 2014.

54. Federal Emergency Management Agency. About na-tional incident management system. http://www.fema.gov/

about-national-incident-management-system, Accessed November 2014.


55. Federal Emergency Management Agency (FEMA). State and local guide (slg)101: Guide for all-hazard emergency operations planning, September 1996.

56. D. Fetterman. Ethnography Step-by-Step. Sage Publications, Inc, 2010.

57. Field Research. Miami-dade emergency operations center, 2009-2010.

58. Flickr. http://www.flickr.com, Accessed November 2014.

59. Florida Department of Emergency Management. Operation cassandra: Miami-dade exercise eoc activation, July 2009.

60. J. Glynn, S. Souhndararajan, and M. Yerra. Virtual emergency operationscenter user manual v 3.0. Technical Memo, Computer Science & Engineering,University of Notre Dame, Notre Dame, IN, 2013.

61. T. Grant. A checklist for comparing emergency management information sys-tems. Proceedings of the 5th International ISCRAM Conference, May 2008.

62. W. Green. E-emergency management in the u.s.a.: A preliminary survey of theoperational state of the art. International Journal of Emergency Management,1(1):70–81, 2001.

63. A. Gryszkiewicz. Evaluating design principles for temporality in informationtechnology for crisis management. International Journal of Information Systemsfor Crisis Response and Management, 4(1):29–46, 2012.

64. A. Gryszkiewicz and F. Chen. Temporal aspects in crisis management and itsimplications on interface design for situation awareness. Cognition, Technology,and Work, 14(2):169–182, 2012.

65. C. Hall. Personal communication, November 2009.

66. J. Harrald and T. Jefferson. Shared situational awareness in emergency management mitigation and response. Proceedings of the 40th Hawaii International Conference on System Sciences, 2007.

67. J. Heinzelman and C. Waters. Crowdsourcing crisis information in disaster-affected Haiti. Technical report, Special Report 252, United States Institute of Peace, 2010.

68. HERENetwork. http://herenetwork.org, Accessed November 2014.

69. B. Hertzler, E. Frost, G. Bressler, and C. Goehring. Experience report: Using a cloud computing environment during Haiti and Exercise24. International Journal of Information Systems for Crisis Response and Management, 3(1):50–64, 2011.

70. A. Hevner, S. March, J. Park, and S. Ram. Design science in information systems research. MIS Quarterly, 28(1):75–105, 2004.

71. J. Holguín-Veras, N. Pérez, S. Ukkusuri, T. Wachtendorf, and B. Brown. Emergency logistics issues affecting the response to Katrina: A synthesis and preliminary suggestions for improvement. Transportation Research Record: Journal of the Transportation Research Board, 2022:76–82, 2007.

72. Incident Commander. http://www.incidentcommander.net/product.shtml, Accessed November 2014.

73. Institute for Crisis, Disaster, and Risk Management, George Washington University. ICDRM/GWU emergency management glossary of terms. http://www.gwu.edu/~icdrm/publications/PDF/EM_Glossary_ICDRM.pdf, Accessed November 2014.

74. R. Irwin. The incident command system. In Disaster Response Principles of Preparations and Coordination: Chapter 7 - The Incident Command System, 1989.

75. S. Jain and C. McLean. Modeling and simulation of emergency response: Workshop report, relevant standards, and tools. National Institute of Standards, December 2003.

76. S. Jain and C. McLean. An integrating framework of modeling and simulation for incident management. Journal of Homeland Security and Emergency Management, 3(1), 2006.

77. S. Jain, C. McLean, and Y. Lee. Towards standards for integrated gaming and simulation for incident management. Proceedings of the 2007 Summer Computer Simulation Conference (SCSC ’07), 2007.

78. S. L. Jarvenpaa and D. Leidner. Communication and trust in global virtual teams. Organization Science, 10(6):791, 1999.

79. S. L. Jarvenpaa and D. E. Leidner. Is there anybody out there? Antecedents of trust in global virtual teams. Journal of Management Information Systems, 14(4):29–64, 1998.

80. M. Jennex. Emergency response systems: The utility Y2K experience. Journal of Information Technology Theory and Application (JITTA), 6(3):85–102, 2004.

81. M. Jennex. Implementing social media in crisis response using knowledge management. International Journal of Information Systems for Crisis Response and Management, 2010.

82. Jetty Server. http://www.mortbay.org/jetty, Accessed November 2014.

83. John F. Kennedy School of Government. Leadership in crisis, April 2010.

84. T. Johnson. Personal communication, March 2008.

85. T. Johnson. Personal communication, January 2010.

86. S. Kalyuga and J. Sweller. Rapid dynamic assessment of expertise to improve the efficiency of adaptive e-learning. Educational Technology Research and Development, 53(3):83–93, 2005.

87. Kernel Based Virtual Machine. http://www.linux-kvm.org/page/Main_Page, Accessed November 2014.

88. S. King, G. Dunlap, and P. Chen. Operating system support for virtual machines. Proceedings of the Annual Conference on USENIX Annual Technical Conference, 2003.

89. A. Lebsock. Personal communication, January 2010.

90. G. Madey, I. Becerra-Fernandez, C. Nikolai, and M. Prietula. Ensayo: A virtual emergency operations center for training and research. Institute for Operations Research and the Management Sciences (INFORMS), 2009.

91. G. Madey, I. Becerra-Fernandez, C. Nikolai, and M. Prietula. A training and research simulator for emergency management. Presentation, The Institute for Operations Research and the Management Sciences (INFORMS), 2010.

92. P. Marques-Quinteiro, L. Curral, A. Passos, and K. Lewis. And now what do we do? The role of transactive memory systems and task coordination in action teams. Group Dynamics: Theory, Research, and Practice, 17:194–206, 2013.

93. Miami-Dade County Website. EOC activation levels. http://www.miamidade.gov/fire/about-activation-levels.asp, Accessed November 2014.

94. Miami-Dade EOC. Miami-Dade EOC Haiti relief effort situation reports #1-#13, January 2010.

95. Miami-Dade EOC. Miami-Dade exercise EOC activation, May 2008.

96. Miami-Dade EOC. Hurricane Suiter: Miami-Dade exercise EOC activation, June 2009.

97. Miami-Dade EOC. Operation Valor exercise plan: Miami-Dade exercise EOC activation, 2010.

98. Miami-Dade EOC. Haiti earthquake: Miami-Dade EOC activation, January 2010.

99. Miami-Dade EOC. Pro Bowl: Miami-Dade EOC activation, January 2010.

100. Miami-Dade EOC. Super Bowl XLIV: Miami-Dade EOC activation, February 2010.

101. M. Mooney. Virtual emergency operations center individual development documentation. REU Project Paper, Notre Dame, 2010.

102. MySQL. http://dev.mysql.com, 2014.

103. J. Nabila and D. Mohamed. A comparative analysis of collective awareness building in virtual teams. Proceedings of the Mediterranean Conference on Information Systems (MCIS), 2008.

104. National Weather Service. Storm-zone exercise, Epiphany Catholic High School, November 2009.

105. NC4. E-team brochure, 2013.

106. NC4. E-team Enterprise brochure, 2009.

107. NC4. E-team. http://www.nc4worldwide.com/Pages/eteam.aspx, Accessed November 2014.

108. A. Nelson, I. Sigal, and D. Zambrano. Media, information systems, and communities: Lessons from Haiti. Communicating with Disaster Affected Communities (CDAC), n.d.

109. C. Nikolai. Miami-Dade emergency operations center field research report. University of Notre Dame Zahm Research Travel Report, Notre Dame, IN, 2010.

110. C. Nikolai. vEOC assumptions and requirements document v.5. Project Report, Computer Science & Engineering, University of Notre Dame, Notre Dame, IN, 2012.

111. C. Nikolai and G. Madey. Anatomy of a toolkit: A comprehensive compendium of various agent-based modeling toolkits on the market today. Proceedings of the Agent 2007 Conference on Complex Interaction and Social Emergence, 2007. URL http://www.dis.anl.gov/pubs/60568.pdf.

112. C. Nikolai and G. Madey. Searchable taxonomies of agent based modeling toolkits. Proceedings of the 2009 Spring Simulation Multiconference. Society for Computer Simulation International, 2009.

113. C. Nikolai and G. Madey. Tools of the trade: A survey of various agent-based modeling platforms. Journal of Artificial Societies and Social Simulation, 12(2):2, March 2009. URL http://jasss.soc.surrey.ac.uk/12/2/2.html.

114. C. Nikolai and G. Madey. Agent based modeling toolkit search engine. Agent-Directed Simulation Symposium (ADS’09). The Society for Modeling and Simulation International (SCS), 2009.

115. C. Nikolai, I. Becerra-Fernandez, M. Prietula, and G. Madey. Project Ensayo: Designing a virtual emergency operations center. Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, 2009.

116. C. Nikolai, T. Johnson, I. Becerra-Fernandez, and G. Madey. Leveraging WebEOC in support of the Haitian relief effort: Insights and recommendations. The 7th International Community on Information Systems for Crisis Response and Management (ISCRAM) Conference, 2010.

117. C. Nikolai, G. Madey, I. Becerra-Fernandez, and M. Prietula. Ensayo: A distributed, web-based virtual emergency operations center for training and research. The 7th International Community on Information Systems for Crisis Response and Management (ISCRAM) Conference PhD Colloquium and Poster Session, 2010.

118. C. Nikolai, G. Madey, I. Becerra-Fernandez, and M. Prietula. Ensayo: A collaborative cyberinformation portal for emergency management training and research. Cyberinfrastructure Days, University of Notre Dame, 2010.

119. C. Nikolai, G. Madey, I. Becerra-Fernandez, and M. Prietula. Ensayo: A virtual emergency operations center simulator for training and research. Ph.D. Colloquium and Poster Session, Winter Simulation Conference, 2010.

120. C. Nikolai, M. Prietula, G. Madey, I. Becerra-Fernandez, T. Johnson, M. Mooney, and R. Bhandari. Experiences and insights using a virtual emergency operations center. Project Report, 2010.

121. C. Nikolai, T. Johnson, I. Becerra-Fernandez, M. Prietula, and G. Madey. SimEOC: A distributed web-based virtual emergency operations center simulator for training and research. Technical Report, Computer Science & Engineering, University of Notre Dame, Notre Dame, IN, 2014.

122. C. Nikolai, T. Johnson, I. Becerra-Fernandez, M. Prietula, and G. Madey. A call for data exchange standards for operational crisis information management systems and emergency management exercise simulators. Technical Report, Computer Science & Engineering, University of Notre Dame, Notre Dame, IN, 2014.

123. C. Nikolai, T. Johnson, I. Becerra-Fernandez, M. Prietula, and G. Madey. Design principles for modern crisis information management systems: From closed local systems to the web and beyond. Technical Report, Computer Science & Engineering, University of Notre Dame, Notre Dame, IN, 2014.

124. C. Nikolai, T. Johnson, M. Prietula, and G. Madey. Games and simulations for emergency operations centers: Challenges and opportunities. Technical Report, Computer Science & Engineering, University of Notre Dame, Notre Dame, IN, 2015.

125. Office of the Mayor. Mayor Carlos Alvarez proposes FY 2009-2010 budget. Service cuts and workforce reductions recommended. News release, 2010. URL http://www.miamidade.gov/mayor/releases/09-07-15_budget_proposal_presented.asp.

126. O. Okolloh. Ushahidi, or testimony: Web 2.0 tools for crowdsourcing crisis information. Participatory Learning and Action, 59:65–70, 2009.

127. T. Onorati, A. Malizia, P. Díaz, and I. Aedo. Interaction design principles for web emergency management information systems. International Journal of Information Systems for Crisis Response and Management, 3(2):48–66, 2011.

128. C. Pearson and I. Mitroff. From crisis prone to crisis prepared: A framework for crisis management. The Academy of Management Executive, 7(1):48–59, 1993.

129. D. Perez. Personal communication, August 2008.

130. S. Perng, M. Büscher, L. Wood, R. Halvorsrud, M. Stiso, L. Ramirez, and A. Al-Akkad. Peripheral response: Microblogging during the 22/7/2011 Norway attacks. International Journal of Information Systems for Crisis Response and Management (IJISCRAM), 5(1):41–57, 2013.

131. PFIF 1.0 Specification. http://zesty.ca/pfif/1.0/#principles, Accessed November 2014.

132. L. Plotnick, R. Ocker, S. Hiltz, and M. Rosson. Leadership roles and communication issues in partially distributed emergency response software development teams: A pilot study. Proceedings of the 41st Hawaii International Conference on System Sciences, 2008.

133. J. Preble. Integrating the crisis management perspective into the strategic management process. Journal of Management Studies, 34(5):769–791, 1997.

134. N. Rabkin. Hurricane Katrina: Providing oversight of the nation’s preparedness, response, and recovery activities. GAO Testimony Before the Subcommittee on Oversight and Investigations, Committee on Energy and Commerce, House of Representatives, 2005.

135. R. Ranstrom. Automated web software testing with Selenium. REU Project Paper, Notre Dame, August 2010.

136. B. Raybaud. Ensayo - administration and script creation. Technical Memo, Computer Science & Engineering, University of Notre Dame, Notre Dame, IN, 2012.

137. Raytheon. CERRTS product data sheet, 2009.

138. Y. Ren and L. Argote. Transactive memory systems 1985-2010: An integrative framework of key dimensions, antecedents, and consequences. The Academy of Management Annals, 5:189–229, 2011.

139. J. Rocha, I. Becerra-Fernandez, W. Xia, and A. Gudi. Dealing with task uncertainty in disaster management: The role of knowledge sharing for exploration and exploitation. AMCIS 2009 Proceedings, Paper 714, 2009.

140. J. Rocha, I. Becerra-Fernandez, W. Xia, and A. Gudi. Dealing with task uncertainty in disaster management: The role of knowledge sharing for exploration and exploitation. AMCIS 2009 Proceedings, 2009.

141. M. Russell. Dojo: The Definitive Guide. O’Reilly Media, Inc, 2008.

142. E. Salas, S. Tannenbaum, K. Kraiger, and K. Smith-Jentsch. The science of training and development in organizations: What matters in practice. Psychological Science, 13:74–101, 2013.

143. K. Satishkumar and D. Waxman. Rapid response virtual emergency operations center: A portal solution for interagency collaboration and incident management. PowerPoint Presentation, 2009.

144. R. Sebesta. Programming the World Wide Web. Pearson Education, Inc, 2008.

145. Selenium Website. Selenium website. http://www.seleniumhq.org/, Accessed November 2014.

146. V. Shute and B. Towle. Adaptive e-learning. Educational Psychologist, 38(2):105–114, 2003.

147. W. Sidney, M. Jonsen, J. Bergstrom, and N. Dahlstrom. Learning from failures in emergency response: Two empirical studies. Journal of Emergency Management, 6(5):64–70, 2008.

148. I. Sikorski. Personal communication, August 25, 2014.

149. SimEOC. http://www.simeoc.org, Accessed November 2014.

150. S. Smits and N. Ally. Thinking the unthinkable: Leadership’s role in creating behavioral readiness for crisis management. Communication Review, 13(1):1–23, 2003.

151. J. Sniezek, D. Wilkins, P. Wadlington, and M. Baumann. Training for crisis decision-making: Psychological issues and computer-based solutions. Journal of Management Information Systems, 18(4):147–168, 2002.

152. R. Spears and M. Lea. Panacea or panopticon? The hidden power of computer-mediated communication. Communication Research, 21(4):427–459, August 1994.

153. St. Louis Area Regional Response System. http://www.stl-starrs.org/, Accessed November 2014.

154. K. Starbird and L. Palen. “Voluntweeters”: Self-organizing by digital volunteers in times of crisis. Proceedings of the 2011 Annual Conference on Human Factors in Computing Systems, 2011.

155. L. Suchman. Human-Machine Reconfigurations: Plans and Situated Actions. Cambridge University Press, 2nd edition, 2007.

156. J. Sutton, E. Spiro, C. Butts, S. Fitzhugh, B. Johnson, and M. Greczek. Tweeting the spill: Online informal communications, social networks, and conversational microstructures during the Deepwater Horizon oil spill. International Journal of Information Systems for Crisis Response and Management (IJISCRAM), 5(1):58–76, 2013.

157. N. Thomas. Software design within Comet: Manipulating database tables with JavaScript and PHP. REU Project Paper, Notre Dame, August 2010.

158. J. Trnka and B. Johansson. Collaborative command and control practice: Adaption, self-regulation and supporting behavior. International Journal of Information Systems for Crisis Response and Management, 1(2):47–67, 2009.

159. M. Turoff, M. Chumer, B. V. de Walle, and X. Yao. The design of a dynamic emergency response management information system (DERMIS). The Journal of Information Technology Theory and Application (JITTA), 5(4):1–35, 2004.

160. M. Turoff, M. Chumer, R. Hiltz, R. Klashner, M. Alles, M. Vasarhelyi, and A. Kogan. Assuring homeland security: Continuous monitoring, control, and assurance of emergency preparedness. Journal of Information Technology Theory and Application, 6(3):1–24, 2004.

161. Twitter. http://www.twitter.com, Accessed November 2014.

162. Ubuntu Operating System. http://www.ubuntu.com, Accessed November 2014.

163. U.S. Department of Health and Human Services. Emergency support functions. http://www.phe.gov/preparedness/support/esf8/pages/default.aspx, Accessed November 2014.

164. U.S. Department of Homeland Security. National response plan. https://acc.dau.mil/adl/en-US/25057/file/2934/HK%20DHS_National_Response_Plan_2004.pdf, Accessed November 2014.

165. U.S. Department of Homeland Security. National response framework, 2008.

166. U.S. Department of Homeland Security. National incident management system. http://www.fema.gov/pdf/emergency/nims/NIMS_core.pdf, Accessed November 2014.

167. U.S. Department of Homeland Security. Homeland security exercise and evaluation program (HSEEP). https://www.llis.dhs.gov/HSEEP/Documents/homeland-security-exercise-and-evaluation-program-hseep, Accessed November 2014.

168. U.S. Department of Homeland Security. Target capabilities list, September 2007.

169. U.S. Department of Transportation Federal Highway Administration. Glossary: Simplified guide to the incident command system for transportation professionals. http://ops.fhwa.dot.gov/publications/ics_guide/glossary.htm, Accessed November 2014.

170. U.S. Department of Transportation Federal Highway Administration. Glossary: Simplified guide to the incident command system for transportation professionals. http://ops.fhwa.dot.gov/publications/ics_guide/glossary.htm, Accessed November 2014.

171. U.S. Geological Survey (USGS) Website. Haiti region earthquake details and summary. http://earthquake.usgs.gov/earthquakes/eqinthenews/2010/us2010rja6/, Accessed November 2014.

172. T. van Ruijven. Serious games as experiments for emergency management research: A review. Proceedings of the 8th International Conference on Information Systems for Crisis Response and Management, 2011.

173. Virginia Department of Emergency Management. Virginia emergency operations center. http://www.vaemergency.gov/content/veoc, Accessed November 2014.

174. Virtual Staff Trainer. http://www.ecsorl.com/solution/virtual-staff-trainer/, Accessed November 2014.

175. R. Viterbo. Personal communication, November 2009.

176. vSphere 5 Documentation Center. What is a virtual machine. http://pubs.vmware.com/vsphere-50/index.jsp#com.vmware.vsphere.vm_admin.doc_50/GUID-CEFF6D89-8C19-4143-8C26-4B6D6734D2CB.html, Accessed November 2014.

177. w3schools. Introduction to XML. http://www.w3schools.com/xml/xml_whatis.asp, Accessed November 2014.

178. W. Waugh Jr. and G. Streib. Collaboration and leadership for effective emergency management. Public Administration Review, 66(1):131–140, 2006.

179. J. Weller, L. Wilson, and B. Robinson. Survey of change in practice following simulation-based training in crisis management. Anaesthesia, 58:471–479, 2003.

180. E. Wenger. Communities of Practice: Learning, Meaning, and Identity. New York, NY: Cambridge University Press, 1998.

181. E. Wenger, R. McDermott, and W. Snyder. Cultivating Communities of Practice. Boston, MA: Harvard Business School Press, 2002.

182. A. Westfall, M. Jennex, S. Dickinson, and E. Frost. Event report: Golden Phoenix 2008. Journal of Information Systems for Crisis Response and Management, 1(2):73–80, 2009.

183. R. Woltjer, I. Lindgren, and K. Smith. A case study of information and communication technology in emergency management training. International Journal of Emergency Management, 3(4):332–347, 2006.

184. World Health Organization (WHO). Three months after the Indian Ocean earthquake-tsunami: Photo essay. Health consequences and WHO’s response. http://www.who.int/hac/crises/international/asia_tsunami/3months/en, Accessed November 2014.

185. T. Wright and G. Madey. Explorations in building a virtual emergency operations center using collaborative virtual environment. Proceedings of the 5th International ISCRAM Conference, May 2008.

186. W. Xia, I. Becerra-Fernandez, J. Rocha, and A. Gudi. Knowledge management task complexity in emergency management: An instrument. AMCIS 2010 Proceedings, 2010.

187. W. Xia, I. Becerra-Fernandez, A. Gudi, and J. Rocha-Mier. Emergency management task complexity and knowledge-sharing strategies. Cutter IT Journal, 24(1):20–25, 2011.

188. I. Zigurs. Leadership in virtual teams: Oxymoron or opportunity? Organizational Dynamics, 31(4):339–351, 2003.

189. M. Zook, M. Graham, T. Shelton, and S. Gorman. Volunteered geographic information and crowdsourcing disaster relief: A case study of the Haitian earthquake. World Medical & Health Policy, 2(2):7–33, 2010.

190. V. Zwass. Information Systems for Emergency Management. M.E. Sharpe,2010.

This document was prepared & typeset with pdfLaTeX, and formatted with the nddiss2ε classfile (v3.2013 [2013/04/16]) provided by Sameer Vijay and updated by Megan Patnott.
