
University of Calgary

PRISM: University of Calgary's Digital Repository

Graduate Studies The Vault: Electronic Theses and Dissertations

2012-10-03

Spies Wearing Purple Hats: The use of social

computing to improve information sharing inside the

Intelligence Community of the United States

Chomik, Andrew

Chomik, A. (2012). Spies Wearing Purple Hats: The use of social computing to improve

information sharing inside the Intelligence Community of the United States (Unpublished

master's thesis). University of Calgary, Calgary, AB. doi:10.11575/PRISM/27854

http://hdl.handle.net/11023/278

master thesis

University of Calgary graduate students retain copyright ownership and moral rights for their

thesis. You may use this material in any way that is permitted by the Copyright Act or through

licensing that has been assigned to the document. For uses that are not allowable under

copyright legislation or licensing, you are required to seek permission.

Downloaded from PRISM: https://prism.ucalgary.ca

UNIVERSITY OF CALGARY

Spies Wearing Purple Hats: The use of social computing to improve information sharing

inside the Intelligence Community of the United States

by

Andrew David Chomik

A THESIS

SUBMITTED TO THE FACULTY OF GRADUATE STUDIES

IN PARTIAL FULFILMENT OF THE REQUIREMENTS FOR THE

DEGREE OF MASTER OF STRATEGIC STUDIES

CENTRE FOR MILITARY AND STRATEGIC STUDIES

CALGARY, ALBERTA

SEPTEMBER 2012

© Andrew David Chomik 2012


Abstract

After the terrorist attacks against the United States on September 11, 2001, the 9/11

Commission Report identified that the “need to know” culture and the “stove-piping” of

agency-centric information in the United States intelligence community were critical

factors in contributing towards intelligence failures on that day. Since the report, new

methods of information sharing in the community have emerged, including the

implementation and use of social computing tools and Web 2.0 technology. However,

intelligence failures have continued to occur in recent years, and social computing tools

are not being used effectively enough to help mitigate these failures. For example, tools

are not required to be used during intelligence production cycles, making them

complementary to pre-existing processes. This thesis critically examines the community’s

internal use of social computing tools, using recent and online sources from within the

community and beyond.


Acknowledgements

To my parents – thank you for your continual support while I spent my twenties juggling

graduate school, travel excursions and a growing career. Your patience with me and

continued encouragement to follow my interests have been instrumental in completing this

thesis and following my dreams in life.

To Dr. Thomas Keenan – thank you for taking me under your supervision and providing

me the room to spread my wings with a thesis topic of my interest. Your support has been

invaluable in my graduate school experience.

To the wonderful staff and student body at CMSS – on those long nights of studying and

being knee-deep in research, I found extra strength trudging through it all knowing my

department was behind me all the way. I will always remember the great times we

shared, whether out on the paintball field, playing volleyball, or just having a beer and a

game of pool at the Grad Lounge to unwind. Let’s keep the good times going well

beyond graduate school and into the future.


Table of Contents

Abstract
Acknowledgements
Table of Contents
Abbreviations

CHAPTER 1: INTRODUCTION
1.1 Recent Efforts
1.2 Research Problem
1.3 Research Question
1.4 Purpose of Study
1.5 Methodology
1.6 Definitions
1.6.1 Social Computing
1.6.2 Web 2.0
1.6.3 Social Media
1.6.4 Open source
1.7 Limitations

CHAPTER 2: THE UNITED STATES INTELLIGENCE COMMUNITY’S SOCIAL COMPUTING TOOLS
2.1 Security and Intelligence Strategies
2.2 Social Computing Strategy in the United States Government
2.2.1 The intelligence community’s computer network
2.2.2 Intelink
2.2.3 A-Space (Analytic Space)
2.2.4 Intellipedia
2.2.5 Inteldocs
2.2.6 iNews
2.2.7 Blogs
2.2.8 Microblogging
2.2.9 Social bookmarking
2.2.10 Conferencing
2.2.11 Collaborative Workspaces
2.2.12 Gallery
2.2.13 iVideo
2.2.14 Other
2.3 Similar programs in other countries

CHAPTER 3: BUILDING A MODEL OF SUCCESSFUL SOCIAL COMPUTING
3.1 Successful Web 2.0 and Enterprise 2.0 strategies
3.2 The DIA Knowledge Laboratory Pilot Project studies
3.2.1 Our Experience with Intellipedia: An Ethnographic Study at the Defense Intelligence Agency (2008)
3.2.2 How A-Space is Shaping Analysts’ Work (2009)
3.3 Additional Observations and Feedback
3.4 Effective use of social computing software
3.4.1 Source #1: McKinsey and Company
3.4.2 Source #2: Gartner
3.4.3 Source #3: Andrew McAfee
3.4.4 Source #4: The Markle Foundation
3.5 A comprehensive model for social computing and information sharing

CHAPTER 4: ANALYSIS
4.1 Discoverable Information
4.2 Crowdsourcing from the bottom-up
4.3 Information Sharing in Collection and Analysis workflows
4.4 Champions
4.5 Performance and Incentives
4.6 Summary

CHAPTER 5: RECOMMENDATIONS
5.1 Recommendation #1: Improve facets of analytic tradecraft to improve the culture of information sharing as a basis for using social computing tools.
5.2 Recommendation #2: Social computing tools must be mandated into intelligence production workflows.
5.3 Recommendation #3: Intelligence products must become topical and dynamic rather than specific and static.
5.4 Recommendation #4: Social computing tools must be designed within an information architecture model that improves ease-of-use and access but maximizes security.

CHAPTER 6: CONCLUSION

BIBLIOGRAPHY

APPENDIX A: THE U.S. INTELLIGENCE CYCLE
APPENDIX B: STRUCTURE OF THE USIC
APPENDIX C: GOOGLE LIVING STORIES
APPENDIX D: USIC SOCIAL COMPUTING TOOLS


Abbreviations

AT Analytic Transformation (Program)

CIA Central Intelligence Agency

CIO Chief Information Officer

DAR Defense Analysis Report

DARPA Defense Advanced Research Projects Agency

DIA Defense Intelligence Agency

DNI Director of National Intelligence

EO Enterprise Objective

FBI Federal Bureau of Investigation

GIG Global Information Grid

IARPA Intelligence Advanced Research Projects Activity

IC Intelligence Community

IC-CIO Intelligence Community Chief Information Officer

ICD Intelligence Community Directive

IED Improvised Explosive Device

IRTPA Intelligence Reform and Terrorism Prevention Act

ISE Information Sharing Environment

IT Information Technology

JWICS Joint Worldwide Intelligence Communications System

LNI Library of National Intelligence

MIT Massachusetts Institute of Technology

MTF Markle Task Force

NGA National Geospatial-Intelligence Agency

NIE National Intelligence Estimate

NIPRNet Unclassified but Sensitive Internet Protocol Router Network

NIS National Intelligence Strategy

NISS National Information Sharing Strategy

NSA National Security Agency

NSS National Security Strategy

ODNI Office of the Director of National Intelligence

PM-ISE Program Manager – Information Sharing Environment

RSS Really Simple Syndication (protocol)

SIPRNet Secret Internet Protocol Router Network

USIC United States Intelligence Community

VoIP Voice over Internet Protocol

VPP Virtual Presence Post

XML Extensible Markup Language


CHAPTER 1: INTRODUCTION

On Christmas Day, December 25, 2009, Umar Farouk Abdulmutallab, a Nigerian-born extremist aligned with al-Qaeda, attempted to detonate plastic explosives aboard Northwest Airlines Flight 253 flying from Amsterdam to Detroit. Having avoided detection by airport security and boarded the flight supposedly without a passport (Daily News Reporter 2011), Abdulmutallab had sewn explosive material into his underpants (the same material used by Richard Reid in the attempted detonation aboard American Airlines Flight 63 in 2001) and attempted to detonate it

during the plane’s final approach into Detroit. Fortunately, Abdulmutallab was unable to

successfully detonate the device and, thanks to the brave actions of the flight’s passengers and crew, was subdued onboard and arrested at Detroit Metropolitan Wayne County Airport

by federal agents.

Within the details of the case, it emerged that Abdulmutallab had been reported by his father to Central Intelligence Agency (CIA) officers at the U.S. embassy in Abuja, Nigeria the month prior, and was already on the radar of the United States’ intelligence

community (USIC) due to his connections with al-Qaeda in Yemen. Although his name

was added to the Terrorist Identities Datamart Environment, a central database owned by

the National Counter Terrorism Center (NCTC) and used by all the major U.S.

intelligence agencies, it was not added to the Federal Bureau of Investigation’s (FBI)

Terrorist Screening Database, which supplied names for the official U.S. No-Fly list (The

Lessons and Implications of the Christmas Day Attack: Watchlisting and Pre-Screening

2010). Additionally, Abdulmutallab had been able to secure a multiple-entry visa, which was neither revoked nor flagged to the NCTC even though U.S. intelligence agencies were already

investigating him (Chapman 2010). Due to this, Abdulmutallab was able to evade U.S. intelligence detection and airport security protocols, which nearly resulted in a devastating terrorist attack on U.S. soil. In January 2012, Abdulmutallab was sentenced to four consecutive life sentences for attempting to destroy, and placing a destructive device on, a U.S. civil aircraft. Abdulmutallab became popularly known as the “Underwear Bomber”

and the case exposed shortcomings in information sharing efforts among agencies within

the USIC.

A similar intelligence failure resulted in a more violent incident on U.S. soil only one

month earlier. Intelligence failures were prevalent in the tragic acts of Major Nidal Hasan, a U.S. Army psychiatrist who killed 13 people, 12 of them fellow soldiers, at Fort Hood, Texas, in November 2009. Hasan had been under investigation by the FBI for months prior to the attack, having become radicalized by Islamist fundamentalist beliefs. Hasan also had regular

communication with Anwar al-Awlaki, a prominent American-born imam, terrorist

recruiter and al-Qaeda member in Yemen. Cooperation among U.S. intelligence agencies,

including the U.S. Army, was absent in raising alerts about Hasan’s activities; although they had “sufficient information to have detected Hasan’s radicalization to violent Islamist extremism,” they had “failed to act on it,” which exposed numerous

systemic failures in intelligence cooperation between the FBI and the Department of

Defense (DoD) and in intelligence analysis and analytic tradecraft (Lieberman and

Collins 2011). Analytic tradecraft refers to the methodologies by which intelligence analysis is conducted, and has been an important priority for the Office of the Director of National Intelligence (ODNI) since 2004.[1]

These two incidents exposed a number of agency coordination and cooperation

challenges that exist in the USIC today - challenges that the community has attempted to address since the events of September 11, 2001. Yet, for all the legislation and strategy

development promoting intelligence in a new security environment, the community and

various U.S. administrations appear to suffer from the same intelligence challenges that

led to the disastrous and tragic events in September of 2001, in which al-Qaeda

operatives used four American passenger planes to commit deadly terrorist attacks in

New York, Washington and Pennsylvania.

The report of the National Commission on Terrorist Attacks Upon the United States (commonly referred to as the “9/11 Commission Report”) brought to light various intelligence failures and identified challenges the community faced in preventing future failures. Among these was the inability of agencies to share information consistently and effectively, a primary mandate of intelligence reform since its identification in the Report. Agencies not sharing data, “stove-piping” valuable information from each other and failing to abandon long-standing and engrained habits within agency-centric intelligence bureaucracies had created a culture of disconnected organizations failing to work together - in other words, failing to “connect the dots” with the intelligence they had

(National Commission on Terrorist Attacks upon the United States 2004). To compound

[1] For a greater discussion on analytic tradecraft reform, see Gabbard, C. Bryan, and Gregory F. Treverton. 2008. Assessing the Tradecraft of Intelligence Analysis. Vol. TR-293. Santa Monica, CA: RAND Corporation.

information sharing shortcomings, flawed intelligence was provided to American policymakers to rationalize the 2003 invasion of Iraq (Laipson 2005, 21), though it was also highly debated whether the Bush administration had politicized intelligence to rationalize pre-determined policy choices in the Middle East (Bruno and Otterman 2008). A lack of information sharing, substandard management and a lack of consensus among intelligence agencies were also criticisms leveled at the National Intelligence Estimate (NIE) on

Iraqi weapons of mass destruction (Bruno and Otterman 2008).

1.1 Recent Efforts

In 2004, Calvin Andrus, a chief technology officer with the CIA, wrote a paper entitled

The Wiki and the Blog: Toward a Complex Adaptive Intelligence Community. The paper

detailed how the intelligence community could adapt social software such as blogs and

wikis into the everyday work environments of intelligence community staff. The content

that would be created in these media would be directly generated by front-line users,

who could add, edit, refine and comment on the content even if it wasn’t their own. The

expansion of these tools as practical devices for intelligence and information sharing

would be fostered by the self-governing nature of the users and the content, rather than

simply being a conventional database of reports with security restrictions. Andrus

proposed that this was how the USIC could dynamically respond to the growing and changing nature of threats faced in the post-9/11 world: letting information sharing happen organically, with the proper tools at the disposal of intelligence staff and stakeholders (Andrus 2005).

Andrus’ paper served as a catalyst for realizing the gains to be had through social computing software – software that enables users to connect, collaborate on data, and share information among agencies. Today, the USIC uses several major social computing tools that have been implemented to enhance information sharing among USIC members and staff. Two prominent tools in use today are Intellipedia, a wiki database of

information that is owned by the ODNI and contributed to by all registered USIC users

on various types of intelligence, and A-Space (Analyst’s Space), a social networking tool

that top-level security cleared analysts connect to and use as a collaboration environment

for information sharing. The USIC also employs other social computing software for

information sharing [2] such as blogs, a microblogging service, multimedia sharing tools

and enterprise software platforms (e.g. Microsoft SharePoint) in an effort to provide user-

friendly, collaborative services.

These software efforts were implemented in conjunction with a variety of government

initiatives that would significantly define national intelligence for the United States. The

Intelligence Reform and Terrorism Prevention Act (IRTPA), passed in 2004, created the

ODNI and established the new role of Director of National Intelligence (DNI) to oversee the 16 U.S. intelligence agencies and to coordinate their strategy and

objectives in the post-9/11 threat environment. The National Intelligence Strategy (NIS)

(2005; updated in 2009), with its legal basis derived from the IRTPA, identified that

breaking down barriers to communication and promoting interagency collaboration were

central to producing valuable intelligence and identifying threats proactively from a community (rather than agency-centric) approach (ODNI 2009c, 7-8). Additionally, the

USIC’s Information Sharing Strategy (2008) identified that the “need to know” culture

[2] These services are defined in more detail in Chapter 3.

during the Cold War was ill-suited to modern intelligence challenges; rather, the copious amounts of data that exist in various agency repositories are better shared among other agencies under a “responsibility to provide” model (ODNI 2007d). Behavior, not just technology, needed to change, particularly so that information could become “accessible, available and discoverable at the earliest point possible” (ODNI 2008d, 3).

To implement this in the USIC, many ODNI policies have used the “responsibility to

provide” mantra in their language and guidelines. One example of this is in Intelligence

Community Policy Memorandum Number 2007-200-2, which specifically outlines that in

order to provide customers with the highest quality of intelligence, information needs to be shared in more accessible, higher quality forms even at the lowest clearance levels, including through the use of metadata tagging (information about information) and other analytic techniques (ODNI 2007d). The responsibility to provide information lies with agencies during the intelligence production phases and stems from consumers requiring high-quality and reliable intelligence products.
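As an illustration of metadata tagging, the following minimal Python sketch shows how a small, shareable metadata record might describe an intelligence product so that it can be indexed and discovered by other agencies. Every field name and value here is a hypothetical example invented for this sketch; none is drawn from an actual USIC schema or policy document.

import json

# Hedged, illustrative sketch only: the product itself stays in its agency repository,
# while this small metadata record describes it so it can be indexed and discovered.
# All field names and values are hypothetical examples, not a real USIC schema.
report_metadata = {
    "title": "Example threat assessment",
    "originating_agency": "Hypothetical Agency X",
    "classification": "UNCLASSIFIED//EXAMPLE",
    "keywords": ["aviation security", "watchlisting"],
    "date_produced": "2009-12-20",
}

print(json.dumps(report_metadata, indent=2))  # serialized for a catalogue or search index

Under such a model, even users who cannot access the underlying product can at least discover that it exists and request access through the appropriate channels, which is the sense in which information becomes "discoverable at the earliest point possible."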

1.2 Research Problem

While efforts have been made to implement new strategies of information sharing

through new computerized techniques and a culture shift towards more openness and

collaboration, shortcomings exist in the operational and tactical execution of these plans, creating debate as to whether these strategies are actually effective. Social software tools, and the policies that govern them, have come under scrutiny since they were first implemented in 2006.[3] Social computing tools in the USIC are not being used as effectively as they were intended and designed to be; they are contributing to (and suffering from) challenges faced in information sharing among intelligence agencies in the United States.

First, social computing tools and the adoption of new collaboration methods are not independent of intelligence failures. The intelligence cycle [4] and the tools used to propagate intelligence products through it are all part of a system that ultimately shapes the quality of intelligence that eventually travels into the hands of consumers, who are often policymakers in the highest positions of the U.S. Congress, the military and the White House administration. Therefore, the ways in which the tools are used must be examined within the context of how effective the intelligence cycle is, and in particular how rigorously and thoroughly information is analyzed in the analysis phase of the cycle. Intelligence failures, such as that of the “Underwear Bomber” or the

Fort Hood massacre, are evidence that challenges remain within the intelligence cycle,

and social computing tools are part of the techniques used within the cycle to mitigate

these threats.

Former head of the U.S. government’s Information Sharing Environment (ISE), Thomas

E. McNamara, suggested that information management is not standardized in the USIC,

and that systems are neither equipped to identify similar intelligence in multiple agency

databases nor able to “connect the dots” to flag this information as legitimate sources of

[3] Intellipedia, the first widely-used social computing tool in the USIC, was established as a pilot project in 2005, but was formally adopted in 2006. See https://www.cia.gov/news-information/featured-story-archive/2008-featured-story-archive/intellipedia-marks-second-anniversary.html.

[4] See Appendix A for an overview of the intelligence cycle in more detail.

threats (McManus 2010). McNamara continued by saying that a lack of sophistication in

the tools being used to consolidate agency-centric information for the benefit of

intelligence production was due to weak oversight and resource allocation in improving

information sharing (McManus 2010). In an internal report, the ODNI’s Inspector

General Edward Maguire expressed similar sentiments, suggesting that the culture of protecting “turf” remains a problem, there are few consequences for failing to collaborate within USIC elements, the community suffers from outdated Information

Technology (IT) systems, and a general lack of overall strategy or leadership structure to

drive collaboration among IC agencies is pervasive (ODNI 2008b, 6-7).

Second, the tools themselves, from an operational perspective, are under-performing.

Chris Rasmussen, an award-winning social software knowledge manager at the National

Geospatial-Intelligence Agency (NGA) who helped pioneer the social computing movement in the USIC and is considered by many to be one of the leading voices for

innovation in the U.S. government (Federal Computer Week 2008), has argued that these

tools have “reached their limits” (Jackson 2009a). The tools sit outside the conventional,

bureaucratic workflows of producing intelligence, and the material is seen as “informal”

and “good for collaboration but not for the official product” (Rasmussen 2010). Users

continue to fall back on multiple, unconnected legacy databases and agency-centric

systems, and are still fearful of using social computing tools extensively because of

engrained habits of internal information hoarding and an unwillingness to share due to

the sensitivity of the material (Jackson 2009a). Rasmussen suggests that these work

behaviours create unnecessary duplication in material: USIC personnel are recording

“informal” information in places such as Intellipedia and A-Space, but recording the


same material elsewhere as official, agency-approved records (Rasmussen 2010).

Additionally, burgeoning agency budgets and fragmented resource allocation planning

between agencies are also leading to practices where the USIC is creating unnecessary IT

infrastructure (Rasmussen 2010), lending credence to the concerns about weak oversight and strategy management identified by McNamara and Maguire.

Third, Rasmussen also asserts that the most effective innovations in organizational

technology are those that are considered disruptive rather than incremental (Rasmussen

2010). Disruptive technologies (or “innovations,” according to Clayton Christensen in The Innovator’s Dilemma (1997)) are unexpected introductions of an idea, product or service that create an entirely new market by gaining a competitive advantage over existing innovations (Christensen 1997, xv). Incremental (or “sustaining”) innovations are those that improve competitive advantage over time, but do not replace existing markets (xv). In The Innovator’s Solution (2003), Christensen broadens the scope to include the business

model, not just the technology, as a primary vehicle for disruptive or sustaining change

(Christensen and Raynor 2003).

If social computing tools were designed to transform the sharing of intelligence within

the USIC as a remedy to the pre-9/11 information sharing woes, they are

underperforming as disruptive innovations meant to radically change and improve

business functions. Successfully disruptive innovations have the ability to create a major

transformation in an organization such that its use hits a point of critical mass, and the

organization would be negatively impacted more by a reversion to the old method of

technology from before (Kostoff, Boylan, and Simons 2004, 8-9). For the USIC and

social computing tools, this successful disruption has not been achieved. As Rasmussen


has stated, legacy data systems continue to be used by agencies that continue to stove-

pipe information from other agencies, and prefer to own their own content rather than

share it. Therefore, social computing tools currently play a sustaining innovation role,

rather than a disruptive one: they simply serve to enhance pre-existing technology,

processes and procedures (Witzel 2003), which is one of the primary reasons this research problem identifies them as under-performing and under-utilized.

1.3 Research Question

This thesis addresses the use of social computing tools in the USIC and how such

technology can be effectively used within a national intelligence and security

environment. How are the social computing tools being used, and to what extent are they

effective in fulfilling the mandates of information sharing as identified by the USIC after

the events of September 11, 2001? If shortcomings exist, what are these shortcomings

and how are they articulated through the available sources and against this thesis’

methodology for measuring successful social computing strategy in the United States

national security environment? Finally, what recommendations can be made to improve

the use of social computing in the USIC in order to better meet the goals and objectives

of the community’s need for better information sharing and intelligence?

1.4 Purpose of Study

Social computing tools have been implemented into the USIC to achieve better

information sharing within and among agencies under the ODNI. These tools were

implemented as one of a number of solutions to improve information sharing, where

agencies have historically “stove-piped” information from one another as identified in the


9/11 Commission Report and in other strategy-related documents that have identified a

lack of information sharing and agency cooperation as having led to intelligence failures

and challenges in the past. It is currently also unclear if the “responsibility to provide”

mantra that the ODNI has underlined in its strategies has been truly adopted into contemporary intelligence analysis and production, especially as it pertains to improving the quality of intelligence as it moves through the conventional intelligence cycle.[5]

The purpose of this study was to examine how intelligence analysts, employees and

community stakeholders use the social computing tools available for use in the United

States intelligence community to share information as intended under the direction of the

ODNI and within the National Intelligence Strategy and the Information Sharing strategy.

This includes identifying recommendations for how the community can improve the use

of these tools to better attain their information sharing goals as outlined in the NIS and

the Information Sharing Strategy.

1.5 Methodology

In order to identify the effectiveness of these social computing tools as they pertain to

better information sharing among USIC agencies, this research paper employed a

methodology of building an effective social computing model and comparing the tools of

the community against it. The model is devised from four different sources of information sharing requirements, and is used in this paper (Chapter 4) to analyze the use

of social computing tools in the USIC since their general inception in the middle of the

first decade of the 21st century. This thesis identified the four sources based on their

[5] For an explanation of the intelligence cycle and its phases, see Appendix A.

relevance to the effective use of social computing: internal social computing in organizations, social computing in public-sector organizations, “Enterprise 2.0” deployments, and recommendations made directly to the USIC pertaining to known shortcomings in intelligence gathering and sharing. The social computing

tools in the USIC are then compared against this model in Chapter 4, which identifies

strengths and weaknesses of their use. Chapter 5 outlines recommendations and identifies areas of improvement for the USIC to undertake.

The model makes particular use of two studies: Our Experience with Intellipedia: An

Ethnographic Study at the Defense Intelligence Agency (2008) by Nancy Dixon & Laura

McNamara (Dixon and McNamara 2008) and How A-Space is Shaping Analysts’ Work

(2009) by Nancy Dixon (Dixon 2009). These two studies are both ethnographic studies

conducted with participants in the Defense Intelligence Agency (DIA), and provide the

basis for the main sample used in the analysis. The research paper also uses open sources

from academic journals, newspaper websites, magazine websites, blogs and other online

sources to generate additional feedback from users and commentators of the community’s

suite of social computing tools. The goal of these sources was to determine the level of

understanding and engagement with these tools as they are being used in contemporary

intelligence production.

As the thesis will show, there is a certain level of understanding and effectiveness among

users, but questions and uncertainties are present, including concerns relating to

discoverability of information, integration into business workflows, managerial support,

and performance measures and incentives related to social computing use.


1.6 Definitions

In discussing and analyzing social computing tools and their existence within the United

States’ security environment, certain terms must be defined in order to understand topics

examined throughout this thesis.

1.6.1 Social Computing

Social computing is a term used often throughout this thesis. It is a broad term that can be understood as people engaging in social behavior using computational systems (in simpler terms, the intersection of social activities and

technology to create value). Forrester Research, a technology solutions firm, identified

the term as “a social structure in which technology puts power in the hands of

communities, not institutions” (Fenwick 2010, 2). This definition refers to the growing

power of web users to build valuable information rather than formal information

gatekeepers of past network systems. From another source, business technology website

ZDNet suggested that the basic tenets of social computing were that innovation is moving

from a top-down to bottom-up model, value is shifting from ownership (of information)

to experiences, and that power is shifting from institutions to communities (Hinchcliffe

2006).

One associated term is “crowdsourcing,” a concept articulated by James Surowiecki,

author of The Wisdom of Crowds (2004). According to Surowiecki, the activity of

crowdsourcing involves building knowledge by the collective efforts of many rather than

the expertise of a few. Surowiecki suggests that the lack of crowdsourcing and

aggregation of collective data within the USIC were prime factors that contributed to the attacks of September 11, 2001. Surowiecki surmises that crowdsourcing, whether done


through social computing or other social methods of information sharing, may have

contributed positively to national security efforts had it been employed effectively and

within a sophisticated social information sharing strategy (Surowiecki 2004, 68-75).

1.6.2 Web 2.0

Web 2.0 is a subset of social computing and an underlying technology of the social computing tools discussed in this thesis. Web 2.0 is a term used to describe

the next “stage” of the Internet and its evolution: whereas the Internet once existed as a

simple one-way communication and information retrieval tool (managed by information

gatekeepers), it has since expanded to include a set of web technologies used as a

collaborative medium to create, build and exchange valuable information and data

(O'Reilly 2005). The term, in its broadest definition, was coined by Tim O’Reilly in 2005 to describe the “web as a platform”: as opposed to supporting simple information retrieval, the web could now be used as a platform for users to collaboratively create content, engage in two-way conversations and activities, and create value through these

activities (O'Reilly 2005). The technology is based on web-based software (as opposed to

stand-alone desktop applications), client/server software and content syndication (Web

2.0 2009).
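To make “content syndication” concrete, the brief Python sketch below builds a minimal RSS 2.0 feed, the kind of machine-readable document a feed reader subscribes to in order to be notified of new content. The feed title, URL and item below are placeholders invented for the example, not real USIC resources.

import xml.etree.ElementTree as ET

# Illustrative sketch of content syndication: a minimal RSS 2.0 feed that a reader
# application could poll for new items. All names and URLs below are placeholders.
rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "Example analyst blog"
ET.SubElement(channel, "link").text = "https://blog.example.org"
ET.SubElement(channel, "description").text = "Updates syndicated to subscribers"

item = ET.SubElement(channel, "item")
ET.SubElement(item, "title").text = "New draft assessment posted for comment"
ET.SubElement(item, "pubDate").text = "Mon, 01 Oct 2012 09:00:00 GMT"

print(ET.tostring(rss, encoding="unicode"))  # the XML document a feed reader would fetch

A subscriber’s feed reader polls this document periodically and alerts the user when a new item appears; the same basic mechanism underlies the “signal” feature in the list that follows.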

Andrew McAfee of the Massachusetts Institute of Technology (MIT) identified that Web

2.0 technology includes the following six features (McAfee 2006, 21); all of these features of Web 2.0 technology also comprise “Enterprise 2.0”, which is defined and used in the effective social computing model devised in this thesis (an illustrative sketch of these features follows the list):

1) the ability to search for and discover information;
2) linking content together through collaboration (hyperlinking content);
3) the ability of users to author content and publish across multiple platforms;
4) using metadata [6] to organize collaborative content and tag information;
5) identifying like-minded content (content by extension or similar qualities); and,
6) the ability to “signal” other users to new or changing content.
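The short Python sketch below is a toy illustration of how these six capabilities might map onto a single wiki-style article object. It is an invented example for explanatory purposes only; the class and function names are hypothetical and do not describe Intellipedia or any other actual USIC system.

from dataclasses import dataclass, field
from typing import List

@dataclass
class WikiArticle:
    title: str
    body: str
    authors: List[str] = field(default_factory=list)   # authorship (feature 3)
    links: List[str] = field(default_factory=list)      # hyperlinks to related content (features 2 and 5)
    tags: List[str] = field(default_factory=list)       # metadata tags (feature 4)
    watchers: List[str] = field(default_factory=list)   # users who asked to be signalled (feature 6)

def search(articles: List[WikiArticle], term: str) -> List[WikiArticle]:
    """Search and discovery across all articles (feature 1)."""
    term = term.lower()
    return [a for a in articles
            if term in a.title.lower()
            or term in a.body.lower()
            or any(term in t.lower() for t in a.tags)]

def notify_watchers(article: WikiArticle) -> List[str]:
    """Signal subscribed users that an article has changed (feature 6)."""
    return [f"'{article.title}' was updated - notice sent to {user}" for user in article.watchers]

In a deployed tool the same ideas would be backed by a database, a search index and a notification service; the point of the sketch is only that the six features describe simple, composable behaviours rather than exotic technology.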

Web 2.0 technology has since become synonymous with mainstream websites such as

Twitter (a microblogging site), Facebook (social networking), Wikipedia (a wiki site) and

YouTube and Flickr (multimedia sharing sites), which use the technology extensively to

provide value-building web-based services to both users and website owners. All of these

sites and many more are integrating Web 2.0 technology into their front-end websites to

provide a more interactive experience for both commercial and recreational users.

1.6.3 Social Media

Social media is also a mainstream phrase that is related to social computing and Web 2.0

technology. Kaplan and Haenlein define it as “a group of Internet-based applications that

build on the ideological and technological foundations of Web 2.0, and that allow the

creation and exchange of user generated content” (Kaplan and Haenlein 2010, 61). They

distinguish social media from Web 2.0 as being that which is produced by people (that is, the content generated by users), not simply the technology available for collaboration and sharing. In this sense, social media can include content produced on social networking sites (e.g. Facebook, LinkedIn), multimedia sharing sites (e.g. YouTube and Flickr), social bookmarking sites (e.g. del.icio.us) and crowdsourced wiki sites (e.g.

[6] Metadata is information about content, such as authorship, copyright information, keywords and other relevant categorical elements, that can be used to organize and store data. See “metadata.” 2006. In High Definition: A-Z Guide to Personal Technology. Houghton Mifflin. http://www.credoreference.com.ezproxy.lib.ucalgary.ca/entry/hmhighdef/metadata.

Wikipedia). Such sites have displayed high rates of usage on the Internet – social

networking alone accounts for 22% of all time spent online in the U.S. (Social Networks/Blogs Now Account for One in Every Four and a Half Minutes Online 2010), one in every 13 people on Earth (845 million users) has a Facebook account (Hepburn

2011), and Twitter produces over 250 million “tweets” per day (Nakano 2011b).

In recent years, social media has played a large role in national and global events with

national security implications. During the 2008 terrorist attacks in Mumbai, social media

sites were used to inform the public of the events on the ground, including providing

information on locations of people, resources and even multimedia to capture the attacks

inside the Taj Mahal Hotel. The attackers were also suspected of using Google Maps and

Voice over Internet Protocol (VoIP) to coordinate their activities (Singh 2009). Social

media sites were also used by Iranian protestors during the 2009 Green Revolution in Iran

to organize massive anti-government demonstrations and capture police brutality and

violence on the streets of Tehran. Social media was used again in a similar manner during

the 2011 Arab Spring revolutions in Egypt, Libya and Tunisia. Governments in Iran and

across North Africa attempted to censor social media sites, which were being used to foment political activism during these key political events. In the Western world, social media also played roles

in major political events, including organizing protesters and broadcasting live, real-time coverage during the 2008 Republican National Convention (RNC) in St. Paul, Minnesota

(Havenstein 2008), the protests against the G20 Summit in Toronto in 2010 (Zerbisias

2010), and in the United Kingdom during the London riots in 2011 (Tonkin, Pfeiffer, and

Tourte 2012, 51-52).


The concept of social media is not the focus of, nor often mentioned in, this research paper. This paper treats social media as the usage of social computing and Web 2.0 technology in a public, peer-to-peer setting. It is mainly used by organizations toward their external and public audiences, where collaborative efforts produce valuable

public information rather than sensitive internal information (as is the focus of

intelligence agencies). Instead, this thesis focuses on using social computing and Web 2.0

to describe the social efforts of users within the United States intelligence community, as

they pertain more directly to the engagement of social behavior using computer and web

technology for internal organizational purposes.

1.6.4 Open source

Open source is the philosophy of promoting openly available products and information, and is closely associated with using the power of the crowd (crowdsourcing) to provide open source content (including content that is free as well as content incorporated into licensed proprietary software).

This can include open source software, where programming code is available to be

augmented by users who can freely change, upgrade or alter the original code or other

people’s augmented open source code. Many Web 2.0 technologies and software

applications are built with open source software in mind – operating systems such as

GNU/Linux and Google Android OS are two instances where open source code is

available to be augmented for both business and recreational purposes.

From a non-software perspective, this pertains to the availability of user-generated content

or the free licensing of information – content created on websites (Web 2.0 or not) can be

considered open source if it is intended to be used publicly and for the good of the public

domain. Blogs, online message boards, shared documents built collaboratively (e.g.


Google Documents) and multimedia available for re-use (e.g. content on YouTube and

Flickr) are examples of open source content. This thesis is written with the understanding

that open source content is a foundational element of effective social computing, and as

such, uses open source content in conducting research for social computing efforts within

the U.S. intelligence community where possible.

1.7 Limitations

While this thesis uses open source content for much of the research conducted (e.g.

newspapers, software industry websites, blogs, academic journals and other online

content), it must be understood that this is due to the relative lack of content regarding the

topic of social computing tools used within the USIC. Social computing tools have only

become prominent ideas and applications within organizations in recent years, and

decidedly more of a focus for the USIC since the publication of Andrus’ theories on their effective use in

the intelligence community. However, such in-depth material on social computing tools

and analytics to support their usage is not widely available due to the sensitive nature of the community that this research paper focuses on. Therefore, this paper was researched with the most up-to-date and relevant sources that could be found, both online and in academic material, including public documents directly from the USIC.

Second, the studies completed by Dixon and McNamara, which this thesis uses as significant research components in measuring the success of social computing tools, are

careful to identify that they are not “findings” but rather “’observations’ derived from

themes” that reoccur in the interviews conducted (Dixon and McNamara 2008, 4) and

intended to be a “snapshot” of life using these social tools (Dixon 2009, 5). These observations may or may not be externally valid for the larger USIC, given the different working environments, cultures and work expectations of the various agencies within the community. However, these studies serve as relevant

microcosms of the community as they pertain to larger information sharing challenges

from a whole-of-community approach.

Additionally, progress reports on the implementation of the goals and objectives from the

NIS, the Information Sharing Strategy and other USIC documents are difficult to acquire.

While the documents outlining the high-level vision and strategy can be found publicly

(even directly from major USIC websites such as www.intelligence.gov and

www.cia.gov), few primary sources could be found to track analytics and performance

measurement, nor could intelligence community staff be interviewed without proper

security clearance and permission from the ODNI and its agencies. Much of the research

within this paper is derived from commentary and knowledge from various analysts,

professionals, managers and academics within and related to the USIC and the general

intelligence field.

This issue is compounded by the fact that these strategies have been devised relatively

recently, and have not been given ample time to be assessed for whether they have been truly successful on a long-term basis. The demands of the United States intelligence community adjust and expand rapidly because of the community’s reactive posture toward security threats; therefore, it is understood that their strategies are not likely to have direct, measurable analytics, since strategy often changes with changes in leadership, policy objectives from Congress and White House administrations, and budgetary and resource concerns.


Finally, this paper recognizes that the USIC is a large and complex bureaucracy, and that

many challenges are present beyond social computing and cultural factors discussed in

this thesis. The reluctance of agencies to expose sensitive tactical, operational and

strategic information is likely to continue providing large-scale organizational challenges,

regardless of any proposed technological solutions. This persistent environment of

secrecy and distrust, shrouded in long-standing intra-agency animosities means that

recommendations offered in this thesis may be hypothetic at best. Primary research may

provide opportunity to refine the analysis and recommendations in this thesis for more

complex challenges, although such methodology is likely to encounter the same cultural

and organizational barriers in siloed information sharing among USIC agencies as

identified in this thesis.


CHAPTER 2: THE UNITED STATES INTELLIGENCE COMMUNITY’S

SOCIAL COMPUTING TOOLS

In the decade after the fall of the Soviet Union and the end of the Cold War, the United States

was at a crossroads. The bipolar world in which the nation existed had diminished, and

intelligence, much like the rest of the federal government, was left to restructure to

accommodate this “new world order” (Turner 1991, 150-151). The 1990-1991 Persian

Gulf War solidified the United States’ new role as a leader in the international

community, ousting Iraqi forces from Kuwait through a United Nations-backed coalition

of allies. While Operation Desert Storm was considered a success, the USIC identified

challenges to improve on, while also redefining its role in a post-Cold War

environment. The failure to predict the invasion and the lack of knowledge about Iraq’s

weapons of mass destruction program led to serious questions about the effectiveness of

collected intelligence, and whether the right kind of intelligence was being collected (e.g.

human intelligence over other types). Additionally, the Gulf War displayed

challenges in the collection, processing and analysis phases of the intelligence cycle:

there was difficulty in transmitting tactical intelligence to the front lines, such as

detecting missile launches from Iraq or determining enemy ground movement before and

after battlefield operations (Thompson 2006; Weinrod 1996).

Solutions to these challenges, however, required a deep review of intelligence analysis

and collection, and yet the CIA and other U.S. intelligence agencies were scaled back,

both in terms of budget and workforce - the IC workforce was cut by as much as 25 per cent (Weinrod 1996, 8). These matters were complicated by the lack of forward-

thinking strategy regarding improvements to all aspects of the intelligence cycle. Compounding the problem, the rise of non-state actors and the threat of terrorism created greater

difficulty in identifying and defining the new type of ‘enemy’ faced by the Western

world. The engagement of the U.S. military in Somalia in 1993 and the bombing of the

World Trade Center in the same year showcased problems in agency knowledge of

intelligence that pertained to both domestic and foreign affairs.

Responding to these difficulties and challenges in terms of intelligence analysis was also

hindered by the aging mandates and policies built from the foundations of the National

Security Act of 1947, which allowed only for vertical command and decision-making

rather than any horizontal cooperation among agencies. Such policies also continued to

encourage agency “stove-piping” of information, and a “need to know” culture that did

not permit active intelligence sharing (McConnell 2007). These issues were pervasive

throughout the 1990s, a transitional period for the USIC defined by a lack of intelligence strategy, structural cutbacks, and a reluctance to adopt new intelligence and information sharing methods or to define changing threats to American security. It may have been

no surprise, then, that the terrorist attacks of September 11, 2001 shook the very

foundations of the intelligence community and the United States as a nation.

The attacks heralded a new era of intelligence for the United States. The 9/11

Commission Report determined that a lack of intelligence sharing was one of the largest

failures that led to the attacks (National Commission on Terrorist Attacks upon the

United States 2004), and that a culture of “need to know” intelligence and practices that

followed this precedent were part of the larger failure in intelligence analysis. The

intelligence community was plagued with a series of challenges, which the 9/11 Commission Report identified as six problems in the IC that were apparent before and after 9/11 (National Commission on Terrorist Attacks upon the United States 2004, 408-410):

1) structural barriers to performing joint intelligence work;
2) a lack of common standards across the foreign-domestic divide in intelligence gathering;
3) divided management of national intelligence capabilities;
4) a weak capacity to set priorities and move resources between agencies;
5) too many jobs expected to be completed by agencies (e.g. the CIA’s role); and,
6) an intelligence community that was too complex and secretive.

From these points, intelligence reform was recommended in a variety of areas. The 9/11

Commission Report recommended that the IC needed a new command structure, and that

a National Intelligence Director oversee national intelligence programs and agencies.

Additionally, the Report recommended that the CIA needed to rebuild its analytic capabilities, and improve its human and signals intelligence collection methods at the

tactical and operational levels. Recommendations were also put forth to operate in a more

joint-collaborative environment between the CIA and the military, including training,

direction and execution of special and/or clandestine operations (National Commission

on Terrorist Attacks upon the United States 2004, 411-416).

The Report continued to make recommendations regarding improving information

sharing. It determined that analysis and ‘making sense’ of the data they already had was

problematic, and the “need to know” culture was pervasive, which resulted in “security


requirements (that) nurture overclassification and excessive compartmentation of

information among agencies” (417). The report recommended that, as a solution,

information be shared horizontally across agencies rather than through the conventional

vertical model; databases should be open and searchable across agency lines (418). The

Report also identifies a 2003 white paper from the Markle Foundation which provides

recommendations on the creation of a ‘trusted information sharing network’ for

homeland security. This white paper identified the need for decentralization of

information to a network rather than a ‘hub and spoke’ model, and to move away from

the tightly-controlled access to information that emphasized a greater risk of “inadvertent

or malicious disclosure” over the “greater benefit of wider information sharing” within

the IC (Baird 2003, vi). The paper also identified problems that existed before 9/11 and were discussed earlier, such as inadequate intelligence support for real-time operations and a lack of trust

and information leveraging between federal, state and local agencies.

In the subsequent years after 9/11 (and particularly after the 9/11 Commission Report),

the United States began to re-draw its intelligence strategy. While the USA PATRIOT

Act and the Homeland Security Act of 2002 included provisions for improved

information sharing among government agencies, the most prominent piece of legislation

to answer the recommendations put forth by the 9/11 Commission Report was the Intelligence Reform and Terrorism Prevention Act (IRTPA), passed in 2004.

The IRTPA was the centerpiece of legislation that reformed the intelligence community. This Act of Congress was passed in response to several persistent problems in the community: that shortcomings in intelligence analysis were pervasive

(as opposed to collection, organization or management); that the attacks of 9/11 and the


flawed intelligence from the National Intelligence Estimate (NIE) on Iraqi Weapons of

Mass Destruction (WMD) in 2002 had resulted in serious investigations attempting to

identify causes of those failures and corrective actions; and that the organizational

structure of the IC needed reform for better control and coordination of intelligence

priorities (Cooper 2005, 3). The 9/11 Commission Report also carried a pervasive theme: that the community of agencies lacked unity and was devoid of any meaningful integration or cooperative effort to prevent major threats to the United States (Harknett and Stever 2011, 701).

Among the various reforms from the IRTPA, the most significant for the intelligence community was the creation of the Office of the Director of National Intelligence, which was established as the overseeing authority over all military and non-military intelligence agencies in the IC (see Appendix B). The Act also established a National Counterterrorism Center, a Joint Intelligence Community Council and a Privacy and Civil Liberties Oversight Board (in response to government and public concerns about the PATRIOT Act), placed the National Intelligence Council under the new office, and introduced a series of

reforms regarding transportation security, border protection, immigration and terrorism

prevention.

The Act also brought about changes in agency information sharing. The Information

Sharing Environment (ISE) was created under Section 1016 of IRTPA. While the scope

of the ISE includes supporting mission processes with core capabilities (e.g. supporting

the Suspicious Activity Reporting (SAR) program through National Fusion Centers), the

focus of the ISE is on sharing “terrorism and homeland security information”

(Information Sharing Environment 2011). Facilitation of information sharing among not


just federal agencies but also state, local and tribal agencies as well as private sector

partners and international allies is also a mandate of the ISE. The Program Manager of

the ISE (PM-ISE) coordinates and facilitates access to information and services that

contribute towards anti-terrorism missions. The PM-ISE also issues procedures,

guidelines and standards relating to ISE policies, and ensures the ISE is “built to improve

sharing and protection of terrorism, homeland security and WMD information” (Program

Manager - Information Sharing Environment 2011, 3).

The ODNI serves as a mission partner of the ISE, and the ISE has been a central player in the development of the IC’s information sharing strategies since its inception in 2005. Such efforts made with the ODNI include providing NCTC Current, a web site for analytical products on the Joint Worldwide Intelligence Communications System (JWICS),

providing the Worldwide Incidents Tracking System (WITS) to the IC, posting

Intelligence Today, an online internal newspaper for IC analysts, as well as various

components that compose Intelink (Program Manager - Information Sharing

Environment 2011, 58-59).

2.1 Security and Intelligence Strategies

The National Security Strategy (NSS), which outlines the national security concerns of

the nation and how it intends to deal with these concerns, has primarily guided strategy

from the ODNI regarding intelligence and information sharing. The NSS, while

remaining high-level with respect to details for the USIC, nevertheless provides a theme


of collaboration and cooperation among federal agencies and international allies, while

promoting information sharing as a common goal to prevent attacks on the homeland.7

The National Intelligence Strategy of the United States of America (NIS) is derived from the NSS. Under the directorship of the first DNI, John Negroponte (2005-2007), the ODNI refined existing strategy and developed new plans and information sharing strategies to meet the information requirements set out by IRTPA and its associated

programs (listed earlier). Having been initially released in 2005 and revised in 2009, the

NIS outlines particular mission objectives and enterprise objectives that the IC must

reach. In particular, the enterprise objectives are focused on improved information sharing, making the NIS the highest-level document in the USIC that begins to articulate an organizational shift towards a robust information sharing environment. The goals outlined in the NIS that most directly address information sharing and interagency collaboration include (ODNI 2009c, 5):

- Goal 3: Operation as a single integrated team, where collaborative teams leverage

all IC capabilities to meet the requirements of users; and,

- Goal 4: Delivering balanced and improving capabilities that leverage the IC’s

competencies, and integrate these capabilities to reap synergies and efficiencies in

missions and challenges.

These goals are supported by seven Enterprise Objectives (EO) that identify what the

community “will achieve as an intelligence enterprise to support our Mission Objectives”

(ODNI 2009c, 11-17). Of the seven objectives, the most pertinent to information sharing

include:

7 It should be noted that collaboration and cooperation among agencies is a much larger focus in the 2010 version of the NSS than in the version issued in 2002, particularly as a whole-of-government approach (ODNI 2010, 14-16).


- EO #3: Streamline business processes: this objective is related to overcoming redundant and non-interoperable systems and infrastructure that produce poor quality data (13);

- EO #4: Improve Information Integration & Sharing: this objective is to improve the use of Information Technology (IT) to handle the growth in data and processing capabilities. The IC must provide a trusted, reliable network to promote a community-based, cross-agency information sharing environment that eases workflow, improves information aggregation and analysis, and consolidates or eliminates legacy data systems where possible (14-15); and,

- EO #6: Develop the Workforce: the IC must develop and retain a workforce that is agile, talented, and culturally attuned to modern-day intelligence issues. To support this, the workforce must be able to meet cross-functional and cross-organizational objectives (that is, working in a community, not just within an agency) (16).

After the NIS was released in 2005, the execution of the strategy was subsequently outlined in two documents (released in 2007): the 100 Day Plan for Integration and

Collaboration and the 500 Day Plan for Integration and Collaboration. Both were

devised under the leadership of DNI Michael McConnell, who had replaced Negroponte

in 2007. Many of the initiatives that McConnell focused on during his tenure as DNI

involved improving the IC information sharing environment. The 100 Day Plan identified six “integration and transformation focus areas”: creating a culture of

collaboration, fostering collection and analytic transformation, modernizing business

practices, accelerating information sharing, building acquisition excellence and

technology leadership, and aligning the DNI’s authorities (ODNI 2007a). The section on

fostering collection and analytic transformation particularly refers to providing:

“… An IC-wide analytic information technology (IT) environment encompassing

current initiatives such as A-Space, the Library of National Intelligence, and

Catalyst. These initiatives integrate IC efforts to address the challenges of

information overload, collaboration and information sharing.”

- (ODNI 2007a, 6)


The 500 Day Plan for Integration and Collaboration, released in August 2007, extended the initiatives of the 100 Day Plan. “Core” and “Enabling” Initiatives were identified for each of the six focus areas outlined in the 100 Day Plan, and the document went into further detail on how to address these areas successfully. It also further

identified legacy issues within the IC, including the lack of cross-agency linkages, the

chronic challenges of siloed information, and the “need to know” culture that is pervasive

within the IC. The section entitled Fostering Collection and Analytic Transformation also

outlined that the IC would provide ‘pilot’ operations that enable community collaboration,

including providing access to consolidated data repositories of disseminated IC products

(ODNI 2007b). This section goes further in identifying the need for better collaboration

among IC analysts:

“An analytic collaboration environment supports ‘live’ knowledge bases,

providing an ability to identify and exchange critical insights with other experts

working on similar topics, and quantitative measures of effectiveness.”

- (ODNI 2007b, 9).

The section continues with:

“Improved collaboration and information sharing resulting in better intelligence

analysis, reduced information overload, and enhanced early detection of critical

new data. The full capabilities of the IC will be brought to bear in collaborative

all-source analysis.”

- (ODNI 2007b, 9).

Additional legislation, programs and strategy documents have also been released to

support the IC’s information sharing environment:

- Information Sharing Strategy (2008): this strategic document of the USIC adopts

many of the same principles of the NIS, the 100 Day Plan and the 500 Day Plan.


It identifies the challenges the IC faced after 9/11, and recommends methods by which the USIC will transform into an integrated enterprise with a common end-state and shared vision. It also includes an analysis of the legacy information sharing model in the IC compared to the “new” information sharing model that needs to be implemented, among other methods for achieving organizational transformation (ODNI 2008d).

- Strategic Intent for Information Sharing 2011-2015 (2011): This document provides

a framework for the ongoing information sharing reforms brought about by previous

reports and strategies. It was also designed to resolve ongoing tensions between

information sharing policies among various agencies through risk-managed

approaches. More than in previous mandated documents, Strategic Intent focuses on

information sharing while mitigating risk. This includes maximizing intelligence

sharing while focusing on privacy rights, respect for sensitive information, and

implementing auditing, monitoring and improved oversight procedures for the use of

this information. This document was produced under the leadership of Corin Stone,

Intelligence Community Information Sharing Executive, who works with the PM-ISE

to coordinate a holistic approach to information sharing across the U.S. government

(ODNI 2011).

- Executive Orders (13311, 13356, 13388, 13470): Various executive orders from

former President George W. Bush hold agencies accountable for sharing information

and granting access to terrorism-related data to other agencies with counterterrorism

functions. McConnell worked with the White House to create Executive Order

13470, which updates Executive Order 12333 (an EO from the Ronald Reagan


administration) to accommodate post-9/11 changes in the IC, including both its organizational structure and the technology and methods shared among IC agencies

(Register 2008).

- Intelligence Community Directive (ICD) 501: This directive charges each agency

within the IC with a “responsibility to provide” information, rather than the

traditional “need to know” basis which the IC used as a principle during the Cold

War and into the new millennium. It mandates that agencies must make intelligence

analysis available through automated means, and that to withhold information, an

agency must prove that sharing would jeopardize national security or would be illegal. It also

defines IT standards and articulates implementation plans for achieving the ICD 501

policy objectives (ODNI 2009b).

- Through a program called Analytic Transformation (AT), the ODNI seeks to provide

analysts shared access to relevant information, find solutions to the challenges

presented by expanding data volume, provide training in rigorous analytic methods

and standards, and build trust among USIC experts inside the IC and beyond (ODNI

2008a, 4). The AT program includes A-Space, Catalyst and the Library of National

Intelligence as examples of its Integrated Analytic Operations. The program also aims to enhance the quality of analytic products through tradecraft training and better collaborative analysis, and to foster alternative analysis for more comprehensive intelligence products (ODNI 2008a).

- IARPA (Intelligence Advanced Research Projects Activity): As an ODNI-initiated

research agency, IARPA is tasked with researching, developing and disseminating

new technologies to the IC. Founded in 2006, the agency focuses on improving the value of data collected from all sources, maximizing insight from collected information (“inclusive analysis”), and countering new technological capabilities used by adversaries and competitors (IARPA n.d.). Then-Director of National Intelligence John Negroponte remarked that IARPA reinforces intelligence reform by assisting in

the collection of all-source data, creating solutions for analyst “information

overload”, and by effectively sharing and protecting intelligence through science and

technology (Negroponte 2006). IARPA is considered to be the ‘sister’ organization to the Defense Advanced Research Projects Agency (DARPA).

2.2 Social Computing Strategy in the United States Government

Social computing and Web 2.0 technology permeate many areas of the United States government, and nearly all federal agencies and departments have a social computing presence. For example, the White House’s main website, www.whitehouse.gov, uses the

open source content management system Drupal, which allows the website to use third-

party plug-ins and integration with other social media sites (e.g. Facebook or Twitter) to

propagate its content among various social networks and search engines. Barack Obama’s election campaign was largely based on online support provided through social media. His election coincided with, and benefitted from, the increased use of social computing tools in

the U.S. government, as well as the rapidly increasing popularity of social media sites

such as Facebook and MySpace (Harfoush 2009).

Social media and social computing are also large components of the United States’

strategy of transformational diplomacy, first introduced by former Secretary of State

Condoleezza Rice in 2006 as a means to improve public relations with both foreign


citizens and partners abroad (Rice 2011). Part of this strategy was an increase in dialogue

and communication with these groups using Web 2.0 technology. Such initiatives under

this program included implementing Diplopedia, a wiki for State Department employees, and Communities@State, a series of blogs designed to connect with the public about the State Department’s activities.

Particularly for the USIC, The Wiki and the Blog: Towards a Complex Adaptive

Intelligence Community was the inspiration and impetus towards adopting social

computing tools (Thompson 2006), although social computing, as demonstrated by the

various strategies and programs initiated by the ODNI, had the supportive strategic

environment to foster growth. As a result, the USIC suffers no shortage of Web 2.0 and

social computing tools. On Intelink, the community’s primary network for IC analysts

and other employees, various tools are made available that complement intelligence

collection and analysis. Additionally, the IC has begun to engage the general public in

various social media realms, both from the ODNI and from individual agencies. The CIA,

for example, maintains its own Facebook, Flickr, YouTube and RSS public tools, which differ from the ODNI’s own Facebook page and series of multimedia offerings to the public (including videos and podcasts). Although supportive of ODNI strategy, many agencies maintain their public-facing social media tools separately from one another.

The larger framework for social computing adoption is often characterized by what

citizens and non-government audiences see on the Internet, many sites of which were

designed under the Obama administration’s Open Government Initiative. For the United

States, major Web 2.0 initiatives that engage both employees and the public include (but

are not limited to):


DipNote: The official blog of the U.S. Department of State; the blog was set up in

2007 and covers a variety of world issues, including allowing readers to comment

on blog updates.

Virtual Presence Posts (VPP): a State Department initiative that assembles

websites to connect with individuals abroad where the United States does not

have a physical presence, such as Somalia and the Seychelles; there are currently

more than 50 active VPPs.

Communities@State: a series of blogs that let State Department employees

connect and form communities of interest (and blog about their perspectives).

Apps.gov: a service provided to federal agencies that, among other services, helps

set up blogs, wikis and discussion forums (and allow mobile access on

smartphones).

IdeaFactory: a social networking site that lets Transportation Security

Administration (TSA) employees suggest ideas for workplace improvement, with

other employees rating and commenting on ideas.

USAspending.gov: This site allows visitors to explore data on government spending, and allows data to be retrieved in an open and transparent manner. Information can be exported in common markup languages that can be viewed on other websites (e.g. Extensible Markup Language (XML)).

The Weekly Address (given by President Barack Obama and hosted on YouTube

and Whitehouse.gov).


Data.gov: similar to USAspending.gov, this service allows visitors to access and

download federal datasets pertaining to a variety of data types, ranging from

statistics on national food consumption to geospatial data marked on historical

maps.

Other major Web 2.0 services are not directly an initiative of the government or part of

public strategy. GovLoop, a social networking site created to connect government

employees to all individuals interested in government business, began as the initiative of

a former government employee. The website now exceeds 32,000 members and is a

premier web destination for connecting U.S. government employees at the tribal, local,

state and federal level on a variety of topics (Rein 2010).

2.2.1 The intelligence community’s computer network

There are three main computer networks in the USIC based on permissible access to IC

employees with different security clearance levels. NIPRNet (Non-secure Internet

Protocol Router Network) is the largest and most accessible network among the IC’s

networked computer domains. NIPRNet’s security level is “Sensitive But Unclassified (SBU).” It was created by the Department of Defense in the 1980s, and allows employees controlled access to the Internet. Social computing tools such as Intellipedia are available on this network, although the content does not contain the same material as that found on the Intellipedia instances used on higher-clearance networks.

The second major network is SIPRNet (Secret Internet Protocol Router Network). Much

like NIPRNet, it is a secure computer domain that enables over 500,000 IC employees to

access intelligence classified as “Secret (S)” (Weinberger 2010). SIPRNet made media headlines in 2010 when it was revealed that classified information was stolen from

the network by Pfc. Bradley Manning and given to whistleblowing website Wikileaks for

public dissemination.

The IC’s highest clearance network is the Joint Worldwide Intelligence Communications

System (JWICS). This network only allows users with Top Secret clearance (TS), and is

primarily used by analysts in the IC (NIPRNet and SIPRNet are heavily used by

Department of Defense employees). JWICS hosts the most popular version of Intellipedia

on the various networks (Hinck n.d.), and was also accessed by Manning to retrieve

information that was passed on to Wikileaks.

2.2.2 Intelink

Intelink is a series of secure intranets which the USIC uses to promote intelligence

analysis and business workflow. It is the most widely-used network service in the IC, and

hosts a variety of social computing tools to enable users to communicate, collaborate, and

perform job duties more effectively. Intelink is available on the IC’s different permission-

based intranets. The following versions are available across the IC:

Intelink-U: this version was established for use by various U.S. federal

organizations. It exists on NIPRNet, and is designed to allow the greatest amount

of access to employees of any clearance level. Intelink-U often is used to host

open-source intelligence data, and to create large communities of interest not

specific to just the IC alone.

Intelink-S: This version is for secret-level access. It is hosted on SIPRNet, and,

unlike services on JWICS, exists as an accessible communication network rather than simply a single access terminal.


Intelink-TS: This version operates on JWICS, and permits access to those only

with a Top Secret / Sensitive Compartmented Information (SCI) level of access.

This is the version most often used by intelligence analysts, and is available by

terminal access only.

Intelink-P: Commonly referred to as CapNet, it is primarily used as a link

between the CIA and the White House, meant for high-level intelligence

consumers.

Intelink-C: This version is Intelink’s accessible form to trusted foreign partners,

such as the United Kingdom, Canada, Australia and New Zealand. Its security level is equivalent to TS clearance (but is not the same), and it is more commonly referred to by its network name “STONEGHOST” (ODNI 2009a, 74-75).

Intelink was born from intelligence challenges that grew during the Persian Gulf War in the early 1990s. Military personnel complained that they had too many intelligence systems to work with and struggled to produce comprehensive intelligence (Dean 2000). The service was developed over the course of the mid-1990s, and was released in 1996 on JWICS. It has since been reproduced on the lower-security networks, as shown earlier.

The two primary social computing tools on Intelink are Intellipedia, an internal wiki used

by the IC built using the same engine as popular encyclopedia website Wikipedia, and A-

Space (Analytic Space), a social networking and collaboration service that connects

analysts within and across agencies. To allow access to the various services and

applications on each instance of Intelink, users were given an Intelink Passport, a


universal network profile that lets all Intelink services authenticate users without

additional input (that is, users do not have to continually log in to the various services they need access to). This functionality is akin to the Single Sign-On or OpenID standards that

exist on many websites on the Internet and private intranets.
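
To illustrate the single sign-on pattern described above, the following minimal Python sketch shows how one signed token, issued at login, can be verified locally by any participating service. The shared key, token format and function names are hypothetical and do not represent the actual Intelink Passport implementation.

    # Minimal single sign-on sketch; the shared key and token format are hypothetical.
    import base64, hashlib, hmac, json, time

    SHARED_KEY = b"example-shared-secret"  # assumed to be distributed to all services

    def issue_token(user_id: str, clearance: str) -> str:
        """Issued once at login and presented to every participating service."""
        payload = json.dumps({"user": user_id, "clearance": clearance,
                              "issued": int(time.time())}).encode()
        signature = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
        return base64.urlsafe_b64encode(payload).decode() + "." + signature

    def verify_token(token: str):
        """Any service can validate the token locally, so no second log-in is needed."""
        encoded_payload, signature = token.rsplit(".", 1)
        payload = base64.urlsafe_b64decode(encoded_payload.encode())
        expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
        return json.loads(payload) if hmac.compare_digest(signature, expected) else None

    # Example: a wiki service and a file service accept the same credential.
    token = issue_token("analyst42", "TS/SCI")
    assert verify_token(token)["user"] == "analyst42"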

2.2.3 A-Space (Analytic Space)

A-Space is a social networking tool that allows analysts across the IC to connect with

each other in a common collaborative workspace. It is a product of the Office of Analytic

Transformation and Technology, and provides access to the numerous databases within

the IC, as well as provides capabilities to search classified and unclassified documents.

A-Space allows instant messaging and collaboration tools to be used for intelligence

analysis and building intelligence products. The service was launched on JWICS in

September 2008, and is built on Jive Software’s Clearspace application, a Java-based enterprise knowledge management tool. Thomas Fingar, former Deputy Director of National Intelligence for Analysis, once described A-

Space as a “MySpace for analysts” and a “digital workspace where people . . . can share

information, share progress and draw upon data” (Council on Foreign Relations 2008). It

was also identified by the DNI as a significant component of the ODNI’s Analytic

Transformation program (ODNI 2008a, 8). The service was developed and implemented

by ManTech International Corporation with assistance from the DIA, and now hosts over

10,000 users (Business Wire 2009). A-Space was also named as TIME Magazine’s 32nd

best invention of 2008 (TIME 2008).

The service has a similar user interface to that of popular social business networking site

LinkedIn: analysts fill in their email addresses, phone numbers, contact information and

areas of expertise. The technology used allows users to discover other analysts with


similar interests and expertise, both within and across agencies, and to connect with them

to create direct contacts.

Users are given workspaces within A-Space to share documents and conversations; this allows for multi-user document editing, as well as crowdsourcing new ideas and methods into what would otherwise be a one-dimensional document with little cross-agency collaboration. Such was the case during the Mumbai terrorist attacks in 2008, when

analysts convened in a workspace (as well as Intellipedia) to share updates and expertise

(Jackson 2009b). A-Space also provides geospatial capabilities, including using Google

Maps as a “mashup” of information, where map points can be meta-tagged by users to

provide data on topics, or links to other pieces of information (Jackson 2009b).
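
A meta-tagged map point of the kind described above can be represented in a simple GeoJSON-like structure. The Python sketch below is illustrative only; the field names, tags and links are hypothetical rather than drawn from A-Space itself.

    # Hypothetical meta-tagged map point in a GeoJSON-like structure.
    import json

    point = {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [72.8777, 19.0760]},  # lon, lat (Mumbai)
        "properties": {
            "tags": ["2008-attacks", "situational-awareness"],  # user-applied meta-tags
            "links": ["intelink/wiki/Mumbai_attacks"],           # hypothetical related page
            "contributor": "analyst42",
        },
    }
    print(json.dumps(point, indent=2))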

Information access, however, is still compartmentalized. An analyst working specifically

on Russian affairs may not have the same access to Somali or East African affairs if it does not pertain to their role requirements. Additionally, the system flags users who download large amounts of data or repeatedly seek information they are not allowed to

access (Shane 2007).

A major impetus for implementing A-Space in the IC was the changing

demographics of analysts entering the community. Fingar identified that “60 per cent of

all intelligence analysts have five years or less” in the workplace, and expect

collaboration tools that “push from the bottom” (Miller 2010). However, Michael

Wertheimer, former Assistant Deputy Director for National Intelligence, asserted that

participation in A-Space crosses all age and experience lines, and that it is not limited to

just one demographic of analysts (Shaughnessy 2008).


Eventually, A-Space is meant to provide analysts a way to support analytic research,

maintain situational awareness, sort and visualize data to identify trends and patterns, and

access analytic tools and methodologies, in addition to providing the social networking

functionality it currently provides (McIntyre 2009). It is expected that once A-Space is

fully activated with its features, it will enhance the quality of analysis by allowing the

collective knowledge to be accessed without the length and content constraints of

conventional intelligence products (ODNI 2008a).

2.2.4 Intellipedia

The largest and most widely-used social computing tool is Intellipedia, a wiki available

for use by employees across the entire intelligence community. According to the ODNI,

“Intellipedia enables collaborative drafting of short articles, which can be combined to

form lengthy documents, all using a simple interface in a web browser” (ODNI 2008a,

11). Intellipedia went live in April, 2006, and is powered by the MediaWiki engine, the

same software used by popular online (and crowdsourced) encyclopedia Wikipedia.

However, the rules for contributing content are different from those of Wikipedia. First,

there are no anonymous content contributions on Intellipedia – every article and piece of

content is tagged with the author of that content, what agency they represent, and when it

was contributed. Second, unlike Wikipedia, Intellipedia allows for partisan and non-

neutral articles that represent the viewpoints of the authoring agency. Finally, Intellipedia

allows users to use content for any collaboration purpose, whereas Wikipedia only allows

encyclopedia articles to be posted (e.g. every Intellipedia page has a corresponding

discussion page; Wikipedia does not) (ODNI 2007e). Users of Intellipedia are also

recognized for their contributions: Intellipedia rewards active contributors with a coffee mug and sends a letter acknowledging their work to the contributor’s manager. Agency managers also encourage participation by awarding prizes and

gifts (Havenstein 2008).

Intellipedia has more than 250,000 users and 1.28 million pages of intelligence content

available on all three security domains – NIPRNet, SIPRNet and JWICS (Miller 2010). It is

installed as three separate services, and access is granted based on an employee’s

clearance level. Therefore, the most top-secret content on Intellipedia is only available to

those with access to JWICS, and the “sensitive but unclassified” material is available on

NIPRNet. Intellipedia on JWICS is the most popular version used – over 75,000 users

conduct over one million searches per week, more than double the number conducted on Intelink-U

(Intelligence Community Chief Information Officer 2010a; Miller 2010). The most

visited website on Intellipedia-TS is the Pacific Command Joint Intelligence Operations

Center portal, where over 1.2 million users consume all the intelligence products the

Command posts on this site (ODNI 2008a, 11). Intellipedia is also the only tool available

in the IC that enables community-consumer interaction.

Intellipedia has been used to collaborate on unfolding events, including creating an

intelligence product on the security climate in Nigeria in 2006, bomb-making by Iraqi

insurgents and North Korea’s missile tests (Hickman 2010). Intellipedia’s usefulness was illustrated in 2006 when a small private plane crashed into a Manhattan building. Users in nine different agencies collaborated to determine, within only two hours, that it was not a terrorist act, which drastically shortened a decision-making process that was conventionally a challenge in the IC bureaucracy and through traditional information sources (Hickman 2010).


Intellipedia is not the only wiki available to IC employees. The Federal Bureau of

Investigation has an agency-specific wiki called “Bureaupedia”, while the Department of

Defense uses its own version entitled “DoDTechipedia”. The Department of State also has its own wiki called “Diplopedia.” While these serve individual agencies, they do not reach the wider audience of agencies and consumers that the IC encompasses; Intellipedia remains the largest wiki in use in the intelligence community.

2.2.5 Inteldocs

Inteldocs allows every Intelink user 100 megabytes of storage space for files and

documents. Links to the documents can be shared via email or collaborative workspaces

(e.g. A-Space or Microsoft SharePoint workspaces), and are discoverable via Intelink’s

network search functionality (Intelligence Community Chief Information Officer 2010a).

2.2.6 iNews

iNews is an enterprise news service that lets analysts and IC employees retrieve intelligence documents using RSS feeds. RSS feeds are the most popular method of distributing intelligence around the IC: over 3000 RSS feeds exist on Intelink-TS and over 2000 exist on Intelink-S (iNews Justification 2007). Intelligence is discoverable through each product’s metadata, which is matched against a user’s interests (e.g. an analyst working on Middle Eastern affairs will see similarly themed news in their RSS feeds, whereas an analyst working on domestic terrorism projects will get news relevant to homeland security).
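
The metadata-matching idea can be sketched with standard Python tools: parse an RSS feed, read each item’s category tags and keep only the items that overlap an analyst’s interest profile. The feed URL, tag vocabulary and profile below are hypothetical and do not reflect the actual iNews implementation.

    # Minimal sketch of metadata-based RSS filtering; feed URL and tags are hypothetical.
    import urllib.request
    import xml.etree.ElementTree as ET

    ANALYST_INTERESTS = {"middle-east", "counterterrorism"}  # assumed interest profile

    def matching_items(feed_url: str):
        """Yield (title, link) for RSS items whose <category> tags overlap the profile."""
        with urllib.request.urlopen(feed_url) as response:
            root = ET.parse(response).getroot()
        for item in root.iter("item"):
            tags = {c.text.strip().lower() for c in item.findall("category") if c.text}
            if tags & ANALYST_INTERESTS:
                yield item.findtext("title"), item.findtext("link")

    # for title, link in matching_items("https://example.invalid/intel-feed.rss"):
    #     print(title, link)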

2.2.7 Blogs

Intelink hosts a variety of blogs owned by IC analysts. Because Intelink operates on three

different security domains, various blogs are available at the unclassified, secret and top


secret levels. The blog service, launched in 2005, was one of the first social computing tools brought onto Intelink, and hundreds of blogs are currently active using, among other platforms, the publicly available blogging software WordPress.

Examples of blogs that exist on Intelink include a blog set up by the Coast Guard to allow project managers to update team members on the improvements made to the Coast Guard Logistics Information Management System (CG-LIMS), a blog about “living intelligence” and increased adoption of social media tools, and blogs on general-purpose topics, including one which described the controversial technique of “waterboarding” as illegal and a violation of the Geneva Conventions (the blogger was subsequently dismissed from the CIA for her controversial blog post (Priest 2006)). Of note is that links to Intelink blogs are commonly found on the Internet, although they are protected by the private Intelink domain they are hosted on (that is, only permitted users can access these links).

2.2.8 Microblogging

The IC uses eChirp, a microblogging application similar to Twitter that allows users to

post short updates to their Intelink profiles pertaining to a variety of topics. The goal of

this service is to increase “situational awareness” on breaking events and provide

information discovery in a time-sensitive environment (Schroeder 2011, 9; Intelligence

Community Chief Information Officer 2010a, 8).

2.2.9 Social bookmarking

Intelink also allows IC employees to set up personal bookmarks to internal and external

web content. This service is called tag|Connect, and is similar to social bookmarking

website del.icio.us. These bookmarks can be viewed by other employees, sorted and meta-tagged, and the service recommends other users with similar bookmarks. tag|Connect enables employees to locate other staff who could be looking at or working on similar

projects for collaborative purposes. Over 46,000 links have been bookmarked in the IC

(Intelligence Community Chief Information Officer 2010a, 14).
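
The “similar bookmarks” recommendation behind a service like tag|Connect can be sketched as a simple overlap calculation over each user’s set of bookmarked links; the in-memory data below is hypothetical and is not how the actual service stores bookmarks.

    # Minimal sketch of recommending users with overlapping bookmarks (hypothetical data).
    bookmarks = {
        "analyst_a": {"intelink/wiki/Somalia", "intelink/report/1234"},
        "analyst_b": {"intelink/wiki/Somalia", "intelink/blog/cg-lims"},
        "analyst_c": {"intelink/report/9999"},
    }

    def similar_users(user: str, min_overlap: int = 1):
        """Rank other users by how many bookmarks they share with `user`."""
        mine = bookmarks[user]
        scores = {other: len(mine & theirs)
                  for other, theirs in bookmarks.items() if other != user}
        return sorted((o for o, s in scores.items() if s >= min_overlap),
                      key=lambda o: -scores[o])

    print(similar_users("analyst_a"))  # ['analyst_b']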

2.2.10 Conferencing

IC employees who need to hold web conferences use IC Connect. This is the official tool used in the IC to hold online, collaborative meetings. IC Connect uses Adobe Acrobat Connect to hold online training sessions and web conferences, and to allow desktop sharing. While this tool is not a social networking or social media-based service as most of

the other IC tools are, IC Connect is accessible through Intelink and provides social

computing capabilities, tying in with user desktop sharing and focusing on collaboration.

It is also considered part of Intelink’s information sharing package of tools (Intelligence

Community Chief Information Officer 2010a, 9; McCracken 2011, 33).

2.2.11 Collaborative Workspaces

The IC employs Microsoft SharePoint Server, a document management and content

management software platform, to bring intelligence employees together. Through “Shared Space Web Hosting,” employees can use collaboration sites to work on intelligence projects where analysts need a common digital workspace. Documents and content can be discussed, edited and reviewed, and workspaces can be extended with new functionality that provides more value (e.g. calendars, data analytic displays, industry contact lists, etc.). SharePoint workspaces exist on all three security networks, and were described by the ODNI’s Director of Enterprise Solutions as a “product we can’t get by without anymore” (Hoover 2009). However, the USIC has been known to

implement other collaboration workspace solutions such as IBM Lotus Sametime, a


middleware platform that enables unified communications and enterprise collaboration

functionality (Lynch 2008).

2.2.12 Gallery

Gallery is the intelligence community’s picture hosting service. It is found on Intelink,

and images are uploaded by intelligence employees. Images (as well as videos on iVideo)

can be set as either private or public, and can be ‘tagged’ with keywords so as to present

similarly-tagged images (more commonly known as ‘meta-tagging’). These images can

be embedded in articles on Intellipedia, blogs, and on SharePoint collaboration spaces

and MyIntelink. Users can also comment on and rate both images and video on Intelink

(Intelligence Community Chief Information Officer 2010a). The service is similar to

public photo-sharing site Flickr.

2.2.13 iVideo

The intelligence community also has a video-sharing platform for use by IC employees.

This service operates very similarly to popular video-sharing service YouTube – it allows

users to upload videos covering a wide variety of content, including training material,

opinion pieces, and intel-related videos that have contributed to intelligence analysis.

This service went online in 2007, and within 30 days, had reached the “critical mass”

necessary to become a primary tool for video consumption (Ackerman 2009). iVideo is

available on all three secure networks – NIPRNet, SIPRNet and JWICS. While

governance of video posting initially required a video to be posted on all three networks

(Bain 2008), the intention was to forward videos up to the higher level networks when

posted on NIPRNet. It is unclear if this improvement in uploading procedure has been


made. However, other social tools currently require multiple uploads on all three

networks (e.g. Intellipedia).

2.2.14 Other

Beyond Intelink, the USIC continues to engage with other social computing tools. One

such service is called CompanyCommand, an online discussion forum that lets front-line

army commanders share and discuss skills, knowledge and information, in an

environment designed to facilitate intelligence sharing to better train and prepare

servicemen. Such topics discussed in this forum (which has its own private website

outside of USIC networks) include how to deal with the death of soldiers in the

battlefield, how to equip for mountainous terrain, and how to engage with locals in battle

zones. Additionally, members in CompanyCommand play various roles to build value

through quality content, including being contributors of content, facilitators of the

discussions, and ‘social catalysts’ tasked with onboarding new members into the online

community (Dixon 2007).

In a similar vein, the Department of Defense also employs a “2.0” version of its Global Information Grid (GIG), designed to bring Web 2.0 technology to the military so that servicemen can collaborate and share information on expertise, events and training. Many IC professionals, particularly in the DIA, access this network to engage in military intelligence building. The move towards more 2.0 tools in the GIG was reminiscent of the larger IC

information sharing problem – the services and applications designed on GIG were for

specific needs of the different military services, contributing to the “stove-piping”

systems problem (Jacks 2009). The new GIG creates a single network that allows professionals in the Army, Navy, Air Force, Marine Corps, Coast Guard and other allied forces to connect with each other online.

More closely related to Intelink, the intelligence community set up the Library of

National Intelligence (LNI), a repository, jointly produced by the ODNI and CIA, for all disseminated intelligence products (regardless of classification). The LNI allows

analysts to search for and request intelligence documents according to their individual

security clearance. Statistics and trends on intelligence reports (such as most cited or

viewed) can also be requested, and the LNI provides links to other pieces of intelligence

on other networks and portals within the IC. The LNI allows analysts to discover work on

similar topics done by other members of the IC, and “provides IC managers and

customers insight into the alignment of IC production and national intelligence priorities”

(ODNI 2008a, 7). Recently, the Library gained updated approval for XML (Extensible

Markup Language) Data Encoding Specifications for Intelligence Publications, which

allows intelligence products to be shared in a more accessible format across computer

networks and applications (e.g. allowing intelligence products to have metadata that is readable across various IC networks and applications, while also allowing for clearance-based permissible access) (Intelligence Community Chief Information Officer 2010b).

This language is central to allowing information (now and in the future) to be discovered

on social computing tools such as Intellipedia, A-Space or news aggregators such as RSS

feeds.
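
The value of a common encoding can be illustrated with a small Python sketch that builds a metadata-tagged product record in XML; the element and attribute names below are hypothetical and are not taken from the actual IC Data Encoding Specifications.

    # Illustrative metadata-tagged product record; element names are hypothetical.
    import xml.etree.ElementTree as ET

    product = ET.Element("intelProduct", attrib={"classification": "SECRET"})
    ET.SubElement(product, "title").text = "Example assessment"
    ET.SubElement(product, "producer").text = "Hypothetical Agency"
    topics = ET.SubElement(product, "topics")
    for topic in ("maritime-security", "horn-of-africa"):
        ET.SubElement(topics, "topic").text = topic

    # Serialized once, the same record could be indexed by a wiki, a feed reader or a
    # search service, with access still filtered on the classification attribute.
    print(ET.tostring(product, encoding="unicode"))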

The tools in use are all in their relative infancy, and are evolving to meet the needs of an

intelligence community and its changing demands over time. It is unknown (publicly) if any social computing tools have been decommissioned as a result of low usage rates, ineffective information sharing, or misalignment with USIC strategy and long-term goals.

2.3 Similar programs in other countries

By comparison, the United States appears to have one of the most sophisticated social

computing strategies in the world, both internally and externally. No internal social

computing tools were as openly discussed or covered in the media as those of the United

States. By a similar measure, the United States appears to have the most aggressive

online presence for its social networking and social media tools towards the public (likely in alignment with its public diplomacy strategy). There are, however, instances

of social computing strategies in other countries that are akin to the services offered by

the USIC.

In Canada, GCPedia and GCConnex are two services that provide public servants with

Web 2.0 technology in the workplace. GCPedia is the official wiki for Government of

Canada employees. The service was launched in 2008, and is considered to be the

Canadian government’s ‘answer’ to Intellipedia (Bastien 2009). The project was born

from the successful adoption of the NRCan Wiki, another MediaWiki-powered wiki

deployment implemented by Natural Resources Canada. Today, GCPedia serves over

22,000 users and has 11,000 pages of content (Eaves 2011). It also has a sister tool called

GCConnex, a social networking platform similar in objectives to A-Space. Finally, Canada has also implemented GCInnovation, a collaboration area that

hosts discussion forums, multimedia, and other forms of dynamic information that

connect government data and resources in a central location (Braybook et al. 2009), a

service similar to the deployment of the USIC’s SharePoint collaboration workspaces or


the TSA’s IdeaFactory. Additionally, the government of Canada has hired Open Text

Corp. on a multi-year contract to develop the country’s “government 2.0 strategy”

(starting with their document management eDocs technology), although this may not be

to provide social computing and Web 2.0 tools that do not relate GCPedia or GCConnex

(Arellano 2008).

While the federal government encourages employees to read and contribute to GCPedia

and participate in GCConnex, it is unknown to what extent Canadian security professionals in agencies such as the Canadian Security Intelligence Service (CSIS) or the Communications Security Establishment Canada (CSE) use these services or whether they

have their own exclusive Web 2.0 and social computing tools. It is unknown if these

services are as comprehensive as the social computing tools used on the various permission levels of Intelink, although the Office of the Auditor General of Canada (2009), in a report on national security, intelligence and information sharing,

determined that the Canadian government was making “satisfactory” progress in

developing systems of sharing intelligence information, albeit with slow momentum and

numerous legal challenges (Office of the Auditor General of Canada 2009).

Many of the same policies and aims to incorporate Web 2.0 and social computing tools in

the USIC are shared by the government of Australia. The Australian Government (2010b,

5) identified that information management is critical to an effective national security

environment, and that the issues faced by the United States, such as removing organizational stove-pipes and establishing a “network-based information system that transcends traditional government boundaries,” were equally valid in Australia. The roadmap was founded on the Smith Review of Australian Homeland and Border Security (2008), which highlighted that information sharing and cultural changes needed to be made and that the national security community was not fulfilling the needs of agencies gathering and

disseminating security information. While there has been implementation progress,

“some legislative, technical and cultural barriers to information sharing – within and

between governments and the private sector – remain” (Department of the Prime Minister

and Cabinet 2008, 2). Additionally, some policy observers have argued that the

Australian effort for using Web 2.0 has “not been well developed,” but the appointment

of a National Security Chief Information Officer (a recommendation put forward by the

Smith Review) is a critical step to making further progress in fostering a collaborative

culture among the national security community (O'Hara and Bergin 2009, 1-3).

The Australian government commissioned a task force entitled Engage: Getting on with

Government 2.0 in 2009, which identified Web 2.0 tools as having “unprecedented

opportunity” to achieve open, efficient collaboration of information and a chance to shift

a public sector culture away from not sharing agency data (Australian Government 2009, x). In

2010, the Government released an official response to the task force report, agreeing with

the recommendations regarding national security and Web 2.0 use (with a

recommendation that information sharing using Web 2.0 must also include civilian use of

the data where possible) (Australian Government 2010a). These recommendations are

supplemented with the Australian Government’s high-level approach to information

sharing and collaboration in the National Government Information Sharing Strategy

(2009).


CHAPTER 3: BUILDING A MODEL OF SUCCESSFUL SOCIAL COMPUTING

As demonstrated in the previous chapter, the tools used by the USIC are a consequence of a national intelligence strategy that, in the post-9/11 environment, focuses on greater

information sharing. Measuring social computing use in the USIC is not separate from

the community’s larger intelligence strategy. The “need to know” culture has been

determined by numerous pieces of legislation and by government officials to be a relic of the pre-9/11

intelligence environment; a new model of more collaboration, better communication and

more sophisticated information sharing was a precursor for developing the IRTPA and

numerous other pieces of legislation. Additionally, the 9/11 Commission Report

determined that information sharing was a priority area in which improvements needed to be made. However, as described in Chapter 1, efforts to improve collaboration and information sharing are being called into question. Social computing tools are part of this push to improve, but questions about their effectiveness and usefulness now

persist.

Social computing, by its nature, involves groups of people collectively creating

information, and thus creating value for the larger environments in which it is used.

Theories of collective knowledge assume value is derived from that which is produced

from a group of entities. More directly, the theories of collective intelligence and

collaborative intelligence are applicable to the use of social computing tools within the

USIC. Collective intelligence gathers the knowledge of individuals via a multi-agent system and is often used to deduce conclusions. Collaborative intelligence involves using expertise, knowledge and competing interpretations of subjects to build solutions through critical problem solving. Thus, collaborative intelligence is applicable insofar as it is a way of exercising intelligence to reach conclusions.

The collective, which some would describe as the base for tools such as Intellipedia, also has application to complexity theory. Here, the collective intelligence gathered in a crowdsourcing environment is self-governed, and rules are formed from the bottom up, not the top down. In this sense, rules that govern the information sharing environment create

dynamic and constantly-changing intelligence that is responsive to the demands of a

national intelligence environment with thousands of users. This is the thrust of Andrus’

assertion in The Wiki and the Blog (2004). This theory, however, might best represent an

environment that uses dynamic intelligence as a ‘finished’ product. Conventional and

bureaucratic intelligence production processes still persist in the USIC, where static

reporting is entrenched in defining what is considered ‘finished’ intelligence. The issue is

further compounded by the complementary role that major social computing services are

used for, rather than as a mandatory, integrated set of tools to improve intelligence as it

moves through the intelligence cycle. Therefore, there are limitations in applying

complexity theory to social computing strategy in the USIC.

The focus of this paper, however, is not to analyze the USIC’s information sharing

strategy, nor the effectiveness of social tools, within a purely theoretical framework. The

constantly changing nature of web technology, including new programming languages, new forms of system and database interaction, and the development of new social computing tools to increase the efficiency and efficacy of communication between users, is an ongoing process. Improving data usage and knowledge management through faster, more stable and more user-friendly computing services has been a constant objective since the success of the first mass-marketed personal computers in the late 1970s. While information sharing in organizations can involve using

multiple computing tools performing a variety of functions to different audiences, success

with an individual tool or subset of tools can be used as a benchmark for success. Web

2.0 and social computing tools, as described in Chapter 1, often have similar

technological foundations achieving similar results, even in large organizations (e.g.

technologies such as Microsoft SharePoint and IBM SameTime used as enterprise

content management platforms). The USIC employs a variety of tools, but as identified

earlier, Intellipedia and A-Space, a wiki and a social network respectively (which often overlap and integrate in use), are the two central social computing tools that

compose the USIC’s strategy. In this respect, a comparison must be made against other

successful implementations of public sector wikis and social networks. Additionally, such

social computing implementations in other public sectors are not vastly different in their

aims – internal wikis are used to build organizational knowledge for the betterment of end-product value, and social networks are used internally to connect employees who would not otherwise interact.

3.1 Successful Web 2.0 and Enterprise 2.0 strategies

In order to examine the effectiveness of the social computing strategy in the USIC, the

core principles in Web 2.0 organizational adoption must be identified. Two terms often

used in Web 2.0 implementations in public sectors are “Government 2.0” and “Enterprise

2.0”. While both terms involve the use of social computing tools, each describes a

particular vein of using Web 2.0 technology in organizations. According to Tim O’Reilly,

Government 2.0 involves the use of a software platform to bring together citizens,


agencies and governments to help solve problems (Howard 2010). More specifically,

Gartner identifies characteristics of Government 2.0 to be “citizen-driven”, “employee-

centric,” “transformational”, and continually “evolving”. While these terms are generally

vague, Government 2.0 involves greater collaboration among citizens and public employees to

“socialize and commoditize government services, processes, and data” (Di Maio 2009).

Enterprise 2.0 involves the use of Web 2.0 technology in organizations to help streamline

business processes and/or improve business value derived from web-based technology

use. Collaboration is a primary focus in Enterprise 2.0 – it involves connecting employees and stakeholders within an organization to each other and to the resources they need to conduct business. Additionally, the term can also mean collaboration between an organization and its customers and/or consumers. In practice, Enterprise 2.0 involves the use of Web 2.0 tools – blogs, wikis, social networks and others – to create community areas where informal, horizontal interaction between users becomes part of the business process.

As demonstrated in Chapter 2, Enterprise 2.0 and Government 2.0 are evident in the

USIC and in other governments’ intelligence agencies. These two terms, however,

can overlap. IdeaFactory, the collaborative social computing tool of the TSA, is an

example of both Enterprise 2.0 and Government 2.0. Additionally, data.gov is both a

Government 2.0 and Enterprise 2.0 initiative, as consumers of the site can retrieve data

from RSS feeds and XML-based website widgets to be displayed dynamically in real-

time. These tools also extend down levels of government. Fusion Centers, which gather

information pertaining to terrorism and suspicious activities, provide access to lower

levels of government (e.g. state, local and tribal) with access to a variety of IC networks


and their tools, including Intelink-U (Interagency Threat Assessment and Coordination

Group 2011, 36,54). Access, however, is managed by the user’s security clearance level.

The success of Government 2.0, Enterprise 2.0, and general Web 2.0 use within

organizations then becomes a matter of identifying certain principles which have worked

in the past and make current implementations of Web 2.0 technology successful. Due to

the relatively new business model these terms employ, some have made efforts at

defining key success factors relevant to successful implementations. Multiple major

business and software consulting firms have made efforts to define what these key factors

are. This chapter will outline four sources for building successful Web 2.0 strategy. These

four sources base their models of success on various components, including factors to

consider for organizational implementation, unique public sector challenges, and national

security information sharing considerations.

Before a model can be devised, this chapter examines two studies conducted within the

USIC. These studies collected the perspectives and observations made by users of

Intellipedia and A-Space, the largest and most actively used social computing tools in the

USIC. These studies are examined first to provide grounds for establishing a successful

social computing model, as the particular organizational processes, bureaucracies and

cultures exist within the USIC that makes intelligence communities different from

standard organizational implementations of social computing tools.

3.2 The DIA Knowledge Laboratory Pilot Project studies

The studies conducted by Nancy M. Dixon and Laura A. McNamara entitled Our

Experience with Intellipedia: An Ethnographic Study at the Defense Intelligence Agency

(2008) and How A-Space is Shaping Analysts’ Work (2009) are two publicly released


DIA laboratory projects that provide insight into the effectiveness of social computing

strategy within the USIC. Both of these ethnographic studies were conducted with the

aim of identifying trends and perspectives directly from the users that engage in

Intellipedia and A-Space. As this paper identified earlier, Intellipedia and A-Space are the

most prominent social computing tools in the USIC; information retrieval, discussion

boards and social networking are supported by subsequent tools available on Intelink that

are integrated within them, such as RSS feeds, Inteldocs, Gallery and iVideo. Therefore,

opinions and perspectives within these studies sufficiently represent the use of the

USIC’s larger social computing tool set, and not just Intellipedia and A-Space alone.

Dixon and McNamara, in both studies, draw general assertions about how people use Intellipedia and A-Space, what they think of the tools, how the tools integrate with their daily tasks, and what challenges have resulted from their use. Both studies use participants from the DIA, and observe the opinions and perspectives of users regarding numerous categories related to using the tool, including its networking,

situational awareness and content generation capabilities. As described in the Limitations

section of Chapter 1, both studies stress that their results are not “findings”, but rather

“observations” due to the relatively small sampling size and constrained laboratory

parameters when conducting the tests.

There are further limitations to both studies. First, both lab pilot projects only interviewed

individuals within the DIA – all respondents were analysts, officers, or general staff

within one intelligence agency. The respondent sample was therefore limited

to a particular vein of intelligence staff whose experience with Intellipedia and A-Space

may not represent the comprehensive viewpoints of the greater USIC.


Second, the studies were conducted in 2008 and 2009, and the responses therefore may not reflect changes and upgrades made to Intellipedia and A-Space since then that are likely to have improved the user experience. Since both of these social computing tools were initially created as pilot projects, it is likely that upgraded services, improved data retrieval techniques, and larger buy-in from senior managers and users have occurred since their inception. These samples therefore capture viewpoints and opinions that existed at a particular point in time; the same studies might produce different results if conducted today or again in the future.

Finally, because the studies were conducted within two years of the launch of each

service, the results may not reflect longer-term viewpoints and opinions. This

constraint is particularly important in that the long-term strategic use of these tools is

meant to reform the intelligence community as a whole-of-community movement to a full

information sharing environment. The full effects of these reforms may not yet be felt,

especially given the relatively short time these social computing tools have been in place. Social computing tools can be considered disruptive technology

(Ackerman 2007; Dixon and McNamara 2008), and immediate results and long-term

benefits may not be fully understood within only two years of launch. Organizational

change and attitudes that accompany significant reform through disruptive technology

may be more accurately assessed given longer timeframes for study and analysis.


3.2.1 Our Experience with Intellipedia: An Ethnographic Study at the Defense

Intelligence Agency (2008)

Dixon and McNamara engaged in an ethnographic study of Intellipedia users in the DIA

to determine the need for a larger research project to identify patterns in the software’s

use and adoption rates. Primarily, the study aimed to look at three areas (Dixon and

McNamara 2008, 1):

- How intelligence analysts use Intellipedia;

- Factors that influence adoption of Intellipedia; and,

- How Intellipedia affects collaborative behavior.

As mentioned earlier, the researchers conducted interviews to gather perspectives and

opinions on Intellipedia use. Although the sample size remains relatively small, common

responses were observed by Dixon and McNamara. The following are the core

observations as a result of the project (1-2):

1. There are multiple generations of users contributing to Intellipedia; its use is

not solely the realm of younger generations with digital expertise;

2. Users enthusiastically adopt new spaces to share information;

3. Intellipedia and blogs on Intelink project professional identities among USIC

agencies;

4. Intellipedia and blogs allow agencies to promote their own work among USIC

agencies;

5. Wikis are providing an innovative space for communication;

6. Intellipedia is becoming a knowledge marketplace; and,


7. Intellipedia has the potential to change the nature of work done by analysts.

The observations show a largely optimistic viewpoint of the use of Intellipedia and its

functionality. An underlying theme appears to indicate that respondents understand the

purpose of the tool and why it is there, although there are reservations about how

information sharing appears to be comparatively “open” and lacking proper control

methods.8 Additionally, some users expressed concern about letting anyone contribute to articles on topics in which the contributors themselves were not experts. These concerns were expressed by a variety of users, not exclusively within a particular age bracket.

The report also did not identify any users who had collaborated on an Intellipedia article that led to the creation of a finished intelligence product. Rather, usage of Intellipedia was often defined simply as 'sharing information,' and users were not mandated to use Intellipedia to complete work tasks and meet deadlines. However, the discussion features associated with articles were not being engaged; there was a relative lack of actual "collaboration" happening outside of building an article (Dixon and McNamara 2008, 8-9).

An important observation from the responses was that many users perceived Intellipedia as part of a single, mutually reinforcing set of technologies along with other social computing tools, such as Intelink and Intelink blogs. Analysts oftentimes used these services interchangeably when describing the sharing of information that others had access to read and respond to. Dixon and McNamara suggest that this may result from the conventions Intellipedia is built on, including the publication of analysts' names with the articles they comment on. Credibility is thus built from a poster's contributions to Intellipedia and blog articles (Dixon and McNamara 2008, 9-10).

8 Under core observation #1, concerns were expressed regarding the nature of information exchange and suspicions about sharing sensitive information. More details and respondent examples are found on pages 5-7.

Intellipedia, per the project observations, was also being used in a variety of ways to produce content. Whereas some members were using it to post their travel schedules and information to reduce email exchanges with managers and colleagues, others were using it to create emergency content to show others, for social networking purposes, to create indexed pages for cataloging, and to build hyperlink trees to

replace file shares (11). Additionally, users were creating Intellipedia content using

finished intelligence found internally, and on the Internet9, suggesting that the tool is

being used in combination with official intelligence deliverables as well as open-source

intelligence that has potentially not been approved or vetted. Regardless, users found that

Intellipedia provides quicker access to finished intelligence and provides a more

accessible repository of relevant information, including looking up terms, acronyms and

other business support material that is not itself direct intelligence. Users view it as a marketplace for information that would otherwise be spread across various repositories and networks. However, the "marketplace" was observed to be still emerging, and a common, single source of information had not yet fully evolved (13,15).

As mentioned earlier, the observations included behavior and attitudes indicating that Intellipedia fell under the definition of disruptive technology. Much of this assertion is drawn from responses that were positive towards its use, likely resulting from the convenience and relative simplicity of retrieving otherwise difficult-to-find intelligence and intelligence products. However, the authors also warn that Intellipedia's significance as a critical tool for information sharing reform is complicated by mixed messages from management regarding its use and a lack of rules governing its usage, including how much time to spend using it and how contributors are rewarded.10

9 Dixon and McNamara use the term "intelligence bricolage" to describe the process of building new artifacts out of existing artifacts, although the project does not aim to identify whether or not this was an intended usage of Intellipedia.

3.2.2 How A-Space is Shaping Analysts' Work (2009)

Dixon conducted a second laboratory project within the DIA entitled How A-Space is

Shaping Analysts’ Work (2009). This ethnographic study was designed similar to Our

Experience with Intellipedia (2008). Twenty interviews were conducted with DIA

analysts to ascertain how they use A-Space on a regular basis and the impact it has on

their analytic work11

. Since A-Space is designed only for analysts and other members of

the USIC with top-secret clearance (that is, with access to JWICS and top-secret SCI), the

study only focuses on social networking among a particular vein of USIC employees, and

thus does not represent a complete picture of social networking efforts in the larger

USIC. However, since A-Space is the largest and most popular social computing tool for

internal social networking (and other “spaces” exist and are growing in usage

(Rasmussen 2010)), the project provides relevant observations from those who use it and who have an impact on the analysis stage of the intelligence cycle.

10 Observation #7 (13-14) discusses these concerns in more detail and makes recommendations, which align with recommendations made later in this paper.

11 For more information on the parameters of the project, refer to the executive summary and introduction of How A-Space is Shaping Analysts' Work.


Through the observations made from the project's respondents, Dixon identified benefits and challenges that A-Space poses to users. Additionally, she identified that the culture of A-Space plays a significant role in the effectiveness of the tool and the perspectives of the respondents. Similar to Our Experience with Intellipedia, the project observations show a relatively optimistic outlook on A-Space as a functional social computing tool. However, reservations remain regarding its use and how it relates to the greater analytical process.

Dixon identified the following benefits (Dixon 2009, 6-15):

1. Analysts have found that A-Space's most valuable characteristic is the ease of finding others within the USIC.

2. The tool gives the analysts better “situational awareness” regarding topics they

are currently working on, and helps them discover information that might not otherwise be found.

3. A-Space has become a place where analysts can ask questions, challenge

assertions, and test new ideas that would otherwise not be tabled in normal

intelligence production processes. It is considered a place where informal

conversations and discussion can take place. This interaction contributes to

situational awareness.

4. It is a place to share classified intelligence documents through hyper-linking or

hosting.


Conversely, the project identified the following challenges (27):

1. A duality exists between the analysis conducted in A-Space and the existing

models of production. There is uncertainty about how these two models will co-

exist.

2. The many social computing tools in the USIC have the potential to confuse

analysts, who are expected to use these tools in addition to legacy process systems.

3. A-Space is meant primarily for analysts; intelligence collectors, data processors

and customers have no access, and thus only a narrow realm of USIC employees

can truly participate.

4. “Silent” members of A-Space and the role they play are not fully understood. It

remains uncertain if users who do not participate are integrating knowledge found

in A-Space into their everyday business processes and analysis output.

5. How senior management monitors usage of A-Space is not fully understood. It remains to be determined whether management is expected to play a top-down enforcement role, a coaching or mentoring role, or a passive role in A-Space knowledge building.

Dixon highlights and examines the growing “culture” that A-Space is creating, one that is

“collaborative, informal, non-hierarchical and appreciative” (Dixon 2009, 16). Pertaining

to the benefit listed above of holding conversations, the ability to converse in a relatively

casual and unobtrusive manner is a significant aspect of A-Space. The platform provides

users with a chance to be informal, challenge and vet ideas on various topics and build


contacts throughout the USIC. Dixon also observes that this culture is developing an

atmosphere that may be conducive to training and to becoming a platform for peer-initiated

knowledge transfer (18).

Additionally, the study reviews patterns of usage, with many users being digital natives or having been invited by others to join. Most discussion and collaboration is conducted in a "workspace" that is owned by a user, and others are allowed to contribute where necessary. However, some challenges are present regarding its use, such as low-contributing members or workspace owners who ignore their ownership duties (Dixon 2009, 20-24).

Despite challenges and uncertainty regarding its future usage or relevance, the project

surmises that A-Space is becoming an open, collaborative environment that analysts are

buying into. The growing culture of "trust" is a direct result of the functionality

A-Space provides. The study concludes with pragmatic optimism that A-Space is the type

of collaborative space that analysts need to truly connect for better information sharing.

3.3 Additional Observations and Feedback

The Dixon and Dixon/McNamara studies produced observations based on feedback and

commentary from analysts within a particular vein of USIC intelligence. Again, these

studies cautioned that, while their results represented users who directly use social

computing tools themselves, they may not represent the larger USIC experience, nor do they represent longer-term viewpoints on the usage of the tools.

Commentary has been made by other users, both within the DIA and in other USIC

agencies. In Andrew McAfee’s book Enterprise 2.0: New Collaborative Tools for your

Organization's Toughest Challenges (2009), feedback from analysts in the DIA and other agencies was collected. This feedback, however, did not pertain directly to Intellipedia or A-Space, but to social computing tools in general (McAfee 2009).12

The following examples were among others provided on intelligence analyst Don

Burke’s internal blog:

“…Tracking down a colleague with similar mission interests involved finding

reports on Intelink or in our databases, and trying to find whoever wrote them.

But establishing a rapport or cultivating exchanges of useful information this way

was unlikely at best. After nearly two years of involvement with Intellipedia,

however, this has changed. Using Intellipedia has become part of my work

process, and I have made connections with a variety of analysts outside the IC.

None of the changes in my practices would have been possible without the

software tools . . . I don’t know everything. But I do know who I can go to when I

need to find something out. Broadening my associations outside my office, and

outside my agency, means that when someone needs help, I am in a better

position to help them get it.”

- NSA Analyst (McAfee 2009, 115-16)

“The first aspect that comes to mind when I contemplate how these tools have

improved my ability to do my job is the ease of shar[ing] ideas and working

collaboratively with intelligence professionals around the world . . . without

leaving my desk. This is probably an incremental change—although a huge

increment—because I could always do these things to a certain extent using

traditional techniques (e.g. the telephone). On the other hand, I am actively

involved in an early stage project that would be impossible without these tools.

The ability to link information and people together, as wikis and blogs do, makes

possible an activity that I truly believe will transform our Community. The tools

fundamentally altered the course of this project. I know that my example is only

one of many similarly transformational activities that are germinating or will

germinate when these tools reach a greater level of penetration of the IC

workforce.”

- CIA Analyst (McAfee 2009, 116-17)

12 While this thesis focuses on social computing tools, McAfee defines such tools as Emerging Social Software Platforms (ESSPs). Feedback provided by USIC analysts in this section pertains to the package of social computing tools in the USIC, which closely resembles ESSPs.


Other comments regarding the use of these tools have been made by high-level managers, officials, and supporters of social computing software. However, these comments are usually statements of support rather than direct feedback; comments provided by officials such as Michael Wertheimer, James Clapper, Michael McConnell and Tom Fingar are usually made in response to press inquiries regarding the performance and effectiveness of these tools, and it is unlikely that they would speak negatively of products their administrations are currently promoting. Other commentary exists from internal USIC employees such as Chris Rasmussen, Don Burke and Sean Dennehy, individuals who champion the use of social computing technology and Web 2.0. However, as champions of social software and its reform, their views cannot be considered 'feedback' in the same way as the observations in Dixon and McNamara's studies.13

3.4 Effective use of social computing software

To examine effective use of social computing software, examples from real-world

solutions must be applied and compared to the experience of the USIC. Additionally,

research has been conducted on the effectiveness of social computing software in

academia, particularly the use of Web 2.0 technology. Companies also often draw on the

best practices learned from other industry competitors or colleagues. The USIC can learn

from the experiences of other large-scale organizations with multiple agencies and

departments and agencies collaborating using technology platforms. However, common understandings of what constitutes "effective" social computing must be identified, both within academia and among industry experts, so that recommendations can be drawn. This thesis looks at four models of "effective" Web 2.0 adoption in organizations and identifies similar factors that can be applied to the USIC's social computing adoption.

13 Various sources in this thesis refer to the works and comments of the aforementioned individuals and their support of social computing and Web 2.0 tools in the USIC.

This thesis adopts the key success factors identified by four sources: two private sector

management and technology consulting firms (McKinsey & Company and Gartner),

academic theories from a leading researcher in social software and enterprise tools (Professor Andrew McAfee), and an organization that advises the public sector on its business processes. This fourth source is also a private sector body, but unlike the two consulting firms, the Markle Foundation is a not-for-profit philanthropic

foundation. The foundation’s intent is to advise public sector leaders, particularly in

health care and national security, on information technology trends and solutions that can

be put to practical use. Additionally, the Markle Foundation (more specifically the

Markle Task Force (MTF)) is also the organization the 9/11 Commission Report

referenced as having made recommendations for improved information sharing that

should be adopted (National Commission on Terrorist Attacks upon the United States

2004, 418). Therefore, it is this paper's position that the Markle Foundation is a critical source for building a model of effective social computing.

3.4.1 Source #1: McKinsey & Company

McKinsey & Company and Gartner provide management and technology consulting

services to clients, both in the private and public sectors. McKinsey & Company is one of

the leading management consulting firms in the United States and overseas, and works

with almost 75 per cent of the 200 leading global corporations and more than half of the


listed Fortune 500 companies (Byrne 2002). While the company offers branches of

consulting in both media and business technology, McKinsey Quarterly, the company’s

publication about business consulting, offers advice for organizations regarding

information technology practices, including social computing strategy.

In an article entitled Six Ways to Make Web 2.0 Work (2009) published in McKinsey

Quarterly, McKinsey staff Michael Chui, Andy Miller and Roger Roberts detailed

various methods for achieving effective social computing in organizations. The paper draws conclusions from over 50 early adopters of social computing technology and examines how "success" was achieved in various implementation environments.

Six factors for successful adoption of Web 2.0 technology in large organizations (Chui,

Miller, and Roberts 2009) were identified as:

1) Transforming to a bottom-up culture instead of from the top (with executive

support of the technology);

2) Technology use is defined by grassroots direction (that is, letting users define

what works and what doesn’t);

3) The tools must be in the business workflow; participation must be made

mandatory so as to avoid duplicating work;

4) Participants must feel needed and be rewarded by the tools; recognize

contributors through incentives for their contributions;

5) Relevant users must push the technology; certain users need to serve as

motivation for others to participate and to enhance technology adoption; and,


6) Organizations must find a balance between risk management of content and

participation and the ability for users to participate without fear of reprisal.

Again, these six factors were derived from over 50 organizations that have implemented social technology and strategy, both successfully and unsuccessfully. These ideas can therefore be considered to draw on a large sample of organizations of various sizes, and they

serve as the basis for McKinsey’s contributions to building a successful social computing

model.

3.4.2 Source #2: Gartner

Gartner is a business consulting firm with a focus on information technology, research,

and enterprise adoption and strategy of this technology. The company has over 11,000

clients and operates in 80 countries. Gartner produces in-depth research on information

technology, including developing the Magic Quadrant analysis for technology

investment that is highly lauded among client organizations of all sizes. This and other

research methods are provided for the public sector, including consulting on national

defense and security.

Gartner Industry Research produced an article in 2009 entitled Government 2.0: Gartner

Definition, authored by Andrea Di Maio, a Distinguished Analyst with Gartner and a

veteran in government IT sector consulting. In the article, Di Maio (2009, 1) identifies

Government 2.0 as “the use of information technology to socialize and commoditize

government services, processes and data”. Of note in Di Maio’s definition is that it is a

response to multiple existing definitions of what Government 2.0 entails, where some believe it is simply the use of Web 2.0 technologies and social media in general, or describe it as "government as a platform" (2). Di Maio's definition attempts to define the phrase at a higher level that does not necessarily limit it to a particular set of technologies (although the technologies used have social applications). Additionally, it is intended to encompass the data requests and interactions between governments, their

civil servants and citizens.

Di Maio asserts that Government 2.0 has the following characteristics (Di Maio 2009, 3-4):

It is “citizen driven”: the public have the ability to transform government

information and integrate this data with sources external to government networks.

In the same vein, governments will be impacted by this same externalization of

information.

It is “employee centric”: Government 2.0 efforts require a bottom-up and

grassroots approach from employees, who are empowered to be collaborative, innovative and accountable through participatory technologies.

Government 2.0 is constantly evolving: new communities are created and old

ones disappear as social networks are built and augmented. Therefore, no circle of

employees discussing particular issues is static.

Government 2.0 is transformational: business processes are transformed by the

efficiencies found in the tools, and a two-way, transparent relationship is established with both employees and citizens.


Governments need to plan and nurture: communities need a purpose and goal for existing, and their behaviors must be evaluated so they can be leveraged into government initiatives.

Pattern-based strategies must be established: internal and external patterns in data

usage and the content of that data must be determined, along with the impact they have on organizational strategy. Identified patterns can augment existing

strategy or change internal operations within a government body.

Management must think differently: since the bottom-up approach empowers

employees, management must let civil servants and technology users determine

which tools work best, and engage in a performance-measurement and rewards

program to support it.

Since Government 2.0 includes the use of Web 2.0 tools, this Gartner publication is not

directly referring to specific technologies public sector bodies need to implement in order

to achieve effective social computing. Rather, it describes the cultural and organizational behavior needed to foster an effective environment for introducing potentially disruptive technologies such as social computing tools.

3.4.3 Source #3: Andrew McAfee

Developing an effective social computing model must not exclude theory provided by

academia. Professor Andrew McAfee (MIT) is credited with pioneering the term

“Enterprise 2.0” in his seminal piece Enterprise 2.0: The dawn of emergent collaboration

(2006). McAfee’s research around the use of Web 2.0 tools within organizations to


improve business processes and create greater value is highly regarded in both the

software and IT consulting industries, as well as among contemporary scholars in

research institutions. Enterprise 2.0 is a term growing in acceptance and use in

businesses and organizations - both Gartner and McKinsey have expanded upon Web 2.0

and Government 2.0 adoption strategies based on McAfee’s original assertions, as well as

on the expansion and sophistication of Enterprise 2.0 in businesses and organizations

since the term was first introduced in 2006. Intellipedia is also an example that McAfee often references when discussing implementations of Enterprise 2.0 technology.14

In Enterprise 2.0: The Dawn of Emergent Collaboration, McAfee identifies six

technologies that make up a roll-out of Enterprise 2.0 software (McAfee 2006):

Links: linkages exist between content and lead to the discoverability of information;

Search: users must be able to do keyword searches and retrieve results from

databases of information (the application of metadata to searchable content);

Authoring: users must be empowered to produce and edit content, and this content

is refined and corrected by the crowd-sourced nature of this process

Tags: categorization of content; the application of metadata to discoverable information by users is the process of creating a categorization system called a 'folksonomy';15

Extensions: social platforms need to extend tagging to introduce suggested material and make recommendations of other pieces of content (e.g. pushing data towards users); and,

Signals: users must be signaled to new content; content must be aggregated based on algorithmic or discernible patterns of use by individual or group users.

14 McAfee operates a blog called Andrew McAfee's Blog: The Business Impact of IT. There are multiple articles in which he interviews USIC employees and references Intellipedia and other social computing tools. For more, see http://www.andrewmcafee.org/.

15 Folksonomy is a phenomenon whereby users categorize and annotate information through a grassroots effort. The process is also called "social tagging". For more information, see Peters, Isabella, and Paul Becker. 2009. Folksonomies: Indexing and Retrieval in Web 2.0. Berlin: De Gruyter/Saur.

While these pieces of functionality are now common among major enterprise social

software platforms (e.g. Microsoft SharePoint, Lotus Notes, Lotus Connections, Mango

Apps, etc.), those listed by McAfee are critical to providing an organizational experience

using Web 2.0 technology.
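As a rough illustration of how several of these components (authoring, tags, search and signals) can interlock, the following sketch implements a toy content store in Python. It is not a description of Intellipedia or any commercial platform; the class, article identifiers and tags are all hypothetical.

    from collections import defaultdict

    class ContentStore:
        """Toy illustration of the 'authoring', 'tags', 'search' and 'signals' components."""

        def __init__(self):
            self.articles = {}                     # article_id -> text
            self.tags = defaultdict(set)           # tag -> set of article_ids (folksonomy)
            self.subscriptions = defaultdict(set)  # tag -> set of subscribed users

        def author(self, article_id, text):
            """'Authoring': anyone can add or overwrite content."""
            self.articles[article_id] = text

        def tag(self, article_id, *user_tags):
            """'Tags': users attach free-form metadata to content."""
            for t in user_tags:
                self.tags[t.lower()].add(article_id)

        def search(self, keyword):
            """'Search': keyword lookup across text and tags."""
            keyword = keyword.lower()
            hits = {a for a, text in self.articles.items() if keyword in text.lower()}
            return hits | self.tags.get(keyword, set())

        def subscribe(self, user, tag):
            self.subscriptions[tag.lower()].add(user)

        def signal(self, article_id):
            """'Signals': notify users who follow any tag applied to this article."""
            notified = set()
            for t, articles in self.tags.items():
                if article_id in articles:
                    notified |= self.subscriptions[t]
            return notified

    if __name__ == "__main__":
        store = ContentStore()
        store.author("wiki/port-security", "Draft overview of port security reporting.")
        store.tag("wiki/port-security", "maritime", "terrorism")
        store.subscribe("analyst_a", "maritime")
        print(store.search("maritime"))            # {'wiki/port-security'}
        print(store.signal("wiki/port-security"))  # {'analyst_a'}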

Similar to the theories of McKinsey and Gartner, McAfee also listed organizational

requirements to make the technology effective (McAfee 2006, 25-26). First, these social

software tools must be easy to use, so as to promote interest in and consistent use of them.

Second, the tools should not be imposed, but rather organically shaped by the users to

derive their maximum usage. This assertion is similar to the second success factor

discussed by the McKinsey article, in that the technology grows to fit the user, rather than

upper management imposing pre-conceived notions of 'what works'.

McAfee continues by asserting that, while managers should not impose hard rules on

content creation, they do play a role in creating a culture of use (26). First, a receptive culture is cultivated by managers promoting a "collaborative culture" and not penalizing those

who move horizontally across organizations for information. Second, management must

provide a common platform for collaboration, rather than multiple instances of “small”

environments – for example, a wiki must be a company-wide wiki rather than team-specific, although teams can have their own pages within the larger wiki deployment.

Third, a roll-out of social computing software must be done in small increments, and

expectations of usage must start small. These technologies can be disruptive to

conventional workflow models, and so must be implemented incrementally and with

transparency so that users can anticipate upcoming changes in information sharing practices.

Finally, managers and prominent users must express support of the tool and use it

themselves to encourage others to participate.

3.4.4 Source #4: The Markle Foundation

The Markle Foundation is the fourth source used to build an effective social computing

model. As described earlier in this chapter, the 9/11 Commission Report based many of

its findings and recommendations on the MTF’s work. The Markle Foundation

recognizes that a “virtual reorganization” is now being undertaken by the USIC and that

information sharing procedures have grown considerably since the events of 9/11 (Ten

Years After 9/11: A Status Report On Information Sharing (Statement of Zoe Baird

Budinger and Jeffrey H. Smith) 2011). The MTF also recognizes social networking and social computing software as products of the effort to develop a better information sharing environment, and supports them as part of a general IT trend (4).

In all, the Markle Foundation wrote four reports regarding the creation and

development of an information sharing environment in the USIC between 2002 and 2009.

Their reports are recommendations on policy and technology issues that affect the

creation of trusted information environments. Additionally, many of the principles in the

reports have been adopted by Congress into official legislation (e.g. IRTPA (2004) and

Intelligence Community Directive 501 (Markle Foundation 2009c, 9,12)). However,

75

progress to embed their recommendations into legislation is ongoing (1). Many of the

recommendations have not been translated from approved policy into action, and as such,

the Markle Foundation continues to make recommendations based on better information

sharing best practices and changing the “need to know” culture that still permeates.

Four areas of concentration for fostering an effective collaborative and information sharing environment are put forth by the foundation. In Meeting the Threat of Terrorism: Authorized Use (Markle Foundation 2009a), the MTF recommends implementing a mission-based authorized use standard. In this model of permission-based clearance, a user is granted access to data that pertains to the projects and missions to which they are assigned. Data is classified and tagged so that it is available to the analyst where necessary, and accessible data is aggregated and "pushed" to them. Current rules governing access to intelligence information are based on pre-9/11 standards (2) and, as such, are obsolete.
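As a loose illustration of the mission-based authorized use idea (and not the MTF's actual specification), access can be modeled as a clearance check combined with a match between the mission tags on a document and the missions assigned to the requesting user. The clearance levels, tags and rule below are hypothetical simplifications.

    from dataclasses import dataclass, field

    # Hypothetical clearance hierarchy: higher number dominates lower.
    CLEARANCE_ORDER = {"UNCLASSIFIED": 0, "SECRET": 1, "TOP SECRET": 2}

    @dataclass
    class Document:
        classification: str
        mission_tags: set = field(default_factory=set)   # e.g. {"counterterrorism"}

    @dataclass
    class User:
        clearance: str
        missions: set = field(default_factory=set)        # missions the user is assigned to

    def authorized(user: User, doc: Document) -> bool:
        """Hypothetical authorized-use test: the user's clearance must dominate the
        document's classification AND the user must share at least one mission tag."""
        cleared = CLEARANCE_ORDER[user.clearance] >= CLEARANCE_ORDER[doc.classification]
        mission_match = bool(user.missions & doc.mission_tags)
        return cleared and mission_match

    # Example: a SECRET-cleared analyst assigned to a counterterrorism mission.
    analyst = User(clearance="SECRET", missions={"counterterrorism"})
    report = Document(classification="SECRET", mission_tags={"counterterrorism", "aviation"})
    print(authorized(analyst, report))  # True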

The second report, Meeting the Threat of Terrorism: Discoverability (Markle Foundation

2009b), continues this assertion. Users need to be able to discover the information they are required to find, which means legacy data and information must be tagged and classified appropriately so that it is discoverable. Discovery should be audited to ensure accountability, and access to discovered information remains contingent on authorized use.

The third report is Meeting the Threat of Terrorism: Culture Change (Markle Foundation

2009d). This report specifically focuses not on technology, but on conventional

bureaucratic practices that have become obsolete in the context of information sharing.

The report suggests that performance metrics and incentives can help guide information

sharing culture on the correct path. Further, information sharing should become part of

performance reviews, where promotions and funding are contingent on users' collaborative efforts. Users should also be rotated through agencies, not only to understand information sharing requirements but also to spread common collaborative cultures throughout the

community. Users should also be empowered to determine preferred information sharing

tools and environments (that is, those who share information should determine which tools are most effective and where they work best).

The final report, entitled Meeting the Threat of Terrorism: Privacy and Civil Liberties

(Markle Foundation 2009e), makes assertions regarding protecting collaboration and the

data collected in the intelligence process. Data must be protected in a “clear, calibrated

and predictable system of oversight and accountability" that does not hinder those using the information for national security purposes while protecting the rights of those who are the subject of the material (2). The report asserts that technology used to govern data privacy should be sophisticated enough to maintain audit trails, build rules, and synchronize with permissions and discoverability tags when distributing information to users.

3.5 A comprehensive model for social computing and information sharing

The four sources of social computing and information sharing theories each have their

own respective assertions about what makes an effective enterprise, an effective

information sharing environment, and effective deployments of Web 2.0 tools. In the

context of social computing, all three of these concepts are linked. The USIC has had its

challenges in addressing each of these components in the post-9/11 world.

The purpose of building a framework from the assertions of the four sources is to produce

a working model with which to measure the effectiveness of the USIC's use of social computing tools. The tools exist in an environment in which information sharing is a legislated and recognized goal of the USIC, and these tools are used to help attain this goal.

Using only one of these models for measuring the USIC social computing tool use would

not be as effective. The McKinsey & Company article is largely targeted towards private

sector organizations, as the sample used to draw lessons on effective social computing was made up entirely of private sector deployments (although deployment of enterprise social

platforms will have similar challenges and lessons in the public sector as well). The

Gartner article by Di Maio introduces the particular challenges and considerations of

Web 2.0 principles in the public sector environment, but the article focuses on solutions

that are both citizen and employee-centric, and its assertions are likely geared towards

more open public sectors that do not have elevated security requirements like those of the

USIC or national intelligence agencies. The McAfee article, like the McKinsey article,

provides insight into successful Web 2.0 and Enterprise 2.0 deployments, but the article

is dated. The risk is that, by the very nature of Web 2.0, information sharing tactics and

techniques change over time (although the principles McAfee explains are consistent as

Enterprise 2.0 grows in usage). Additionally, McAfee's work does not account for public sector deployments, which can differ from private sector deployments.16, 17

16 There are differences between deployments of Enterprise 2.0 and Government 2.0 solutions, including information security risks, managerial and administrative changes, inter-agency collaboration expectations, bureaucracy, budgetary matters, stakeholder expectations, and other considerations. For more detail, see Radick, Steve. 2008. "What Makes Government 2.0 Different from Enterprise 2.0?" Social Media Strategy (blog), October 13, 2008. http://steveradick.com/2008/10/13/what-makes-government-20-different-from-enterprise-20/.

17 Additionally, academic analyses of public sector deployment differences can be found in Hollingworth, Anthony, and Tim Cooper. 2011. "Why Government Must Do Better With Web 2.0." Public Manager 40 (3): 30. http://ezproxy.lib.ucalgary.ca:2048/login?url=http://proquest.umi.com.ezproxy.lib.ucalgary.ca/pqdweb?did=2557233681&Fmt=2&clientId=12303&RQT=309&VName=PQD.

The

Markle Foundation’s recommendations for better information sharing are specifically

targeted towards the U.S. national security services and intelligence agencies, but do not

focus on operational and tactical technologies that are the premise of this paper. The

Markle Foundation reports explain challenges and recommendations within a community

and strategic approach; the use of social computing tools is only a supporting factor for better information sharing and part of a larger set of recommended solutions for

improving intelligence processes.

The most prominent theme found in all of the sources is that of information being

discoverable. In the McKinsey article, points #3 and #5 are the most pertinent to

information discoverability. The authors assert that when Web 2.0 tools are integrated

into daily work processes, people are more apt to find what they need using these tools.

If these tools are integrated into business workflows, then data sources are manipulated

because it is part of the mandatory process (which ties into McKinsey’s point #1

regarding bottom-up use of the technology). Additionally, point #5 refers to participants

of the technology leading the discussions and information sharing efforts. These

‘champions’ are expected to drive users towards contributing and refining content, which

creates information in the community realm that is accessible to those with the appropriate access. Information therefore becomes discoverable. The Gartner article touches on

information discoverability in its pattern-based capabilities assertion, where employees

can access information that is relevant and timely through internal and external

operational capabilities (Di Maio 2009, 4). McAfee’s article and the Markle Foundation

both make overt points that information discoverability is central both to adopting Enterprise 2.0 tools and to enhancing a trusted information sharing environment. They discuss appropriately tagging data so that it is searchable and discoverable, and providing tools that users (analysts) can adopt without difficulty. USIC

information sharing proponents have also made the case that discoverability of

information is paramount to the success of Web 2.0 technology adoption and social

computing tools in general (Rasmussen 2010).

All of the sources emphasize empowering users to provide grassroots, bottom-up

population of information sources, rather than top-down implementation of content

requirements. This assertion is central to Web 2.0 technology: social computing tools are

participatory in nature, and thus, larger numbers of users contributing content generates

greater business value from the technology. The McKinsey, Gartner and McAfee articles

all assert that social computing efforts must be driven by those who directly use the

technology, and that information is edited and refined by other users (crowdsourcing).

The MTF reports stress that better information sharing is the responsibility of the analysts

who provide the information to policy makers, consumers, and management that support

their efforts. However, better analysis through improved information sharing efforts

ultimately rests on the capabilities analysts have to perform their jobs properly.18

Information sharing must also be incorporated into the workflows of users who engage

the technology. The McKinsey article explains the necessity of this component in factor

#3, which describes that participatory technologies have the "highest chance of success" when incorporated into a user's daily workflow (Chui, Miller, and Roberts 2009, 12).

18 The Markle Task Force reinforces in various reports that information sharing environments must be

supported from top-down direction, including the President, the Director of National Intelligence and

Congress. Information sharing at the bottom levels cannot be sustained without administration support.


Additionally, Di Maio makes the case that commoditization of commercially-available

(“off-the-shelf”) products incorporated into daily public-sector workflows would support

crowdsourced models of information processing (Di Maio 2009, 3). McAfee’s Enterprise

2.0 business model is intended to be used as a “platform” rather than simply information

channels; platforms allow content to be generated, approved and edited by users, as

opposed to simple “channels” of information (McAfee 2006, 22-26), such as emails and

instant messaging alone. This model is further explained by McAfee to support

“lightweight workflow” in organizations and using Enterprise 2.0 technology in an

integrated structure with conventional enterprise applications (McAfee 2010). Finally, the

Markle Foundation’s reports are part of a larger, USIC-wide recommendation to change

internal information sharing systems to focus on decision-making and users' goals rather

than simply being used for “exchanging data” (Markle Foundation 2009c, 19). The MTF

also asserts that when users demand better information in these new systems, better

information workflows are likely to emerge to meet these needs (19).

All four sources also recommend that in order for these social computing tools to be

effective and adopted in the workplace, management support is required. One common

theme among all sources is that management must act as ‘champions’ of the technology

and also promote a culture of technology use. In addition to the six factors of successful

Web 2.0 adoption being leadership imperatives, Chui, Roberts and Miller recommend

that organizational leadership needs to encourage its use (point #1), as well as being responsible for mitigating risk through strong governance of usage policies (point #6) (Chui, Miller, and Roberts 2009). Di Maio argues that successful Government 2.0 practices require a more liberal management style to empower employees without traditional hierarchical bureaucratic processes (Di Maio 2009), while McAfee argues that

management needs to provide the structure to avoid “information anarchy” (McAfee

2006, 27). The Markle Foundation also recommends that leadership in public

administrations, including the ODNI and the White House, needs to reaffirm information

sharing as a top priority, including coordinating all information sharing policy and

implementation across intelligence, law enforcement and homeland security communities

(Markle Foundation 2009c, 10). It is also a consistent recommendation by all four

sources that management needs active incentive programs to encourage the use of social technology among users, and should use metrics on users' contributions to social computing technology as part of performance reviews.

These commonalities provide a basis for a comprehensive model of effective social

computing, both in terms of technology and in terms of organizational governance and

behaviour. With these considerations, a model can be devised from common principles

among these sources that underlie effective social computing in organizations. Similarities exist among all the sources regarding the nature of effective social computing and information sharing practices.

In summary, the following factors will serve as a model with which to analyze the USIC's social computing strategy:

Information must be “discoverable”;

Content must be driven by bottom-up forces, not from top-down imposition of

management;

Information sharing works better when tools are incorporated into business

workflows to produce information;


A dedicated group of users must ‘champion’ the use of the tools to serve as

leaders for proactive use (management or otherwise); and,

Performance measurements and incentives must be available for tool use.

This model combines similar factors into an effective social computing model for the USIC, drawing together effective measures for organizations, public sector bodies,

and the U.S. national security environment. The next chapter will apply this model to the

efforts of the USIC in social computing usage and strategy.


CHAPTER 4: ANALYSIS

This section will apply the model devised in the previous chapter to the social computing

tools used in the USIC - tools that exist because of the community’s larger information

sharing and intelligence strategies created after the attacks of September 11, 2001. The

chapter will largely focus on applying the model to the community's two largest tools,

Intellipedia and A-Space, including analyzing the results of the two pilot projects

produced by Dixon and McNamara, as well as other relevant secondary sources. This

thesis acknowledges that supporting social computing tools introduced in Chapter 2 are

also present on the different secure networks, including NIPRNet, SIPRNet and JWICS.

However, on Intelink these serve as secondary social computing tools supporting Intellipedia and A-Space. In addition, Intellipedia and A-Space have the most publicly

available information on their usage; most metrics and analysis on the supporting tools

are not as prevalent in open sources such as industry trade articles, books or academic

studies.

The previous section proposed five factors for an effective social computing model:

Information must be “discoverable”;

Content must be driven by bottom-up forces, not from top-down imposition of

management;

Information sharing works better when tools are incorporated into business

workflows to produce information;

A dedicated group of users must ‘champion’ the use of the tools to serve as

leaders for proactive use (management or otherwise); and,


Performance measurements and incentives must be available for tool use.

Ultimately, the purpose of a model is to assist in identifying the information sharing

challenges that continue to exist after the events of September 11, 2001. Even since the attacks and the publishing of the 9/11 Commission Report, recommendations put forth into policy (e.g. the IRTPA (2004) and the 9/11 Commission Recommendations Implementation Act (2007)) have not been fully implemented into action, a gap compounded by conflicting legislation and the absence of a community-wide adoption strategy.19 These concerns are reinforced by community and industry professionals and organizations who claim that these tools are stagnating and have reached their maximum effective use.

4.1 Discoverable Information

Discoverability is one of the five success factors that the USIC's social computing strategy must not only implement in policy, but also adopt in practice. Compartmentalized information that was neither discoverable nor 'pushed' outwards towards

agencies was one of the primary criticisms of information sharing deficiencies outlined in

the 9/11 Commission Report. The “need to know” culture and information “stove-piping”

resulted in missed opportunities to discover information that other agencies may have had

on similar intelligence (National Commission on Terrorist Attacks upon the United States 2004, 417).20 Similarly, in a more recent incident, discoverable intelligence might have helped prevent Umar Farouk Abdulmutallab from boarding Flight 253 in 2009. President Barack Obama cited the "mix of human and systemic (intelligence) failures" that kept the USIC from thwarting the incident before it happened as "completely unacceptable" and identified that information existed but "should have been pieced together" (Meyer, Nicholas, and Semuels 2009).

19 The Markle Foundation, in their recommendations, determined that gaps exist between policy vision and implementation of the legislation's acts. For more detail, see Markle Foundation (2009c).

Intelligence Community Directive 501 was issued in 2009 to make “discovering”

information a community policy. The Directive made “all information collected and all

analysis produced available for discovery by automated means” (ODNI 2009b, 2). It also

requires agencies to subscribe to the “responsibility to discover” and “responsibility to

provide” model of information sharing for mission-critical needs (2). Therefore,

discoverability of information has already been acknowledged and approved by the U.S.

government as a means to improve information sharing in the USIC. Additionally,

subsequent programs, such as the AT program and the creation of a national information sharing strategy under a larger national intelligence strategy, serve as supporting measures to promote information discoverability. Agencies are also mandating their own

information discoverability programs. The Department of Defense’s Defense Discovery

Metadata Standards, the CIA’s Electronic Record-Keeping System (ERKS) and the FBI’s

Records Management Architecture frameworks offer best practices on applying metadata

to their respective in-house, agency-centric data. These programs are intended to keep a measure of control over data (sensitive or otherwise) while ensuring it is properly classified and indexed in legacy databases.

20 The 9/11 Commission Report used the NSA's withholding of information on hijacker Nawaf al Hazmi as an example of information sharing gaps.

Discovering information is in the process of becoming a reality in the USIC through a

number of initiatives. The community employs one of the most powerful search engines

on its’ internal networks – Google Search Appliance. The USIC is working with Google

to develop search and discoverability features that cater particularly to the USIC’s large

information demands (Schroeder 2011, 19). As of 2009, over 92 million documents have

been indexed and community staff conducts over two million queries on the search tool

per month (Hoover 2009). The AT program also provides solutions for analysts to

organize large volumes of data and improve the quality of analysis through better training

standards and technology use. In other words, the program seeks to “change how

(intelligence analysts) approach analysis” with a focus on better interagency collaboration

(Lowenthal 2008a, 102).

The Dixon and McNamara pilot projects have indicated that discovery of information is a

large component of using both Intellipedia and A-Space. In the Intellipedia project, part

of the appeal for users to engage in using the service was being able to find the

information they were looking for. While the process of searching and finding

information has been improved because of crowdsourced content (as will be analyzed in

the next section), Intellipedia, however, still operates as a “pull intelligence system.” This

means that a user must know exactly what they are looking for when conducting a search;

there is little software support for data to be pushed from backend databases (both

internal and cross-agency databases) proactively to Intellipedia's user interface. There are few data mining standards and limited content aggregation capabilities on Intellipedia. A-Space, however, does offer some capabilities for "pushed" intelligence. RSS feeds are enabled on the platform, and a standard called GeoRSS allows users to tag RSS feeds with spatial information that can be displayed in map readers, such as Google Maps (Joch 2009).
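A GeoRSS-tagged entry is, at bottom, an ordinary RSS item carrying an extra geographic element. The sketch below builds and reads one such entry with Python's standard library; it is illustrative only and not drawn from A-Space, and the feed title, item and coordinates are invented.

    import xml.etree.ElementTree as ET

    # Register the GeoRSS namespace so the output uses a readable prefix.
    ET.register_namespace("georss", "http://www.georss.org/georss")

    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Hypothetical analyst feed"

    item = ET.SubElement(channel, "item")
    ET.SubElement(item, "title").text = "Port activity report (illustrative)"
    # GeoRSS 'point' element: latitude then longitude, space-separated.
    ET.SubElement(item, "{http://www.georss.org/georss}point").text = "6.45 3.39"

    xml_bytes = ET.tostring(rss, encoding="utf-8")
    print(xml_bytes.decode())

    # A map-aware reader extracts the coordinates the same way:
    parsed = ET.fromstring(xml_bytes)
    point = parsed.find(".//{http://www.georss.org/georss}point").text
    lat, lon = (float(v) for v in point.split())
    print(lat, lon)  # 6.45 3.39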

The GeoRSS feed introduces an important element to information being discoverable.

The process of tagging products, discussions or other data empowers users to

attach relevant keywords and create a layer of searchable data. Whether the data being

tagged is a discussion on A-Space or a digitized PDF file of an intelligence report

produced 50 years ago, tagging enables documents and data to be found. Tagging and

applying metadata helps raise situational awareness of a given topic. Information that is

tagged or has metadata can be beneficial not only for search purposes, but also through

other intranet services and functionality, such as using a tag cloud or other software

‘widgets’ based on algorithmic formulas to push otherwise obscured information outward

to analysts or users. However, while metadata is beginning to be appropriately applied to intelligence products, such information is still not being propagated towards users to

the extent that metadata can be useful for organizational benefits (Ben Eli and Hutchins

2010; Ackerman 2007), nor has it been applied to the entirety of intelligence databases in

a properly governed manner (Joch 2009).
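As an illustration of the kind of lightweight 'widget' mentioned above (not a description of any USIC system), a tag cloud can be produced simply by counting how often each user-applied tag occurs and scaling a display weight by relative frequency. The tags below are invented.

    from collections import Counter

    def tag_cloud(tag_lists, min_size=1.0, max_size=3.0):
        """Return a display weight for each tag, scaled by its relative frequency."""
        counts = Counter(tag for tags in tag_lists for tag in tags)
        most = max(counts.values())
        return {
            tag: min_size + (max_size - min_size) * count / most
            for tag, count in counts.items()
        }

    # Each inner list represents the tags one analyst applied to one item.
    tagged_items = [
        ["maritime", "west-africa"],
        ["maritime", "piracy"],
        ["piracy", "maritime", "finance"],
    ]
    for tag, weight in sorted(tag_cloud(tagged_items).items()):
        print(f"{tag}: font scale {weight:.2f}")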

In a similar vein, discovering information is a possible solution for preventing legacy or

older content from becoming obscure or irrelevant due to the amount of information

available to USIC analysts. There are over 50,000 documents produced annually, many

of which are never used or overlap (Rasmussen 2010). Similarly, there are large amounts of raw and finished, current and long-term intelligence sitting in databases across USIC agencies. Analysts, without proper technology, can overlook content which has already

been produced, or simply do not make the connection between multiple sources of similar

information (e.g. information held on Umar Abdulmutallab). Data management and

warehousing have been a large challenge for the USIC, especially considering that overall

information architecture was not designed for cross-agency data consolidation when

systems were set up before 9/11 and before the IRTPA in 2004. This has resulted in a

sprawling, loosely-connected and uncoordinated intelligence data structure that has

created duplication in intelligence and redundancy in the systems and databases that store

them (Rasmussen 2010; Priest and Arkin 2010). After the events of Flight 253, Obama

commented that data collection is not the issue; how to make sense of the information already held was the primary challenge to better intelligence (Ben Eli and Hutchins 2010, 5).

Three services have been created within the AT program to improve discoverability of

data. Catalyst is a program that uses metadata to correlate and discover information from

multiple intelligence sources, without combining all the sources into one central

repository. It tags intelligence products with metadata such as names, places and people

through various software services (e.g. A-Space, iVideo, Gallery), and will grow as

platforms become further integrated with stand-alone applications for doing intelligence

analysis (e.g. software services being offered on Intelink). The second service is the

Analytic Resources Catalog, which is a database maintained by the ODNI that collects

contact data on IC analysts, and also includes their expertise, years of experience, and

projects completed, so as to avoid assigning analysts tasks that may already have
been assigned or completed elsewhere (this service also ties into having a user

account on A-Space (ODNI 2008a, 13)). The third service is the Library of National

Intelligence (LNI), which was created to be a central repository to host all finished

intelligence products, which are searchable and retrievable based on security-based

permissions (e.g. unclassified, secret and top-secret). Over 20,000 products are being

added per week from various agencies (7).
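The kind of metadata correlation a service such as Catalyst performs can be sketched as an inverted index over tags, built without merging the underlying repositories; the records below are hypothetical and do not reflect the actual Catalyst implementation.

from collections import defaultdict

# Hypothetical metadata records held in separate agency repositories.
records = [
    {"id": "A-001", "repo": "agency-a", "tags": {"Umar Farouk Abdulmutallab", "Yemen"}},
    {"id": "B-042", "repo": "agency-b", "tags": {"Yemen", "aviation"}},
    {"id": "C-007", "repo": "agency-c", "tags": {"Umar Farouk Abdulmutallab", "visa"}},
]

# Build an index from tag -> record IDs; the source documents stay where they are.
index = defaultdict(list)
for record in records:
    for tag in record["tags"]:
        index[tag].append(record["id"])

# Any tag shared by records from different repositories is a cross-agency lead.
for tag, ids in sorted(index.items()):
    if len(ids) > 1:
        print(f"Correlated on '{tag}': {ids}")

The design point is that only metadata moves: the correlation layer surfaces connections across agency holdings while each repository retains custody of its own products.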

From a strategic perspective, discoverability of information appears to be improving in

the USIC. The multiple programs that exist, at both the IC (ODNI) and agency
levels, show a clear trend towards better content organization.

Through this organization come better discoverability practices, and applying metadata

and empowering users to contribute through tagging supports this general trend.

Additionally, the USIC has proposed and implemented legislation and policy that make information
discoverable, using language such as the responsibility to "discover information" and the
abandonment of "need-to-know" habits. Through metadata standards and ODNI and agency

support, users can better find data, both in terms of finished intelligence products and

ongoing drafts, discussions and content not yet complete in the intelligence cycle.

However, it remains to be seen if discovering information is simply a result of the tools

available (especially those delivered through social computing efforts), or reflects a sustained effort to
reform the culture of agency information "stove-piping". Additionally, there is a lack

of publicly available metrics of how these programs have improved the quality and
timeliness of intelligence products, and the large number of databases and their content
continue to pose a large-scale challenge for making all content discoverable (community
experts on social tools Sean Dennehy and Don Burke have expressed doubt that all data
will ever be discoverable (Ackerman 2007)). The Markle Foundation

suggested that “data finding data” during the Northwest 253 scenario might have been an

antidote to terrorist threats such as these (The Lessons and Implications of the Christmas

Day Attack: Watchlisting and Pre-Screening 2010). The USIC continues to suffer from

information overload and not being able to connect the dots through discovering

information, which can hinder thorough analysis and lead to risks such as Umar Farouk

Abdulmutallab slipping through detection.

Social computing tools such as A-Space and Intellipedia are built to integrate these

expanding discoverability standards, and are already delivering benefits for users.

However, work is still needed to implement these discoverability standards, and it will be

a gradual progression towards full discoverability rather than an instantaneous fix. As

such, the USIC is successfully adopting discoverability of information into its social

computing strategy, technology and organizational culture, albeit at a slower

progression than may have been envisioned by policy makers within the community, and

with infrastructure and technical challenges.

4.2 Crowdsourcing from the bottom-up

Calvin Andrus proposed that adopting "self-organizing" technologies to improve
information sharing (Andrus 2005) was the key to better intelligence analysis in the
USIC. He argued that information sharing starts with intelligence officers who work

with the information on a daily basis and are the ones who need to connect with other

agencies to build comprehensive, quality intelligence. People, not the technology, are the

main resource for better information sharing, and emerging technologies to share

information and connect community members must be used to improve on the

information sharing deficiencies that prevailed before and after September 11, 2001.

Crowdsourcing enables people to participate in creating content, empowering them to

build better intelligence that can reach all corners of the USIC.

There are two elements in information crowdsourcing to consider. First, that people,

including all users regardless of whether they are front line staff, mid-level managers or

senior officials, are expected to participate in generating content. The second is that

content must be created organically: discussions and information are the responsibility of

the participants, regardless of the level of hierarchy in the intelligence community

structure. That is, front-line staff, analysts, managers and officials can provide content

without vetting through the top levels of organizational hierarchy (other than to simply

“participate”).

Based on the Dixon and McNamara reports, crowdsourcing content through A-Space and

Intellipedia is showing signs of growing in acceptance and reliability. Users of both of

these systems are responding to the functionality these tools offer by engaging in

crowdsourcing, even if the effort seems minimal or complementary to the larger pieces of

intelligence they produce through conventional streams of intelligence collection,

analysis and dissemination. The observations found in How A-Space is Shaping Analysts'
Work (Dixon 2009) show that users who actively engaged in the social networking tool
did so to discover and contribute to conversations and to find contacts in ways that other
tools had not previously allowed. Both the studies and those who helped develop these
tools consider them a disruptive technology that can change the methods of producing
intelligence (Dixon and McNamara 2008).21 Additionally, the introduction of this functionality is aligned with the growth of

mainstream social computing websites such as Wikipedia and Facebook as well as the

changing demographics of the USIC workforce, both important factors in influencing

adoption rates and the value of tools that produce crowdsourced content (Ackerman

2007; Dixon and McNamara 2008; Dixon 2009).

Multimedia is also an important factor in building crowdsourced content. Services

available on Intelink such as iVideo and Gallery enable users to upload, tag and comment

on videos and images that are relevant for intelligence personnel to build intelligence

products. Whereas Intellipedia and A-Space are primarily used for information sharing

(e.g. textual data), multimedia brings visual and spatial elements that users require for

building better intelligence. Other text and document-based tools, such as IntelDocs,

iNews, Intelink Blogs, microblogging services and the social bookmarking tool tag|Connect
also provide avenues for USIC personnel and intelligence analysts to share content,

opinions and insight. Many social computing tools available in the USIC are conducive to

crowdsourcing content and empowering personnel to collaborate for better information

sharing using Web 2.0 technology.

Crowdsourced material and efforts from the bottom up do not always pertain to
producing products – participating in content generation is also about the discussion,
debate and collaboration involved in product production. Therefore, central

software platforms are ideal for these user-engagement activities to take place. Dixon and

21 The “disruptive technology” argument was found in Dixon and McNamara’s Intellipedia project, while

other experts consider it disruptive more for veteran analysts and personnel than for younger
generations of intelligence staff who understand and use the technology more often (Ackerman 2007).

McNamara (2008, 15) indicated that users were in favor of working with a single site to

access "everything they needed". A-Space users expressed similar sentiments: they were
unsure how the content generated in A-Space fits with the current processes for
intelligence production, and reported some confusion over the number of social
computing tools available for use (Dixon 2009).

Modern enterprise software platforms are intended to create more value for end users

who ultimately use the platform as their central location to conduct business. Therefore,

greater value is derived from these systems when users can reduce their reliance on other

systems or stand-alone applications to build and retrieve information for intelligence

collection and analysis. A-Space and Intellipedia are considered platforms in that they are

tools for building crowdsourced information and accessing data from other analysts

across USIC agencies. Platforms which are conducive to both creating new information

and accessing existing information then become conducive to business models that

support crowdsourcing efforts. Major enterprise platforms such as Microsoft SharePoint,

Jive Engage and IBM Lotus Notes offer much of the collaborative functionality that

crowdsourcing requires to be effective, including wiki and social networking capabilities.

As described in Chapter 2, A-Space is built on Jive Clearspace (Engage) technology, and

Intellipedia is built on the MediaWiki engine. Therefore, these major social computing
tools are built on platforms that are conducive to valuable information sharing practices

using social computing technology. Further, these platforms are integrated into each of

NIPRNet, SIPRNet and JWICS, each of which hosts Intelink as a larger platform for
accessing the various social computing tools.

The platform approach is also reinforced when software and enterprise systems (such as
Intellipedia or A-Space) serve as central sources of information that can

aggregate data from other databases. As Dixon and McNamara have shown through their

pilot project, Intellipedia, in particular, can act as an “information marketplace” for

central access to knowledge repositories (Dixon and McNamara 2008, 12). An emerging

behavior towards treating information as a marketplace means that a variety of

information products (including finished intelligence such as Defense Analysis Reports

(DAR), Executive Highlights and National Intelligence Estimates as well as informal

discussions and draft workspace data) will be constantly refined to produce the most up-

to-date sources for USIC analysts to access and use. Once the marketplace becomes

the central point of access to retrieve intelligence, static documents sitting in

agency repositories (often duplicate data) will become available for access and refined for

use. Data that would otherwise be left in obscurity will filter into central data streams,

and users will refine that which is redundant, obsolete, or of little intelligence value. This

information refinement falls in line with the U.S. federal government’s data-consolidation

strategy (the Federal Data Center Consolidation Initiative), which aims to close 1,080

data centers across the government by 2015 (Hoover 2011). It also serves as a basis for

making information discoverable, as the previous section identified.

Another element of crowdsourcing material in the intelligence community is that

“finished” intelligence becomes a constantly evolving, dynamic piece of information,

rather than an un-editable, static product. Previously, such material was essentially a

“snapshot” of information on a particular topic – once new developments or events

happened on the topic in question, the finished intelligence product became obsolete.

Using new crowdsourced information and the ability to push new and existing content to

support intelligence topics, analysts can create a 'real-time picture' of a topic that

would otherwise have to be sourced from numerous databases and agency material.

Dixon and McNamara (2008) discovered that users are making new intelligence out of

finished intelligence products, suggesting that there is a common perception among

Intellipedia users that finished intelligence products produced with conventional methods

are not, on their own, adequate to serve customer demands or to complete project goals.

Crowdsourced content, however, does not come without risks. The pilot projects have

both suggested that user-generated material on these platforms poses risks, especially

under a national security banner. Primarily, both studies confirmed that, while users are

keen to use these technologies to improve their production and intelligence quality, there

is an underlying sentiment of risk with informal conversations and pooling information.

Posting content which is sensitive or classified in nature may compromise the security

clearance level that an individual has – some people may find they have access to

information or discussions that do not pertain to their project, and thus, security issues

present themselves (even if A-Space and Intellipedia have stringent security access

measures). Additionally, these discussions and information must be managed and

monitored to minimize redundancies in content being generated. Enterprise platforms that

support crowdsourcing material must be designed with strong permission-based access

protocols, but also a comprehensive information architecture strategy to consolidate

efforts in managing user-generated content. While A-Space exists only for analysts with

top-secret clearance on JWICS and Intellipedia on all three security networks, it is

intelligence created (and retrieved) in each of these internal networks that must be

monitored. Failure to prevent personnel from accessing information they are not allowed

to access on secure networks can result in sensitive material being stolen or leaked,
which was the case when the whistle-blowing website Wikileaks received thousands of

diplomatic cables stolen by a low-level USIC employee from JWICS and SIPRNet

(Zetter 2011).

A second inherent risk in letting users manage intelligence content is that such content

may be factually false, or incorrectly marked as reliable. Someone posting incorrect
information in a wiki or on a social networking service, or attaching incorrect tags or
metadata to finished intelligence, whether maliciously or not, may produce inaccurate

assessments or faulty intelligence products. It is unknown if the USIC has encountered

this issue with its social computing tools, but the possibility of incorrect or unchecked

information in intelligence systems proved a realistic threat when Pvt. Bradley Manning

accessed sensitive information systems (including services on Intelink) in the Wikileaks

scenario. Untrustworthy users have the potential to damage crowdsourced intelligence
content. Users have expressed hesitancy in using Intellipedia and other social computing
tools because they are not official channels through which to produce officially "finished" or

approved intelligence (Dixon and McNamara 2008; Dixon 2009; Ackerman 2007). To

support this assertion, an attempt to develop a National Intelligence Estimate on Nigeria
using Intellipedia did not succeed because intelligence analysts did not fully trust what
they found (Joch 2009). The Markle Foundation has also determined that

authorized usage and allowing only “mission-based” authentication to users may be a

solution to information quality and security issues (Markle Foundation 2009a).

A similar risk raised by crowdsourced content is the community relying on unqualified

personnel and analysts to produce information and contribute to topics and discussions,

potentially diluting the expertise needed for intelligence products. By letting anyone

contribute content to these tools, information is left to be refined and corrected by the

community of users; there is no overall expert watching over all topics of intelligence that

personnel may be contributing to. Therefore, crowdsourcing works best if the crowd also

performs a quality assurance role; without this element, crowdsourcing is likely to

produce low-quality, possibly unreliable information. For example, branding a person a
"suspect" is a sensitive and action-provoking act in intelligence communities. If a user

begins a discussion on A-Space or starts an article on Intellipedia, and uses the term

“suspect” without fully knowing the USIC’s full definition of the term, it could create

unnecessary or unwarranted action in intelligence creation that might misrepresent the

finished product available for intelligence consumers. Although the ODNI is responsible

for defining sensitive terms such as “suspect,” the term “suspect” may mean different

things to the FBI, the NSA, or other agencies within the branches of the ODNI due to

conventional intra-agency culture. This is a potential security challenge when bottom-up

content is created.

Both DIA projects have observed, however, that despite the risks of information access

and the exposure of sensitive or incorrect data, users are optimistic about the social

computing tools they engage in. Users in both projects provided perspectives that support

the new information channels that previously did not exist, and the ability to find, identify

and connect with users who contributed other content. The rates of adoption for both A-

Space and Intellipedia are indications that users are open to the benefits these tools offer.

Additionally, improvements in technology have the potential to improve the ease of

crowdsourcing content. Platforms such as Jive Engage (A-Space), MediaWiki

(Intellipedia) and collaborative workspaces such as Microsoft SharePoint can be

integrated with open-source solutions from the software development community and the

companies that build, implement and maintain them. For example, the intelligence

community has invested in image search capabilities that allow IC personnel to draw

images and have the search engine return image results that match the drawing (Hoover

2009). Such technology would likely be integrated into existing search capabilities,

whether on the aforementioned platforms or in accessible locations on Intelink for

permitted users to find. Other technology, such as Intelink Passport, provides users
with a single sign-on authentication method for multiple applications. Capabilities

such as these are conducive to crowdsourcing content by avoiding multiple login screens

and authentication requests, which can be confusing for users and costly for IC agencies

to maintain and monitor.
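The general single sign-on pattern behind a capability like Intelink Passport can be sketched as follows; this is a generic illustration using a signed token and a shared secret, not a description of Passport's actual design.

import hashlib
import hmac

SECRET = b"example-shared-secret"  # hypothetical key shared by the sign-on service and applications

def issue_token(user_id: str) -> str:
    # The sign-on service signs the user identity once, after authentication.
    signature = hmac.new(SECRET, user_id.encode(), hashlib.sha256).hexdigest()
    return f"{user_id}:{signature}"

def verify_token(token: str) -> bool:
    # Each application verifies the token instead of presenting its own login screen.
    user_id, _, signature = token.partition(":")
    expected = hmac.new(SECRET, user_id.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(signature, expected)

token = issue_token("analyst42")
print(verify_token(token))  # True: the same token is accepted by every participating service

Under this pattern, a user authenticates once and every participating application trusts the resulting token, which is what removes the repeated login screens described above.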

The risks that challenge the USIC's deployment of social computing tools are not, on the
whole, preventing their general adoption. The variety of tools available to build and share both

text-based and multimedia content has seen positive adoption among analysts and users

across the intelligence community. From the analysis it appears that the technology is

generally in place for crowdsourcing content to thrive; however, continued "stove-piping"
of information and a lack of managerial support or direction in sharing this information are
creating lingering apprehension among users who may not trust intelligence produced

with these tools. Additionally, the tools are subject to concerns that a community with

significant amounts of sensitive data would normally have – permissions, security

clearances and agency-centric intelligence that may not be trusted fully by other agencies.

However, these are the challenges large organizations face as they continue the process of

adopting disruptive technologies internally. Therefore, while the USIC’s concerns about

crowdsourced content are warranted, they are commonplace when producing content that

ultimately has implications for policymakers and citizens alike who depend on this

intelligence to support a national security strategy.

4.3 Information Sharing in Collection and Analysis workflows

A-Space and Intellipedia are both tools that serve analysts by supporting their intelligence

analysis and production, but are not incorporated into conventional intelligence

workflows, nor are they mandatory technologies to use in the intelligence cycle. The Dixon and

McNamara projects have both shown that intelligence produced in social computing

environments does not form part of the official "workflow". The

content produced using these tools serves a complementary role to pre-existing

bureaucratic processes in producing intelligence. From another perspective, that which is

built in Intellipedia and A-Space is rarely considered the finished product; the content
serves only a supporting role to conventional methods for producing consumer-ready

intelligence, such as NIEs, DARs and Presidential Briefs. Chris Rasmussen described

Intellipedia and the social computing tools in the USIC as a large “digital water cooler” –

a non-official input stream for the major intelligence agencies such as the CIA, NSA and

FBI (McAfee 2011).

The reasons for this are varied. As detailed in the previous section, it could be that

information is not seen as "official" or as carrying an agency's seal of approval; there is no

authority over the content that may be used to produce finished intelligence, and therefore

may be found unreliable. Conversely, the intelligence may actually be vetted and quality-

checked more thoroughly than conventional, face-to-face and hierarchical intelligence

collection and analysis, but a resistance to change due to an engrained culture of agency

“stove-piping” may hold reform back. Institutionalized methods of producing

intelligence that predate September 11, 2001 may also be preventing social computing tools

from being in the collective workflow, whether from veteran leaders resisting change or

agencies acting competitively to produce intelligence better than their community

counterparts.

Forrester has determined that, as of 2011, only 22 per cent of enterprises consider their

Web 2.0 and Enterprise 2.0 tools (social computing tools) to be vital to their business,

and those that do mostly rely on one social computing tool, which does not necessarily

mean the tool is a mandatory requirement in their daily workflows to conduct business.

Organizations have also had difficulties spreading the “network effect” with these tools,

as most users only see relevance in adoption or participation based on a defined business

purpose (Nakano 2011a). USIC and community officials have identified using these tools

as part of a larger intelligence and information sharing strategy, particularly under the

guidance of the ODNI and with the efforts of former DNI Michael McConnell, who was

a large proponent of using social computing tools to improve intelligence cycle

management and analytic tradecraft (Ackerman 2008). However, social computing
tools are not the end goal in and of themselves, but rather one of many means to
improve information sharing and intelligence.

The results from Forrester show commonalities with those in the USIC who believe the tools have
"leveled out". The thrust of Forrester's research shows that social computing

tools are not mandatory requirements as part of the business workflow, but rather sit
on the periphery of conducting business in organizations. In the USIC, such
attitudes are common regarding its own deployment of social computing tools.

Rasmussen and others in the community consistently suggest that the tools have reached their

maximum usage, and that the tools need to be implemented into the official intelligence

cycle workflow to continue being relevant and effective.

Existing problems that pertain to legacy business processes include both data duplication

and wasted resources spent on producing similar intelligence. After the end of the Cold

War, the intelligence community cut back on budgets and spending, as the intelligence

community was designed to accommodate Cold War threats and bipolar security matters.

During the 1990s, the intelligence community froze its growth and expansion, and did

not hire new analysts at the same rates as during the Cold War. However, after September

11, 2001, spending once again resumed, and the revision of the way the USIC conducts

the intelligence cycle (collection, processing, analysis and dissemination) demanded that

new technologies be adopted and new practices engrained. Rasmussen suggests that

the spending surplus after 9/11 not only created duplication of data and agency efforts,

but also led to a sprawling IC structure and a fragmented intelligence process that

perpetuates siloed analytical reporting and cemented the bad habits of “stove-piping”

(Rasmussen 2010). Such problems were echoed by the Pentagon, which identified data
duplication as a problem stemming from supporting two wars since 2001 (Ferguson 2010).

Social tools sitting outside official intelligence cycle workflows was also a concern of

those using Intellipedia and A-Space in the Dixon and McNamara projects. As described

earlier, Intellipedia users are using the tool to create new intelligence from finished

intelligence, or are using finished reports on Intellipedia as starting points for new

intelligence creation. However, Intellipedia sits on the outside of conventional workflows

to produce intelligence – any production conducted with this tool is from pro-active users

who are attempting to maximize usage of the tools. Still, there has been no major change

in the way USIC personnel conduct their work. An industry professional in knowledge

management asserted that Intellipedia "has yet to transform intelligence production. It's a

matter of changing how work is viewed. If you keep closed-loop work sacrosanct and tell

people ‘share when you’re done,’ you’ll get minimal compliance and near-zero value”

(Krzmarzick 2010).

A-Space users felt that the service was better suited for “testing ideas” rather than

actually creating finished products. Any attempts to produce finished products were the

results of a limited number of pro-active users championing discussions among other

analysts for use in conventional analysis workflows. Most analysts did not

coordinate products through A-Space. Observations drawn from the project detailed that

the service was good for peripheral tasks, but not actual intelligence coordination and

production. Dixon considers that analysts use A-Space in this manner because they do not

feel free to alter the current process of conventional workflow, but are comfortable using

it as a supporting tool for improving their outputs in these workflows (Dixon 2009, 14-

15).

Complicating the use of these social tools in official intelligence workflows is the fact

that they sit on various security networks. Content that is produced on one system does

not automatically filter down to a network with lesser required clearance. Therefore,

content on Intellipedia is often spread out, and the most popular and widely-used version

is on JWICS, which only allows top-secret access. A resulting problem is that work may

often get duplicated on the same service, simply because of the authentication and server

infrastructure that Intellipedia is built on. As identified earlier, there may be issues with

trusting the content or “owning” that which an agency produces (Schroeder 2011, 22-23),

and an agency may therefore not be open to using a joint production process with other agencies using

these social tools.

Similarly, A-Space is only available on JWICS, and thus important conversations that

others in the USIC may need to access on this service are limited to only intelligence

analysts with top-secret clearance. Collectors, processors and consumers are withheld

from accessing this information, and thus, only a narrow vein of intelligence personnel

can use a system that may be just as beneficial to others (Cermani and Engel 2010, 99).

Managerial review, high amounts of coordination among both registered and non-

registered users of A-Space and the difficulty of achieving (or engaging in) virtual peer

review are also reasons why A-Space does not function within mandatory intelligence

production workflows (99).

Social computing tools that exist on multiple networks requiring multiple channels for

holding discussions and sharing information complicate the overall use of these tools as

a package of technology services. As the Dixon project (Dixon 2009) showed, users

desire a central environment or platform to work from; having to work with multiple

authentications and limited access to information based on permission-level or system

access is not conducive to having an effective production workflow for intelligence.

Difficulty in using these social tools is increased when users are expected to use multiple

sources of information. Therefore, although these tools exist and are available on Intelink

(and its various versions), the current environment is not conducive to using a central,

one-source access point for producing intelligence.

Whether the production workflow of intelligence involves social tools as mandatory steps

in the process or not, the tools serve as technology designed to improve analytic

tradecraft. While not in the scope of this thesis, there has been much research and debate

both within the USIC and in academia on the effectiveness of the intelligence cycle and

improving analytic tradecraft for producing better intelligence. However, the ODNI has

made efforts to improve intelligence and analysis. The Analytic Transformation program

was intended to “shift longstanding intelligence operations in the direction of greater

collaboration” (ODNI 2008a, 4). A-Space and Intellipedia are considered part of this

program.

The tools appear to enable conducting peripheral tasks that are supportive of intelligence

processes in place, but are not incorporated into the official production workflows. They

enable discussion, content sharing and social networking, and analysts and users are

optimistic that they can benefit their work. The tools also enable collaboration and idea

sharing and information quality refinement, but these are done voluntarily by active users.

In Dixon (2009), the author recommends that tracking work done in A-Space more

thoroughly, or experimenting with actual analytic production in A-Space, may prove successful.

However, such experiments would require a high degree of analyst coordination beyond

simply using the technology, and therefore, without upgrades to the way A-Space

processes information or supports the workflow, it may never be useful for creating

finished intelligence. Therefore, the requirement for using social computing tools in

analysis workflows to produce intelligence is not satisfied according to the effective

social computing model. The next section elaborates on the use of champions of the

social tools as applied to the effective social computing model.

4.4 Champions

According to Chui, Miller, and Roberts (2009), an executive reporting to the Chief

Information Officer (CIO) at defense aerospace contractor Lockheed Martin became one

of the largest supporters of blogs and wikis when they were introduced into the

organization. This executive evangelized the benefits of Web 2.0 technology to not only

front-line staff and users, but also senior leaders who were otherwise apprehensive about

engaging in these new work tools. He established his own blog, set goals for

adoption of the technologies across the organization, and participated in wikis and user-

generated content to encourage others to use them. This resulted in widespread usage of

the tools and increased collaboration across the company’s divisions (11).

The model for effective social computing in the USIC requires that personnel in the

community act as ‘champions’ to promote the use of the technology, such as the

executive described above. The 'champion' role can be filled by a variety of

intelligence community personnel, not necessarily managers or senior officials; users

who provide consistent, high-quality content by engaging social computing tools

regularly serve as exemplary champions as well. Therefore, championing the technology

is not always a vertically assigned responsibility in large organizations. For the
USIC, however, support from the top does serve as encouragement for the shift towards the "responsibility
to provide" culture that the community is attempting to adopt.

Cultural resistance is a large part of why champions are needed to promote the

adoption and usage of new social computing technology. The change towards a more

collaborative culture has not been as fast or fluid as the USIC had originally intended.

“There are 16 agencies in the intelligence community, and 15 of them work for someone

other than the DNI,” Former DNI Michael McConnell asserts. “Part of this challenge is to

get willing cooperation and collaboration across a community that has departmental

responsibilities as its first priorities … This is a huge cultural change, and that is why it is

taking a long time” (Ackerman 2008).

Entrenched policies and agency cultures serve as recurring problems when information
sharing attempts are made, or when agencies are expected to collaborate with each
other. One official from within a USIC agency demonstrated an engrained culture
resistant to change when he commented that "real men don't type" (Zegart 2005, 102).

Examples of cultural resistance through suspicion of the new technology were also found

in Dixon and McNamara’s Intellipedia project (Dixon and McNamara 2008), as well as

from various sources recognizing the relatively drastic change to transparency, openness

and inter-agency collaboration (Ten Years After 9/11: A Status Report On Information

Sharing (Statement of Zoe Baird Budinger and Jeffrey H. Smith) 2011; Dixon 2009;

Ackerman 2007).

Progress, however, is being made. The Dixon and McNamara projects have indicated that

some users are open to using the tools regularly to perform their job duties and improve

their intelligence production output. Often, those who contribute in Intellipedia, A-Space

and Intelink blogs will receive supportive feedback from their peers, including accolades

for contributing content or participating in discussions. Dixon identifies this behavior as part of a

“virtuous cycle” (Dixon 2009, 17) whereby encouragement from peers and other users

(senior or not) foments sustained collaborative behavior. Therefore, champions appear to

take the form of those who are vocal and who support others in their contributions and efforts.

Appreciative feedback and reciprocity from other users encourage collaborative behavior

– two-way communication among users is one of the central tenets for contributing using

Web 2.0 tools. However, users of the major social computing tools are receiving mixed

messages from senior staff regarding their use. On one hand, observations made by

Dixon and McNamara have shown that management are encouraging their staff to

contribute content to Intellipedia and browse the tool’s articles. On the other, users were

observed to feel uneasy because their respective managers were concerned it was
distracting from their normal tasks (Dixon and McNamara 2008, 8-13).

However, such confusion may have been rooted in an intentional strategy from senior
management. Upon releasing collaboration tools, including Intellipedia and A-Space, into

the community, the ODNI identified what users could and could not do with regards to

what they posted, but did not specify how users could use the tools to do their jobs. The

community was required to use the tools in the “best way possible” to satisfy their

mission requirements (Ackerman 2009). Therefore, while a lack of governance from

senior management was an initial challenge, an open-ended strategy towards tool use

likely led to user confusion on their social computing expectations. Confusion may exist

among officials as well if support is not broad and mandated throughout the USIC

management structures. ICD 501 requires that mid-to-high level intelligence officials act

as “stewards” of collected or analyzed activity by making it available to be discovered

through automated means.22 As a result, USIC management may not be coordinating to

provide the required support for personnel to use these tools. A recent study conducted

with Government of Canada employees regarding adoption factors of Web 2.0

technology in public sector workplaces showed that one of the largest barriers to effective

collaboration was a lack of departmental and managerial support (Markova 2009, 51-52).

Despite this, direct support for using the tools has not been absent from top-level

management and senior officials. In addition to evangelists such as Chris Rasmussen,

Don Burke and Sean Dennehy acting as champions of the tools and tasked with focusing

on implementing Web 2.0 technology in the USIC, directors have shown their support.

Michael McConnell, who brought Intellipedia into the mainstream set of web tools for

USIC personnel in 2007, cited Intellipedia’s ability to help experts pool their knowledge,

form virtual teams, and make quick assessments (Confronting the Terrorist Threat to the

Homeland: Six Years after 9/11 (Statement of the Record of J. Michael McConnell)

2007). Deputy Director of National Intelligence for Analysis (DDNI/A) Thomas Fingar

praised Intellipedia after an article on chlorine gas use in explosive devices in Iraq

created a “serviceable set of instructions in two days” (Calabresi 2009). Similarly,

Michael Wertheimer, Assistant Deputy Director in the ODNI, described the ODNI’s

position on Intellipedia and A-Space as “game-changing initiatives — new ways to work

and do business — that will radically broaden intelligence work” (Wertheimer 2008).

Beyond top-level support, Intellipedia is governed by policies that enable champions to

manage the use of the tool. Volunteer Intellipedia users (or "shepherds") watch over

22 The role of stewards in discovering information is mandated in sections D, E, F, and G of
Intelligence Community Directive 501 (ODNI 2009b).

the content and answer questions other users may have. These volunteers provide a level

of support that encourages others to use the service, and provide positive feedback. Dixon

and McNamara’s Intellipedia project also determined that adoption of the internal wiki

was often the result of users being keen to see other people's contributions to the service

(Dixon and McNamara 2008, 9-10); the same has been found to be true for A-Space

(Dixon 2009, 20). Certain features of Intellipedia also recognize those users for their

contributions, which latently establish them as champions (as will be discussed in the

next section). Intellipedia users were also found to be searching for credible sources; that

is, certain users whom they can repeatedly refer to, including both their Intellipedia
contributions and other areas where their work is showcased, such as an Intelink Blog or
contributions on iVideo or Gallery. It is likely that active, senior users may become

champions simply by being found, rather than identified. This concept supports the

principles of information discovery – users are looking for contacts to whom they can
turn for reliable advice and/or content.

In identifying the effectiveness of 'champions,' there are often questions about the

seniority of those who use wiki and social networking technology. Younger generations

of analysts are often identified as digital natives, or high-end users of social computing

tools because they have grown up using Web 2.0 technology in their daily lives

(Ackerman 2007) (e.g. having Facebook accounts, visiting Wikipedia for information,

and watching videos on YouTube). Usage of the USIC tools, however, does not indicate
that only younger generations of analysts were using them. Dixon and McNamara's

project on Intellipedia observed that it was not only “twenty-somethings” participating in

contributing content, but rather users of all ages – although older, more senior users

expressed more concern with sharing content freely (Dixon and McNamara 2008, 5-7).

Additionally, Dixon’s project on A-Space observed that younger analysts are keen to use

A-Space as a discussion tool with senior analysts (or “greybeards” as the project

identified (Dixon 2009, 18)). Therefore, there is benefit for analysts and personnel of all

ages to use the tools, although various forms of relationships may be generated from

users who connect using these tools.

From a distance, it appears that champions are being generated and identified in the USIC

as the social computing model requires, whether it is through vocal senior management

support, or whether the community identifies them from a grassroots effort. Additionally,

to be a “champion” can mean a variety of roles: it could come in the form of an industry

supporter of Web 2.0 technology, a pro-active user contributing quality content, a senior

analyst providing knowledge transfer and training to younger users, or a high-ranking

intelligence official citing social tools as beneficial to collaborative efforts. Still, the
"need to know" culture persists, and champions must serve as the benchmarks by which
the tools are not only further adopted, but also actively used, in order to build trust
and reliance between people and the technology. These champions must also have the

ability to connect with other users in efforts to encourage use, as feedback and peer

support play a crucial role in crowdsourcing content and discovering information.

4.5 Performance and Incentives

From the outset, there appears to be no active performance measurement or reward

system for using Intellipedia, A-Space, or any other social computing tools within the

USIC. While these tools have been growing in use and are spoken of highly by senior
IC officials, little is publicly known about whether they factor into employee
performance measurement programs. Additionally, there is little information on whether

rewards are given to users who contribute content to social computing tools.

Performance measurement and rewards, however, have been written into USIC policies

that mandate information sharing. A series of Intelligence Community Directives were

devised to integrate USIC performance measurement, compensation (including

performance-based pay), and rewards for (junior and senior) IC employees and

administration staff into agency strategies.23 Particularly, ICD 651 establishes "common,

core policies and processes for managing the performance of IC employees that are to be

incorporated into the performance management systems” by IC administration (ODNI

2007c). ICD 656 outlines similar procedures for senior officers and administration

(ODNI 2008c). Both directives link these performance measurements to the strategies

and goals of the larger NIS. More broadly, the ODNI also recommends that IC agencies

tie “responsibility to provide” efforts to performance measurements of both the IC

workforce and administration (ODNI 2007d, 5). However, none of these directives

overtly mention that social computing tools or technology are areas for measurement

specifically. Rather, they imply that intelligence production as a whole must be

measured, and do not identify tactical tools to process and measure this intelligence

production.

Despite not being officially recognized within larger performance measurement and

rewards systems, these tools offer internal rewards systems that help promote their use.

The observations from Dixon and McNamara detail how Intellipedia offers users the

23 For a list of all ICDs that pertain to employee performance, compensation and rewards, refer to all “600”

level ICDs at http://www.dni.gov/electronic_reading_room.htm.

chance to have featured articles showcased on the welcome page to recognize excellence

in content development (Dixon and McNamara 2008, 14). Featuring work on the

welcome page also highlights content that is associated with the agency of the user that

produced it – both agency and user get recognized for their contributions. Additionally,

active users on Intellipedia were initially rewarded with a personalized shovel that

commended their contributions (symbolizing the "gardening" of wiki content), which has

since been replaced with a mug inscribed with the words “Intellipedia: It’s what we

know” (McAfee 2009, 193).

There also appear to be less tangible incentives to register as a user with the USIC's
social computing tools. Being invited by colleagues was one of the strongest reasons for
joining A-Space; many users were drawn to the tool because of its informality and

the casual atmosphere in which to begin discussions with other analysts. Some users were

simply registering as users of A-Space to “see what the curiosity was about” (Dixon

2009, 20). Additionally, most of the A-Space users interviewed were users of social

media websites at home (20). Familiarity with similar tools at home may be a
contributing factor in registering with social computing tools and contributing to
them. Former Deputy Director of National Intelligence for Analysis (DDNI/A) Thomas

Fingar inferred that the rapid adoption rates of Intellipedia and A-Space were from

younger analysts accustomed to using collaborative tools at home (Fingar 2011, 18).

The opportunity to be recognized by peers was another reason for joining Intellipedia and

A-Space. The tools are based on rules of transparency; every edit a user makes to
an article or discussion is visible for others to see. As a result, individuals are judged

primarily on their editing record and information on their profile page (Harris 2008, 51).

This system also allows users to build a reputation based on their contributions, where

accolades given by fellow users provide incentive to engage in tool use and collaborative

behavior. Both Intellipedia and A-Space recognize contributions from users, regardless of

position within their respective agencies and the USIC hierarchical structure. Intellipedia

attributing every contribution made to the author is one of the reasons that some consider

it a “successful” tool (Olsen 2007).

Transparency is also an incentive for managers and officials to participate as much as front-

line personnel. Intellipedia and A-Space have functionality to allow users to create a

“watch list” of other users and track their edits and contributions. This functionality

allows managers to quickly review their own analysts' or users' work, and to monitor

their contributions, both in volume and quality. The “watch list” also allows users to track

feedback provided by other users, both on articles as well as profile pages on each

system. Therefore, the design of the tool provides incentive for managers to use

Intellipedia and A-Space as a “360 degree” evaluation tool across the IC (Harris 2008,

50).
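A minimal sketch of the watch-list idea described here, filtering a hypothetical edit log down to the users a manager follows, might look like the following (the users, pages and summaries are invented for illustration):

from collections import defaultdict

# Hypothetical edit log entries: (author, page, edit summary).
edit_log = [
    ("analyst_a", "Topic X overview", "added sourcing"),
    ("analyst_b", "Topic Y timeline", "corrected dates"),
    ("analyst_a", "Topic Z assessment", "new section"),
]

watch_list = {"analyst_a"}  # the users whose contributions are being followed

# Group the watched users' edits so volume and content can be reviewed at a glance.
watched = defaultdict(list)
for author, page, summary in edit_log:
    if author in watch_list:
        watched[author].append((page, summary))

for author, edits in watched.items():
    print(f"{author}: {len(edits)} tracked edits")
    for page, summary in edits:
        print(f"  - {page}: {summary}")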

Despite the tools being available to monitor user contributions, both managers and

employees are unsure how rewards can factor into contributions using social computing

tools. Dixon and McNamara found that some managers were unsure whether their staff

should get credit for Intellipedia contributions. Employees, likewise, did not expect to
receive performance credit from management, nor did they expect contributions to be
factored into their performance reviews.24 This confusion is exemplified by the "featured article," where

the Intellipedia project observed that users were not aware of how the featured article was
selected, nor was it clear how it factored into performance review or reward programs

(Dixon and McNamara 2008, 14).

Individual rewards and collective rewards are not independent of each other; rather, users

who are required to share information within organizations do so more often when

collective rewards for a division or unit of workers are awarded as well (Lee and Ahn
2007). Further, it is not ideal to measure performance and distribute rewards based on the

volume of contributions made. Rather, organizations must balance the volume of

contributions against the overall value those contributions provide.25 The credibility

systems that are built into Intellipedia and A-Space, in addition to blogs, Gallery and

iVideo contributions, are designed to showcase individual performance through

contributions (e.g. transparency of users who posted the content). Additionally,

measurement tools exist on enterprise platforms such as Microsoft SharePoint or SAP-

based intranets that record metrics such as employee site visits, visitation times, number

of contacts, the number of crowdsourced deliverables, and other usage metrics.

Therefore, the technical feasibility of measuring employee contributions on social

computing tools is available.
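For example, a platform's usage events could be rolled up per user along the lines of the sketch below; the event records and field names are hypothetical and do not correspond to any actual SharePoint or SAP interface.

from collections import defaultdict

# Hypothetical usage events exported from an enterprise platform.
events = [
    {"user": "analyst_a", "type": "site_visit", "minutes": 12},
    {"user": "analyst_a", "type": "deliverable", "minutes": 0},
    {"user": "analyst_b", "type": "site_visit", "minutes": 5},
]

# Aggregate per-user metrics: visit counts, time on site, crowdsourced deliverables.
summary = defaultdict(lambda: {"visits": 0, "minutes": 0, "deliverables": 0})
for event in events:
    record = summary[event["user"]]
    if event["type"] == "site_visit":
        record["visits"] += 1
        record["minutes"] += event["minutes"]
    elif event["type"] == "deliverable":
        record["deliverables"] += 1

for user, record in summary.items():
    print(user, record)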

24 Users in the Intellipedia project (Dixon and McNamara 2008) assumed that the finished intelligence that

was available on Intellipedia was already credited to users for their work; social computing tools were

simply to encourage discussion after rewards and credit were given.
25 For more detail on volume versus quality in data analysis, see Jones, Calvert. 2007. "Intelligence Reform:
The Logic of Information Sharing." Intelligence & National Security 22 (3): 384-401. doi:
10.1080/02684520701415214.

Rewards for users may be tied to the USIC's entrenched human resources
policies. These policies likely measure overall job performance, rather than simply

the frequency and quality of content contributions to social computing tools. Therefore,

performance measurement of users is likely to be tied to individual agency rewards

programs and treated as a measure of overall production in each employee's respective
position. The CIA, for example, uses a "robust performance measurement system".

Further, “the system empowers employees to be involved in the development of

performance objectives, to discuss progress throughout the evaluation period with their

managers, and concludes with written performance narratives” (Central Intelligence

Agency 2007). The CIA, in developing its current performance management system,
has borrowed heavily from the private sector in employee rewards structuring;
employee confidence in being rewarded was relatively low in the early 2000s, and the agency was
forced to revamp its performance measurement and reward system (under the guidance

of an inter-agency effort from the ODNI to improve performance measurement and

rewards systems (Valero 2008)). While the Valero article indicates that 84 per cent of

those surveyed in the 16 IC agencies consider team collaboration to be critical to their

mission, it is not clear if this includes time and effort spent contributing to tools such as
Intellipedia and A-Space, or collaborating between agencies in general.

Still, these tools are not directly tied to users' performance; one explanation may be that
the social computing tools serve as complements to pre-existing tools for intelligence
production, and are not critical enough to be measured. Conversely, managers may not

have the technical knowledge or adequate resources (e.g. time, budget) to measure social

computing efforts effectively (although contemporary collaboration technologies are

often designed around transparency and ease-of-use, with the ability to track metrics).

Improvements in measuring performance on social computing tools are also likely
hindered by these tools serving as peripheral services to formal intelligence production

workflows. If they are not mandatory tools, management and senior officials may not see

them as a priority item to measure in the daily business of intelligence analysts and staff.

Performance measurement in the USIC does not robustly capture the use of

social computing tools. Outside of tool-specific practices and word-of-mouth support, no

entrenched reward system exists for users either. Conversely, any inclusion of these tools

in such performance measurement systems will likely be tied to larger, overall

performance, and any inter-agency collaboration efforts would not likely focus on social

computing tools specifically. Therefore, while the USIC and the ODNI have improved

their performance and rewards system since September 11, 2001, social computing tools

appear not to be prioritized over overall intelligence analysis and pre-established

intelligence performance measurements and incentives.

4.6 Summary

Many of the deficiencies of the USIC's social computing tools appear to be cultural,
rather than technological. Direction regarding the use of these social
computing tools is absent from the U.S. national intelligence strategy, although these tools
serve to support reaching the stated goals as outlined in the NIS and the ISS. While the

USIC is adequate to some extent in meeting the requirements as outlined in the effective

social computing model,26 it remains deficient in others. Primarily, Intellipedia, A-Space

and other social computing tools are left outside of the primary production workflows,

serving a complementary role instead. Because of this, content may not be considered
trustworthy or "finished", which likely contributes to slower adoption rates and
hesitation by others to actively engage in social computing usage. Additionally,

crowdsourcing content and discovering information are not sophisticated to the point
where new intelligence avoids duplication or redundancy, although the processes for
minimizing such overlap are improving. The technology in use provides grounds for

streamlining data, and with general support from upper management, the technology will

continue to improve in order to better serve its users and information sharing. Finally,

the lack of mandated incentives and performance review for using these tools will hinder

the development and adoption of social technology, although community practices are

emerging for rewarding users for their contributions and collaboration.

26 For a full comparison chart of the social computing tools used in the USIC against the effective social

computing model, see Appendix D.

CHAPTER 5: RECOMMENDATIONS

Malcolm Gladwell, author of the best-selling book The Tipping Point, asserts that a

critical mass is necessary for a major change to take effect. That is, there must be

sufficient adoption of a new technology or process by users in a given environment
such that there is less benefit in returning to conventional methods of doing

business. The USIC is similar from this perspective – social computing tools
must be adopted on a community-wide basis for their maximum value to be derived.

These tools must not only be adopted into everyday community business, but also must

be provided as part of a larger strategy for information sharing. They must serve a

specific purpose in improving intelligence analysis and dissemination – a challenge for the
USIC since the attacks of September 11, 2001, in a new-millennium security environment
markedly different from that of the Cold War.

The analysis has shown that the U.S. intelligence community is not meeting all of the

required factors for a successful social computing model. While usage is growing and

there appears to be a pragmatic sense among community members as to what these tools

can achieve, their adoption rates have leveled off, and the tools are viewed in a peripheral

light to established community business processes for producing intelligence. This thesis

has also shown that agencies are still “stove-piping” information and are relatively

territorial over their own intelligence, which has proven to be problematic in detecting and

preventing terrorist threats early enough to counter them. While the U.S. has not suffered

any terrorist attacks on home soil since 2001, there have been numerous ‘close calls’

119

where luck and timing, rather than solid, inter-agency produced intelligence, was the

difference between successful threat prevention and attacks against American targets.

Additionally, the analysis has shown that community members are not fully informed or aware of how they can use the tools to perform their jobs. The “unofficial” nature of information produced on tools such as Intellipedia and A-Space has led to hesitation and uncertainty on the part of community members in engaging with these tools and pushing for their integration into pre-existing bureaucratic processes, or for replacing those processes altogether.

This is a topic that will need to be examined more closely within the USIC and by

academics, where private sector social computing tool implementation examples may

serve as benchmarks for large organizational change.

Users are aware of why these tools have been implemented. The need for better inter-agency collaboration and improved intelligence has been pushed by leaders in the community since the fallout from the September 11, 2001 attacks and the passage of the IRTPA in 2004. The need to improve intelligence analysis and information sharing is not lost on those who produce it. However, uncertainty about how the tools fit into the overall structure of intelligence analysis and production, combined with the fact that they have not displaced conventional bureaucratic processes for producing intelligence, is causing confusion, even with the support of community leaders.

Improvements in the tools used in intelligence analysis and production must also not be

seen as an endeavor exclusively for better information sharing. The NIS and Information

Sharing Strategy are tied together in reforming inter-agency cooperation and

collaboration. Therefore, improvements in social computing tools must be seen within a

larger effort to improve analytic tradecraft and intelligence as a whole.


The next section of this chapter recommends ways in which the USIC can improve the use and adoption of social computing tools in everyday community business. These recommendations are not short-term fixes for better information sharing; the success of these tools is tied to both cultural and organizational change within each agency and under the umbrella of the ODNI, and the recommendations are intended to take a whole-of-community approach. As stated earlier, social computing tools are not ends in and of themselves; they simply serve as new methods for improving intelligence overall. In this case, better intelligence through better information sharing practices is necessary to prevent contemporary threats to U.S. national security.

Recommendation #1 is based on improving analytic tradecraft in the USIC. Much has

been researched and written about analytic tradecraft (especially after September 11,

2001), but it is a large and complex field with contributions by experts from both

academia and industry, in which many solutions have been proposed for improving the quality of

intelligence analysis. While this recommendation could be analyzed as a topic unto itself,

identifying the key points of improving analytic tradecraft will serve as a basis for understanding the subsequent recommendations. As this paper has shown, social computing

tools are tied to the attitudes, perceptions and culture that define the USIC, and

improvements in tradecraft would positively affect the way social computing tools are

used in the community and their place in the intelligence production process.


5.1 Recommendation #1: Improve facets of analytic tradecraft to improve the culture of information sharing as a basis for using social computing tools.

The USIC analytic community today is focused on short-term, current intelligence with an emphasis on tactical and operational concerns. While the NIS consistently indicates that long-term strategy is the goal, the intelligence cycle has been, and continues to be, engrained with a short-term focus. There is an emphasis on intelligence reporting rather than gaining a deeper understanding of adversaries and threats (Gabbard and Treverton 2008, 1). Yet the lessons of the intelligence shortcomings that resulted in the attacks of September 11, 2001 and in the 2002 Iraq WMD Commission Report have not been acted upon. Changing the nature of analytic tradecraft requires a fundamental shift in cultural thinking towards better information sharing, inter-agency trust and a pragmatic view of collaboration. While much academic and industry research focuses on what constitutes accurate intelligence and the standards that analysts must meet in order to produce it, an environment must exist in which those standards can be reached and intelligence of maximum quality and accuracy can be produced. This recommendation aims to achieve this goal.

Recognized efforts are being made to transform analytic tradecraft. The creation of the

Deputy Director of National Intelligence for Analysis (DDNI(A)) role and the

implementation of the AT program have been a concerted effort to improve the quality of

analytic products (e.g. NIEs, Presidential Daily Briefs) and further integrate analytic

operations across the community. On operational and tactical levels of analysis,

documentation has been produced for analysts to engage in fact-checking, source-

checking, and general steps for improving the quality of their analysis deliverables. The CIA provides its analysts with “structured analytical techniques” to “assist analysts in dealing with the perennial problems of intelligence: the complexity of international developments, incomplete and ambiguous information, and the inherent limitations of the human mind” (Central Intelligence Agency 2009). Such techniques include Key Assumptions and Quality Information Checking, Red Team Analysis, High-Impact/Low-Probability Analysis, Alternative Futures Analysis, and “Outside-In Thinking” (30).

Additionally, other improvements have been made. For example, whereas before

September 11, 2001 the President’s Daily Brief was solely developed by the CIA, it is

now a multi-agency product delivered, for all intents and purposes, by the ODNI

(George 2011, 73-74).

Challenges, however, remain in fostering an environment beyond procedural and

technological solutions. One such proposal for improving analytic tradecraft with respect

to inter-agency culture building involves establishing a National Intelligence University

(NIU) that achieves more for intelligence staff than on-boarding procedures alone. The

CIA’s Kent School of Intelligence does not satisfy the demand for intra-community

culture building or agency collaboration that needs to be implemented in the work styles

of new recruits and agents. A bricks-and-mortar institution must be considered, one where this learning can take place and where agents can benefit from community-wide analytic training, methodologies, subject matter expertise, and strategic thinking (in addition to

building cross-agency contacts). An NIU would also help reduce “gold-standard”

thinking among agencies that believe their own intelligence is the best available (George

2010). Engrained cultures among agencies foster divisions whereby competition, rather


than collaboration, drives intelligence production that is often incomplete, uncoordinated

or unreliable.

Performance measurement and incentives must also be part of an improved analytic

tradecraft in the USIC. While monetary incentives have been used to attract and retain

high-performing analysts, there remains a larger challenge in reforming the work patterns

and habits of existing analysts to move to a more collaborative, transparent mode of

information sharing. When one considers that incentives to provide analysis go beyond simple monetary rewards for productive individuals or groups, the infrastructure must be

in place to meet the expectations of analysts to perform their roles. Therefore, the USIC

must meet the expectations of a growing and diverse workforce, including providing the

opportunities and resources needed to retain them. As generations of analysts move

through the community and the baby-boomer generation retires, younger analysts are

more digitally-oriented than their predecessors, and expect a level of technology to

permeate their work environments. These new analysts use social Web 2.0 technology in

their daily lives, and expect as much in their workplaces. Therefore, compensation and

incentives must be based on how the individual contributes to an information sharing

environment, including their contributions via social computing tools and their level of

engagement with other agencies (joint analysis) in producing intelligence. Performance

metrics must reflect their ability to discover and share information and conduct joint

analysis, but also measure their ability to contribute to strategic intelligence building, a

challenge that remains cross-generational and central to improving analytic tradecraft as a

whole.


There must also be a shift in focus towards all-source analysis and formulating strategic intelligence. In the 1990s, three forms of analysis were present: technical processing, single-discipline, and all-source analysis. However, technology and the changing nature of threats (e.g. threats becoming non-state, amorphous entities) have led to all-source analysis being the most comprehensive method for producing accurate intelligence. The problem of “failing to connect the dots” in intelligence that led to the attacks of September 11, 2001 is a glaring instance of when all-source analysis and strategic (not current) intelligence would have proven effective in safeguarding the homeland – strategic intelligence provides the logic for creating and implementing a strategy, one which was absent prior to the attacks. Conversely, it has been suggested that too many dots were connected in the production of the Iraq WMD Commission Report, which equally created intelligence failures internal to the USIC (Lowenthal 2008b, 306). Regardless, social computing tools help bridge the various sources available on a given intelligence topic and thus help foster an environment where all-source analysis is possible. Social computing users can search, discover and collaborate on HUMINT, SIGINT and IMINT intelligence, supported by previous intelligence products and topical data. By channeling these various forms of intelligence from multiple sources and multiple agencies through a software platform that enables information sharing, all-source analysis, which leads to better strategic intelligence, can be realized.27

27 This thesis acknowledges that in some situations, all-source analysis is not required, and that particular

intelligence types or particular sources are required for specific problems. However, this recommendation

is based on the lessons to be learned from September 11, 2001, the Iraq WMD Commission Report, and various intelligence failures that have occurred since, where inter-agency collaboration and strategic thinking were

absent.


The RAND Corporation asserts that tools – whether social technologies, private sector products or business process devices – should only be made available if they allow analysts more time to “think” (Gabbard and Treverton 2008, 19). Therefore, social computing tools are simply a subset of a larger movement towards analytic tradecraft reform through better human judgment. Yet it is often the tools themselves that are identified, first or solely, as what improves or impedes analysis. Furthermore, the effectiveness of these tools is often judged from the perspective of the agency using them, rather than by a broader strategic body such as the ODNI. Sponsors and supporters of tools and technology

used to improve analysis of intelligence have, in recent history and since 2001, been

uncoordinated, and there exists a lack of community-wide consensus on tool

standardization and practices that must apply to all USIC agencies (20). However, with

tools such as Intellipedia and A-Space and their strong adoption rates, such ambiguity on

tool use and understanding may be less pronounced. In support of this, the workforce of

the USIC is growing younger, and as one generation prepares to retire, these tools are

being seen as solutions to capturing knowledge and experience of older analysts leaving

the community (as observed in the Dixon and McNamara projects). Social computing tools are one way to help reduce the “green/gray” problem (Lowenthal 2008a, 145) and improve analytic tradecraft by capturing analyst

knowledge. In the larger picture, shifting demographics must be a major focus of

reforming tradecraft for better intelligence.

Improving analytic tradecraft is an on-going challenge for the USIC; heavy scrutiny is placed on the community when intelligence failures occur and are exposed to the public, whether internal (e.g. Wikileaks and the Iraq WMD Commission Report) or external (e.g. Northwest Flight 253). But it is through a culture of information sharing that overcomes the engrained “stove-piping” mentalities of agencies that progress can be made, subsequently addressing the concerns of culture change proposed by the Markle Foundation as a major reform to national security. Additionally, better efforts at inter-agency collaboration through the aforementioned examples will assist in fostering greater use of social computing tools, progressing them toward mandatory technology in the production cycle rather than a complement to existing workflows.

5.2 Recommendation #2: Social computing tools must be mandated into intelligence

production workflows.

A common criticism leveled at the USIC is that, in terms of the intelligence cycle and

intelligence analysis, conventional intelligence production methods produce faulty

intelligence that may present information to policymakers that is obsolete, misleading,

and/or factually wrong. The overreliance on questionable human sources and hasty

assembly of the NIE that produced the 2002 Iraq WMD Commission Report is such an

example. While efforts have been made to improve the drafting process of NIEs and

other deliverables to consumers, such as a mandatory review by the National Clandestine

Service and a “concerted effort . . . to highlight difference among agencies and explain

the reasons for such differences” (Bruno and Otterman 2008, 6), reform efforts do not go

far enough. Agencies still produce agency-centric intelligence where collaboration is only encouraged and expected, rather than made a mandatory part of the production process. Additionally, these conventional workflow models produce intelligence that is not crowdsourced, that is created independently by each agency, and that duplicates other intelligence without proper joint analysis. The process for intelligence production is disjointed, inefficient, and unnecessarily repetitive. It is with these concerns that social computing, as an effort to increase collaboration and promote a culture of information sharing with other agencies, can be implemented into mandatory intelligence production workflows.28

A number of benefits can result from integrating social computing tools into these

workflows. First, these tools will continue to grow in use and, as front-line technology strategically placed in the workflow, will encourage better information sharing among analysts, employees and agencies. Management support and clear direction on the purpose of these tools must be communicated to all users of social

computing technology. Second, as this thesis has identified, the content that is produced

using social computing tools in the USIC is often seen as complementary and supportive

of pre-existing workflows in producing conventional intelligence products. They do not

penetrate the official, vertical-based drafting of intelligence products that is currently in

place. Additionally, Intellipedia and A-Space are places to collect knowledge and ideas,

but remain unstructured in terms of vetting similar content; the systems are used on a wide-open basis for collaboration, and therefore little governance exists to combine and

solidify products that may be delivered to consumers using these platforms. There is a

prevailing sense in the USIC that Web 2.0 technologies have been “good for

collaboration, but not the product” (Rasmussen 2010).

28 This thesis focuses on the analysis phase of the intelligence cycle, whereby analysis of processed

intelligence draws conclusions and establishes fact, perspectives and recommendations to present to

policymakers.


As identified earlier, the USIC produces 50,000 reports each year, many of which go

unread or untouched after initial use. Filtering and corroborating duplicate content (whether the duplication is intentional or not) must be a mandatory step in the process, and only during drafting can this truly prevent the replication of information over time. Therefore, a

new production workflow must be used.

The above workflow concerns have not gone unnoticed by the community. Web 2.0

evangelists and advocates in the USIC are prominent voices for workflow change. Under

the guidance of Rasmussen and other community experts, a prototype of a new joint production method using social computing technology, entitled Intellipublia, is currently in use. This prototype uses the MediaWiki engine (the same engine as Wikipedia and

Intellipedia), but is heavily customized to accommodate publishing workflows that fit

USIC requirements (Rasmussen 2010). It includes an approval workflow process which

allows crowdsourced content to be drafted, edited and approved by the various agencies

on a particular intelligence topic. For example, a topic on the use of improvised explosive

devices in Somalia could be started by the DIA as an article, edited by the NSA and the

CIA, and published for review and final approval. These agencies could provide their

physical seals of approval (e.g. the logo is embedded on the wiki article page), and can

subsequently go back into edit mode for the other IC agencies to edit and approve. Once

all relevant agencies have approved the content, the article becomes an official,

consumable intelligence product. This process can help reduce duplicate efforts in multiple agencies on similar topics, and can act as a final product that aggregates agency-exclusive data where information may otherwise be obscured or undiscoverable. While ownership of the information becomes collective among agencies, the process

does not abandon the necessity for agencies to provide individual approvals that are

crucial to existing methods of production. This is the “Joint Production Line” of the

Intellipublia workflow model (Rasmussen 2010), and would likely replace the current

Intellipedia software, although draft and unofficial data for casual collaboration would

likely still be hosted on the wiki via linked sections or topically-associated workspaces so

as to be used as one central and accessible platform.
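The approval logic just described lends itself to a brief illustration. The following Python fragment is a minimal, hypothetical sketch of a joint production workflow of this kind; the class name, method names and example topic are placeholders for illustration and do not describe the actual Intellipublia implementation. An edit by any contributing agency resets prior approvals, and the article only becomes an official product once every relevant agency has applied its seal.

```python
class JointArticle:
    """Minimal sketch of a joint production (Intellipublia-style) workflow."""

    def __init__(self, topic, contributing_agencies):
        self.topic = topic
        self.contributing_agencies = set(contributing_agencies)
        self.revisions = []          # (agency, text) history
        self.approvals = set()       # agencies that have applied their seal
        self.official = False

    def edit(self, agency, text):
        """Any contributing agency may draft or revise; edits reset approvals."""
        if agency not in self.contributing_agencies:
            raise ValueError(f"{agency} is not a contributor on this topic")
        self.revisions.append((agency, text))
        self.approvals.clear()
        self.official = False

    def approve(self, agency):
        """An agency applies its seal; the article is official once all have."""
        if agency not in self.contributing_agencies:
            raise ValueError(f"{agency} is not a contributor on this topic")
        self.approvals.add(agency)
        self.official = self.approvals == self.contributing_agencies
        return self.official


# Hypothetical usage mirroring the example in the text.
article = JointArticle("IED use in Somalia", ["DIA", "NSA", "CIA"])
article.edit("DIA", "Initial draft on IED activity in Somalia.")
article.edit("NSA", "Adds SIGINT-derived context.")
for agency in ["DIA", "NSA", "CIA"]:
    article.approve(agency)
print(article.official)  # True: the article is now an official, consumable product
```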

To recognize the benefit of implementing social computing tools into the workflow and

reduce poor technology solutions that create data redundancy, lack of collaboration and

overlap, Rasmussen and other community professionals advocate a reduction in

technology solution funding (Kash 2010). This is likely to come to fruition, as the USIC

has prepared budget cuts in excess of $85 billion over the next ten years, including

providing cuts and improving efficiencies in information technology (Walcott 2011).

Implementation of such technology improves efficiency and reduces data duplication as

stated above, but reforming the culture to accept crowdsourced content as official material requires a change not just in technology use, but also an openness to accepting

community consultation and information creation. The hierarchy ‘flattening’ effect of

crowdsourcing content and its publishing process has seen positive examples in public sector implementations. The Peer-to-Patent program, designed through the New York Law School and endorsed by the U.S. Patent and Trademark Office, allows

the general public to use web-based social platform tools in screening and providing

voluntary expertise on pending patents. Once the public consultation process is complete,

the evidence is submitted to the U.S. Patent Office for evaluation and decision. The

publishing workflow of a patent review has taken the contributions of public participants


and integrated their efforts into a process that was previously backlogged with patent

applications. Augmenting this process to let the public participate has resulted in more

efficient output from the U.S. Patent Office, as well as allowing more technical experts

into the application process. The program has seen almost 2,000 expert participants volunteer, and between 2007 and 2008 participants submitted 170 instances of prior art (pre-patented material) for 35 applications (Osimo 2008, 25-26). This example lends credence to social tools working as part of a mandatory public sector workflow built on crowdsourced content. Again, the goal of this recommendation is to use technology to create better information sharing in an information-sensitive environment, while balancing efficiency with the high-level strategy established by the ODNI and Congressional decisions.

5.3 Recommendation #3: Intelligence products must become topical and dynamic

rather than specific and static.

If the recommendation of adopting the Intellipublia model of intelligence production is

put into place, the system changes the way in which intelligence products are

disseminated to consumers and policymakers. When an intelligence article is published through the Joint Production Line method (Intellipublia), only one version of the story is official, and it is approved and sourced by the agencies as a collaborative product. With this method, all agencies are aligned on the official version when it is disseminated to consumers.

However, intelligence is a dynamic field. Intelligence can quickly become outdated, and

consumers of intelligence produced by the USIC are only as knowledgeable as that which


is provided through the intelligence cycle. Historically, intelligence products are

produced to reflect intelligence at a particular point in time. They are static, and become

obsolete based on the nature of the content. Additionally, multiple products often contain similar information, and can present conflicting, sometimes contradictory, information to policymakers.

In an attempt to curb this problem of consumable intelligence, the Intellipublia model

produces a wiki page that can be constantly reviewed, edited and updated by agencies based on newly-collected intelligence that pertains to a particular topic. In the earlier example of Somalia, the article itself (that is, the wiki page) can be considered the deliverable, and, when requested by a consumer, the continually updated article reflects the most up-to-date and real-time account of intelligence that all agencies currently have. In this sense, intelligence production is dynamic rather than a static snapshot in time, and creation, drafting, review, approval and publishing are managed in one place.

But the article produced is, again, only one vein of an intelligence topic that is produced and provided to consumers. The topic of improvised explosive devices in Somalia reflects only one tangential area of the larger Somalia focus. Additionally, such sub-topics of a larger topic may prove useful in discovering IED intelligence and research in other regions of Africa, or may be relevant to intelligence on materials used in explosive devices. Articles can be part of one topic or many topics, thereby increasing their exposure as meaningful intelligence products that can be discovered more intuitively. In this respect, the articles are not end products in themselves, but rather part of the content

that contributes intelligence to a particular topic. This is an important part of the “Living

Intelligence System” that is advocated by Rasmussen and other community supporters.


The technology that most resembles this capability is Google’s Living Stories software.

Although the project was eventually not used in the alpha phase of the Intellipedia

prototyping project (Rasmussen 2011), it was the basis for presenting vetted, social

computing-based intelligence as a deliverable to consumers.29 Appendix C is a screenshot of how the Living Stories software looks to end users.

29 In 2010, Google released the Living Stories software as open source code. Therefore, the code base exists in the public realm for developers to use as a base for implementation in private and public sector deployments. Additionally, because it is open source, the code can be modified and extended to meet the needs of a business or organization, and is scalable to be used with RSS feeds, sortable lists, XML feeds and parsable data.

The software allows for an interactive, all-source on-screen representation of a particular

topic, akin to an encyclopedia article. The presentable information is the published

intelligence (from Intellipedia, Intellipublia or other database platforms) meant for

policymakers to consume. The software offers a ‘feed’ of all the articles that are relevant

to the topic at hand, a timeline of important events regarding the topic, options for sorting

the data based on user needs (e.g. based on keywords, metadata or dates and time), and

allows XML feeds via page widgets to be added from other topics. The system also

allows presentation of audio, video and other forms of intelligence that can be read from legacy database systems. In this sense, the software provides a central platform for users to discover collaboratively-built articles and intelligence that draw from historical and current data.
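To make the “living” presentation concrete, the sketch below models a topic page that aggregates timestamped items from multiple agencies and supports the kind of keyword and date filtering described above. The class and field names are hypothetical illustrations of the concept and are not drawn from Google’s Living Stories code base.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class TopicItem:
    """One piece of published intelligence attached to a topic."""
    title: str
    agency: str
    published: datetime
    keywords: List[str]

@dataclass
class LivingTopic:
    """Sketch of a dynamic, continually updated topic page."""
    name: str
    items: List[TopicItem] = field(default_factory=list)

    def add(self, item: TopicItem):
        self.items.append(item)

    def feed(self, keyword=None, since=None):
        """Return items newest-first, optionally filtered by keyword or date."""
        selected = [
            i for i in self.items
            if (keyword is None or keyword in i.keywords)
            and (since is None or i.published >= since)
        ]
        return sorted(selected, key=lambda i: i.published, reverse=True)

# Hypothetical usage: the Somalia topic accumulates items over time.
topic = LivingTopic("Somalia")
topic.add(TopicItem("IED incidents, Mogadishu", "DIA",
                    datetime(2011, 5, 1), ["IED", "Somalia"]))
topic.add(TopicItem("Maritime piracy financing", "CIA",
                    datetime(2011, 8, 12), ["piracy", "Somalia"]))
for item in topic.feed(keyword="IED"):
    print(item.title, item.agency, item.published.date())
```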

This technology (and similar news feed technology like it) also allows for significantly

reforming the dissemination phase of the intelligence cycle. Dynamically-organized

intelligence has the potential to improve budgetary performance for agencies under the

ODNI structure, as agency consensus in intelligence products helps streamline the bureaucratic processes used by Congress or policymakers for review. Each year, 50,000 intelligence reports and products are produced, often with duplication. As a result, resources budgeted towards intelligence production are often spent inefficiently due to duplicated work, conflicting intelligence reports, or a lack of required approvals from certain agencies.

Additionally, digital formats of intelligence mean that consumers and policymakers have quick, up-to-date access to pertinent intelligence topics (and reports), subject to security clearance. However, considerable resistance and hesitation to use technology

solutions to disseminate intelligence remain (Berkowitz 2007). One such example

includes CIASource, a security-protected web news service that exists separate from

Intelink. Although CIASource mainly publishes sensitive analysis and reporting (and is

populated by the CIA), it is strongly restricted to very few users who qualify for

individual terminal access, even if said individuals have Top Secret/SCI clearance. This

may be due to a chronic and lingering fear of losing control of intelligence through dissemination, and as such, it contributes to a stifling environment for those looking to build accessible IT systems. Therefore, in order to use dynamic intelligence to improve intelligence collaboration, resource allocation and fiscal responsibility in intelligence

production, agencies need to recognize that major reform comes with not only

improvements in information technology, but also a cultural shift as a whole. Such fears,

however, are not without good reason. The recent Wikileaks scandal involving the release

of thousands of sensitive diplomatic cables and documents to the public exemplifies the

type of scenarios that intelligence agencies strive to avoid, and can create atmospheres of


technology distrust. Proper information architecture can help alleviate these security

concerns. This issue will be addressed in the next section.

5.4 Recommendation #4: Social computing tools must be designed within an

information architecture model that improves ease-of-use and access but maximizes

security

Information technology improvements in the USIC have been a mandate since the 9/11 Commission, intended to improve the information sharing environment among intelligence agencies. However, such a large task is not easily implemented, nor is it automatically adopted overnight. The amorphous nature of software and computer technology, which continually upgrades and reinvents itself with the sophistication of information systems and web-based platforms, has created challenges for the USIC in terms of keeping pace with technological change. Additionally, the difficulty of engraining a culture of information sharing, and the lack of an implementation strategy for doing so, have created additional challenges in using technology to “connect the dots” in intelligence. These

challenges are further complicated by the tightening of budgets and downsizing in the

USIC to accommodate a more streamlined and efficient federal government structure.

It is for these reasons that IT needs to be examined and re-examined in the USIC with a

practical, realistic and long-term strategy in mind. Information technology systems must reflect the tightly-controlled, compartmentalized information architecture that sensitive systems require, but must also provide fluid access to collectors, analysts and

consumers who need the information in order to make better decisions at the policy-

making level. Therefore, a balance needs to be struck between efficient information

systems and policies of transparency and openness where possible.


One such solution of technology reform in large organizations has been a move towards

cloud computing. This is the process of keeping data and software on networks separate from the organization that uses them – similar to a terminal-and-server access structure in which the server hosts all the programs and databases. With cloud computing, users

access their software and data using web-based services. Some of these services are being

used in large enterprises to streamline the costs and resources needed to host data and

software internally. The USIC is also currently moving towards the cloud computing

model. In October 2011, James Clapper confirmed that cloud computing would be used as a model with which to meet aggressive budget reduction targets, in line with a reduction in

intelligence funding and federal spending (Jackson 2011). Prior to this initiative, the

National Reconnaissance Office, DIA, National Geospatial-Intelligence Agency and the

National Security Agency (known simply as the Quad) were developing common

information architectures and IT infrastructures to better facilitate collaboration and

identity access management (Rosenberg 2011). The intention of the Quad was to move

away from “federated environments” of information architecture towards an “integrated

environment” where openness and trust among participant agencies is easier to achieve

(4).30 In this vein, a move towards new, private-sector solutions such as cloud computing, and a consideration of open-source solutions to better integrate with private sector-developed


software will bring IT reform in line with modern technological trends that other large-scale organizations are concurrently moving towards as well.

30 Social computing tools are often included in cloud computing services as part of enterprise web-based platforms. These tools are tied into cloud-hosted data and, as such, represent a comprehensive software solution for hosting data, information architecture and social computing tool access that lends to the effectiveness of Enterprise 2.0 technology. However, there are concerns about cloud distribution. For more detail, see http://www.businessweek.com/news/2011-11-14/intelligence-budget-cuts-mean-u-s-will-have-more-blind-spots.html

Moving towards a streamlined information architecture that better encompasses ease of

access and reduces data duplication requires an effective permission management system.

Currently, social tools exist on the various networks that are based on clearance levels

(NIPRNet, SIPRNet and JWICS). Each network has particular databases, applications

and access rules that are pertinent to their individual users. Because of this, crowdsourced

material (e.g. Intellipedia content) is often unequally distributed on each network. As

identified earlier, the most popular version of Intellipedia exists at Top Secret/SCI level

on JWICS, as this has the largest amount of searchable content; Intellipedia content that

qualifies for lower-level clearance is often duplicated on JWICS as a matter of procedure.

As such, up to three different versions of similar content exist across the versions of Intellipedia, and physically disconnected information between IC staff with varying

clearance levels is a constant risk to effective information sharing. Likewise, A-Space

exists only at the JWICS level; subsequent social networks (e.g. J-Space for non-analytic

staff workers and C-Space for intelligence collectors) are not integrated within a unified

information architecture system (Rasmussen 2010).

The use of multiple networks physically separated from each other creates “air gaps” that

are neither efficient nor conducive to effective information sharing (Swamy 2008, 3).

One alternative to this is to employ an information redaction engine under a multilevel

security model using one central system. This design is based on the Bell-La Padula

security model, where users are allowed to read information at or below their respective

permission level, and create information accessible at their level or higher. Such a system is designed around a user querying an information object, which recognizes permission clearance and a set of categories under which the object is classified. The query returns only the information about the given topic that is permissible at the user’s level of security. In other words, an SBU-level employee could access a particular topic article, but only the permissible parts of the topic are displayed. One recommendation for implementing this

technology with a MediaWiki engine (the same as the one Intellipedia uses) involves

“introducing an intermediary authorization and cross-domain guard layer between the

data storage and presentation portions of the wiki” (Gehres et al. 2010). Wiki content is

separated with permission-based metadata that is cross-referenced against a user’s

permission level, which then filters presentation material for the specific user accessing

it. Additionally, information is redacted by marking it with “redaction tokens” that can be

applied to anything from a single letter to an entire page (3). This model can be further

secured by incorporating security-based programming languages such as Jif or FlowCaml, which analyze and mediate queries on information that require policy-based decisions (e.g. querying metadata that initiates permission-based actions) (Swamy 2008, 4-5).
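As a rough illustration of the “read at or below your level” behaviour described here, the following sketch labels each fragment of a page with a classification level and returns only the fragments a requesting user is cleared to see, replacing the rest with a redaction token. It is a simplified model of the Bell-La Padula read rule under assumed classification labels, not the guard-layer design proposed by Gehres et al. or the Jif/FlowCaml approach.

```python
# Simplified ordering of classification levels (higher number = more restricted).
LEVELS = {"SBU": 1, "SECRET": 2, "TOP SECRET/SCI": 3}

REDACTION_TOKEN = "[REDACTED]"

def render_page(fragments, user_clearance):
    """
    Bell-La Padula 'no read up': show fragments at or below the user's level,
    redact everything above it.

    fragments: list of (classification, text) tuples tagged at write time.
    """
    user_level = LEVELS[user_clearance]
    rendered = []
    for classification, text in fragments:
        if LEVELS[classification] <= user_level:
            rendered.append(text)
        else:
            rendered.append(REDACTION_TOKEN)
    return " ".join(rendered)

# Hypothetical wiki page with mixed-classification fragments.
page = [
    ("SBU", "IED incidents in Somalia rose during 2010."),
    ("SECRET", "Reporting ties the increase to a named facilitation network."),
    ("TOP SECRET/SCI", "Source details of the SIGINT collection."),
]

print(render_page(page, "SBU"))     # only the SBU sentence is visible
print(render_page(page, "SECRET"))  # SBU and SECRET visible, TS/SCI redacted
```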

Reforming information architecture to meet the large data needs of multi-agency

organizations must also involve technology that bridges governance and technology

together. Metadata, a foundational component in connecting different types of data, is

central to any information architecture model. In Chapter 4, it was identified that

community terminology, as an example, has the ability to cause confusion among

agencies in classifying and prioritizing intelligence products. Emerging content

management platforms such as Microsoft SharePoint and SAP, which are used in the intelligence community and the security industry, have the capability to manage metadata in an organization-wide context. Additionally, in-house software in the USIC, such as Catalyst, has the ability to algorithmically discover and source information in legacy databases by similar terminology, including places, people, and other nouns or phrases. The work already done by Catalyst is creating more uniformity in tagging information during the raw intelligence collection and analysis cycles (ODNI 2008a, 9), and is being integrated with the LNI and A-Space search and discovery capabilities (9). Rigorous and

standardized semantic data ontology, however, must be matched with an equally effective

training program for the IC workforce tasked with tagging such information. If (for

example) an intelligence product is produced by an FBI agent on Fidel Castro and Cuba

and the product is not properly tagged with contextual keywords (e.g. only tagging the

article with “FBI” rather than “Fidel Castro” or “Cuba”), the product would become

obscured along agency-centric lines and become difficult to find (within and among

agencies). Therefore, an information architecture that supports sophisticated metadata

functionality (in hand with an effective training system) helps minimize community data

loss, improves response time to national security issues, and provides intelligence

consumers with higher-quality, relevant information.31

31 For more about the importance of properly tagging data and the role of information aggregation and metadata in the USIC, see Ben Eli and Hutchins (2010).
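The discoverability problem in the Castro example can also be illustrated in a few lines. The sketch below is a deliberately simple, hypothetical tag index: a product tagged only with its originating agency is invisible to a topical search, while one tagged with contextual keywords can be found across agency lines. It is not a description of Catalyst or of any fielded USIC system.

```python
from collections import defaultdict

class TagIndex:
    """Toy metadata index: products are discoverable only through their tags."""

    def __init__(self):
        self._index = defaultdict(set)  # tag -> set of product titles

    def publish(self, title, tags):
        for tag in tags:
            self._index[tag.lower()].add(title)

    def search(self, tag):
        return sorted(self._index.get(tag.lower(), set()))

index = TagIndex()
# Poorly tagged product: only the originating agency is recorded.
index.publish("FBI assessment of Cuban leadership", ["FBI"])
# Properly tagged product: contextual keywords make it discoverable.
index.publish("Havana political succession outlook", ["Cuba", "Fidel Castro", "CIA"])

print(index.search("Cuba"))          # finds only the well-tagged product
print(index.search("Fidel Castro"))  # finds only the well-tagged product
# The first product can only be found by searching "FBI": obscured along agency lines.
print(index.search("FBI"))
```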

There are elements to secure IT infrastructures that cannot be answered with

technological solutions alone. Bradley Manning accessed SIPRNet to download and

distribute sensitive information (through the use of unapproved data mining software).

However, the cause of this incident should not be solely placed on software and technical


shortcomings: SIPRNet is designed for a level of clearance that is ultimately provided by

human decision-makers and with certain expectations of conduct. Manning did not

“hack” or break into SIPRNet to steal information; his access was provided through the

pre-existing security clearance process and used query-based software that exploited

information search capabilities (Zetter 2011). Therefore, any solutions moving forward

for improving information security must include similarly sophisticated human approval

processes, information governance and security policies, all as part of a comprehensive

security solution. Additionally, Wikileaks, from a technology standpoint, was not to

blame either – the release of the information was a partnership between those who

operated Wikileaks and media organizations that handled such information. Wikileaks, as a website, was technology-agnostic; it was the decisions made by humans that resulted in the leakage of sensitive information and created public emergencies for numerous federal governments.

At a more strategic level, any such development towards a more efficient, safer and accessible model of information sharing needs to account for the human aspect. Ultimately, human decision-making either grants or denies users access to certain information and capabilities. The Wikileaks scandal set off a wave of apprehension towards information sharing in the USIC, as Secretary of State Hillary Clinton remarked that the leaks amounted to an “attack on America’s foreign policy interests” and on the “international community”, and had “put people’s lives in danger” (Aroon 2010). Director of National Intelligence James Clapper also commented that the leaks had a “chilling effect” on the community’s willingness to share information (Aroon 2010). In

his testimony before the Senate Committee on Homeland Security and Governmental


Affairs in 2011 regarding the Wikileaks security issue, U.S. Under Secretary of State for

Management Patrick F. Kennedy outlined that robust governance and effective policies must first be put in place before technological solutions (Information Sharing in the Era of WikiLeaks: Balancing Security and Collaboration (Patrick F. Kennedy) 2011). However, Kennedy’s testimony (along with others in the intelligence industry (Chomik 2011, 106)) argued that compromising the progress made in information sharing through technology use is not an ideal way to control information leaks, and stressed that one incident

should not dictate the general movement towards in-place collaboration solutions for

intelligence. The human factor is central to controlling information leaks, and the

technology is secondary to the core problem of human decision-making.


CHAPTER 6: CONCLUSION

The intelligence community is continuing its transition into a post-9/11 security environment. While significant reforms have been made in organizational structure, strategy and guiding policies, the community is still attempting to find better solutions for improving all aspects of the intelligence cycle to protect Americans at home and abroad. The recent intelligence failures in the Times Square bomber incident and the case of Umar Farouk Abdulmutallab have highlighted that the intelligence community still faces challenges in how intelligence is gathered, analyzed and used by consumers in order to prevent threats from non-state actors to homeland security. Additionally, the USIC is also encountering threats from more conventional actors such as Iran and China in the form of nuclear, military, and cyber security concerns. As these threats evolve, however, information systems and those who use these tools (both state and non-state actors) will continue to affect the abilities of the USIC to collect, process, analyze and disseminate intelligence.

Additionally, as the IC moves into the second decade of the 21st century, the community

is likely to be besieged with budgetary concerns, resource allocation, and increasing

demands to improve intelligence products due to new technologies and processes. The

United States is facing daunting austerity measures as it looks to curb government

spending and reduce the national debt level over the next decade, including through

budget cuts in intelligence (Zakaria 2012). Overspending, duplication of agency efforts

and a slow reform in organizational culture towards better information sharing have left

the USIC encountering similar problems to those which ultimately led to the tragic events


of 2001 – a lack of information sharing, agency cooperation, and too much focus on

current and short-term intelligence that has left the community still challenged to

“connect the dots”. Even though adoption rates for Intellipedia and A-Space have reached

significant levels, there remain only so many users who actually contribute or engage with the tools, and only so much support that management can give before another disruptive technology is introduced as a panacea for information sharing. Michael Kennedy, director of enterprise solutions for the USIC, admitted that, while Intellipedia is still considered a success and a central Intelink feature, the community always needs to look forward to the “next great tool” (Miller 2011). Further, DIA Deputy Director David Shedd, in describing a go-forward model with a tightened intelligence community budget, suggested that “budget austerity … drive(s) you to think about different models (of intelligence)” (Zakaria 2012). Whether this includes minor adjustments to community

strategy or implies radical changes (including reassessing the use of social computing

tools) remains to be determined.

This thesis has identified social computing efforts within the USIC, devised a model to

measure effective social computing tools in a sensitive intelligence environment,

analyzed the USIC’s strategy and tool deployments against the model, and made

recommendations for improvements that would support better and further integration of

social computing tools. The results of the analysis in Chapter 5 show that, while

significant progress has been made in some areas, gaps are prevalent in others.

Particularly, the authoring of intelligence products in mandatory intelligence workflows

is the largest and most overt challenge the USIC is experiencing. In addition, the USIC

must address performance measurements and incentives for employees in using social

143

computing tools, and will require an examination into better access management,

information security and authorized usage to prevent compromising intelligence sharing

efforts. Additionally, the culture of the USIC is having a significant impact on how these

tools are used. The “need to know” culture that persisted during the pre-9/11 era has not

disappeared. Some of the challenges of agencies “stove-piping” information, whether for

bureaucratic or agency-centric reasons, still prevent the USIC from adopting a fully

successful model of social computing. As the analysis has shown, the full cultural buy-in needed for these tools to grow organically in their use and to be trusted as reliable IT solutions has yet to be realized.

However, the USIC is making adjustments to how information sharing is conducted. The

NIS continues to guide the intelligence community, and information sharing continues to

be a high priority for the USIC. The Strategic Intent for Information Sharing 2011-2015

document produced by the ODNI reaffirms that information sharing is a priority and

outlines five goals and objectives (ODNI 2011), similar to what the Information Sharing

Strategy outlines. It appears to have adopted an approach to information sharing that

more strongly supports responsible sharing through better access management,

governance and standards. This shift in focus from simply more information sharing to

responsible information sharing is likely a result of lessons learned from the Wikileaks scandal, as well as of the need to use software technology and IT systems in a more effective and collaborative manner. It is also likely to address the lack of managerial support and of tactical, operational and strategic guidelines that characterized the initial years of Web 2.0 use in the USIC.


Regardless of how collaboration in intelligence production is achieved, Web 2.0

technology and its application to private and public sectors will continue to grow in use.

It provides a direct bridge between users to communicate and share information, whether those users are analysts, staff or the general public. As the USIC and the global intelligence community move forward into the 21st century, people both inside and outside intelligence agencies will expect a level of digital information services at their disposal, as the Internet, web technology and social computing practices continue to further permeate how humans interact and produce information. Challenges still persist in improving information sharing in a post-September 11, 2001 world, but the attacks marked the beginning of a cultural shift towards accepting organizational reform for the digital age of communication. Intelligence strategy and the ability to prevent threats to the American homeland and to Americans abroad will continue to be defined by this shift as the USIC moves into the second decade of the 21st century.


BIBLIOGRAPHY

Ackerman, Robert K. 2007. Intellipedia Seeks Ultimate Information Sharing. SIGNAL

Online, October 2007. Accessed October 15, 2012.

http://www.afcea.org/signal/articles/templates/SIGNAL_Article_Template.asp?ar

ticleid=1400&zoneid=31.

———. 2008. Future Threats Drive U.S. Intelligence. SIGNAL Online, April 2008.

Accessed April 7, 2012.

http://www.afcea.org/signal/articles/templates/Signal_Article_Template.asp?articl

eid=1548&zoneid=231.

———. 2009. Intelligence Community Embraces Virtual Collaboration. SIGNAL Online,

May 2009. Accessed October 11, 2011.

http://www.afcea.org/signal/articles/templates/Signal_Article_Template.asp?articl

eid=1927&zoneid=259 .

Andrus, Calvin. 2005. "The Wiki and the Blog: Toward a complex adaptive intelligence

community." Studies in Intelligence no. 49 (3).

https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-

publications/csi-studies/studies/vol49no3/html_files/Wik_and_%20Blog_7.htm.

Arellano, Nestor E. 2008. Canada embarks on major Web 2.0 initiative. itbusiness.ca.

Accessed October 23, 2011.

http://www.itbusiness.ca/it/client/en/home/News.asp?id=48569.

Aroon, P.J. 2010. "Clinton: WikiLeaks disclosure is 'attack on the international

community'." Madam Secretary: An Obsessive Blog about Hilary Clinton, (blog).

November 29, 2010. Accessed February 12, 2012.

http://hillary.foreignpolicy.com/posts/2010/11/29/clinton_wikileaks_disclosure_is

_attack_on_the_international_community.

Australian Government. 2009. Engage: Getting on with Government 2.0 (Report of the

Government 2.0 Taskforce). Department of Finance and Deregulation. Available

from

http://www.finance.gov.au/publications/gov20taskforcereport/doc/Government20

TaskforceReport.pdf.

Australian Government. 2010a. Government Response to the Report of the Government

2.0 Taskforce "Engage: Getting on with Government 2.0". Department of Finance

and Regulation. Available from

http://www.finance.gov.au/publications/govresponse20report/doc/Government-

Response-to-Gov-2-0-Report.pdf.


Australian Government. 2010b. National Security Information Environment Roadmap:

2020 Vision. Department of the Prime Minister and Cabinet. Available from

http://www.dpmc.gov.au/national_security/docs/national_security_information_e

nvironment_roadmap.pdf.

Bain, Ben. 2008. iVideo: YouTube for the intell community. Federal Computer Week,

12. http://ezproxy.lib.ucalgary.ca:2048/login?url=http://proquest.umi.com

.ezproxy.lib.ucalgary.ca/pqdweb?did=1472236211&Fmt=2&clientId=12303

&RQT=309&VName=PQD.

Baird, Zoe and Barksdale, James. Creating a Trusted Network for Homeland Security.

New York City: Markle Foundation, 2003. Accessed September 30, 2011.

http://www.markle.org/sites/default/files/nstf_report2_full_report.pdf.

Bastien, Douglas. 2009. "About the Government of Canada wiki, GCPEDIA."

Government of Canada 2.0, (blog). February 24, 2009.

http://blog.gc20.ca/index.php/about-the-gc-wiki-gcpedia/.

Ben Eli, Ariane, and Jeremy Hutchins. Intelligence after Intellipedia: Improving the Push

Pull Balance with a Social Networking Utility. Fort Belvoir, VA: Pratt Institute,

School of Information and Library Science, 2010. http://www.dtic.mil/cgi-

bin/GetTRDoc?Location=U2&doc=GetTRDoc.pdf&AD=ADA523538 .

Berkowitz, Bruce. 2007. "Failing to Keep Up With the Information Revolution." Studies

in Intelligence no. 47 (1). March 4, 2012. https://www.cia.gov/library/center-for-

the-study-of-intelligence/csi-publications/csi-

studies/studies/vol47no1/article07.html.

Treasury Board of Canada Secretariat. 2009. Collaboration and Innovation: How Web

2.0 is changing theway we work. Ottawa, ON: Government of Canada. Available

from

http://www.publicpropertyforum.ca/library/social_media_web2.0_presentation_2

009_braybrook-34pp.pdf.

Bruno, Greg, and Sharon Otterman. 2008. National Intelligence Estimates. Council on

Foreign Relations, May 14, 2008 [cited March 3 2011]. Available from

http://www.cfr.org/iraq/national-intelligence-estimates/p7758.

Business Wire. 2009. "ManTech Awarded $11M 'A-Space Next' Contract from the

Defense Intelligence Agency". Bloomberg, August 12, 2009. Accessed October

14, 2011.

http://www.bloomberg.com/apps/news?pid=newsarchive&sid=aikzvesMLL74.

Byrne, John A. 2002. Inside McKinsey. BusinessWeek, July 8, 2002. Accessed December

4, 2011. http://www.businessweek.com/magazine/content/02_27/b3790001.htm.


Calabresi, Massimo. 2009. Wikipedia for Spies: The CIA Discovers Web 2.0. TIME,

April 8, 2009. Accessed February 15, 2012.

http://www.time.com/time/nation/article/0,8599,1890084,00.html.

Central Intelligence Agency. 2007. HR FAQs. Central Intelligence Agency, April 5, 2012

[cited February 13 2012]. Available from https://www.cia.gov/offices-of-

cia/human-resources/hr-faqs.html.

———. A Tradecraft Primer: Structured Analytic Techniques for Improving Intelligence

Analysis. Central Intelligence Agency, 2009. Accessed February 23, 2012.

https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-

publications/books-and-monographs/Tradecraft%20Primer-apr09.pdf.

Cermani, Joseph R., and Jeffrey A. Engel. RETHINKING LEADERSHIP AND “WHOLE

OF GOVERNMENT” NATIONAL SECURITY REFORM: PROBLEMS,

PROGRESS, AND PROSPECTS. Carlisle, PA: Strategic Studies Institute, 2010.

Accessed February 12, 2012.

http://www.strategicstudiesinstitute.army.mil/pdffiles/PUB992.pdf.

Chapman, Mary M. 2010. "Judge overrules terror suspect's request on evidence". The

New York Times, 2010/10/15/, na(L). Accessed March 5, 2012.

http://go.galegroup.com/ps/i.do?id=GALE%7CA239435115&v=2.1&u=ucalgary

&it=r&p=AONE&sw=w.

Chomik, Andrew. 2011. "Making Friends in Dark Shadows: An Examination of the Use

of Social Computing Strategy Within the United States Intelligence Community

Since 9/11." Global Media Journal: Canadian Edition no. 4 (2):95-113.

http://ezproxy.lib.ucalgary.ca:2048/login?url=http://search.ebscohost.com/login.a

spx?direct=true&db=ufh&AN=66717554&site=ehost-live.

Christensen, Clayton M. 1997. The innovator's dilemma: when new technologies cause

great firms to fail. Boston, Mass: Harvard Business School Press.

Christensen, Clayton M., and Michael E. Raynor. 2003. The Innovator's Solution:

Creating and Sustaining Successful Growth. Boston: Harvard Business School

Press.

Chui, Michael, Andy Miller, and Roger P. Roberts. 2009. "Six ways to make Web 2.0

work." McKinsey Quarterly (2):64-73.

http://ezproxy.lib.ucalgary.ca:2048/login?url=http://search.ebscohost.com/login.a

spx?direct=true&db=bsh&AN=37815632&site=ehost-live.

ODNI. 2007. Senate Homeland Security and Governmental Affairs Committee.

Confronting the Terrorist Threat to the Homeland: Six Years after 9/11

(Statement of the Record of J. Michael McConnell). September 10, 2007.

Accessed February 15, 2012. http://dni.gov/testimonies/20070910_testimony.pdf.


Central Intelligence Agency. 2005. Curing Analytic Pathologies: Pathways to Improved

Intelligence Analysis. Washington, D.C.: Center for Study of Intelligence.

Council on Foreign Relations. 2008. Intelligence Analysis: Continuity and Change

(Thomas Fingar) [Video]. YouTube. Accessed October 14, 2011.

http://www.youtube.com/watch?v=T4zqdzkCR_I.

Daily News Reporter. 2011. "From fresh-faced schoolboy to jihad warrior: Underpants

bomber pictured in Al Qaeda training camp". The Daily Mail - Mail Online,

October 13. Accessed March 3, 2011. http://www.dailymail.co.uk/news/article-

2048260/Underwear-bomber-trial-Umar-Farouk-Abdulmutallab-pictured-Al-

Qaeda-training-camp.html.

Dean, Joshua. 2000. "Intelink: The real secret spy network." Government Executive no.

32 (4):64. http://ezproxy.lib.ucalgary.ca:2048/login?url=http://proquest.umi.com

.ezproxy.lib.ucalgary.ca/pqdweb?did=52470570&Fmt=2&clientId=12303

&RQT=309&VName=PQD.

Department of the Prime Minister and Cabinet. 2008. SUMMARY AND CONCLUSIONS:

Report of the Review of Homeland and Border Security (Smith Review). Available

from http://www.royalcommission.vic.gov.au/getdoc/0be3af5e-16eb-4ba5-93c0-

b83cb3a55860/TEN.004.002.0431.pdf.

Di Maio, Andrea. Government 2.0: Gartner Definition. Gartner, 2009. Accessed

December 13, 2011.

http://dc.gov/DC/OCTO/About+OCTO/Publications/Government+2.0:+Gartner+

Definition.

Dixon, Nancy M. 2007. CompanyCommand: A Professional Community That Works.

ASK Magazine. Accessed October 13, 2011.

http://askmagazine.nasa.gov/issues/27/27i_company_command.html.

———. How A-Space is Shaping Analysts' Work. Defense Intelligence Agency, 2009.

DIA Knowledge Laboratory Project. http://conversation-

matters.typepad.com/A_Space_Study.pdf.

Dixon, Nancy M., and Laura A. McNamara. Our Experience with Intellipedia: An

Ethnographic Study at the Defense Intelligence Agency. Defense Intelligence

Agency, 2008. DIA Knowledge Laboratory Project.

http://www.au.af.mil/au/awc/awcgate/sandia/dixon_mcnamara_intellipedia.pdf.

Eaves, David. 2011. "Why Social Media behind the Government Firewall Matters."

eaves.ca, (blog). Accessed October 23, 2011. http://eaves.ca/2011/08/11/why-

social-media-behind-the-government-firewall-matters/.


Federal Computer Week. 2008. 2008 Federal 100 award winners. Federal Computer

Week, March 19. http://fcw.com/Articles/2008/03/19/2008-Federal-100-award-

winners-Per--Wy.aspx?Page=3.

Fenwick, Nigel. The CIO’s Guide To Social Computing Leadership. Cambridge, MA:

Forrester Research, Inc., 2010. Accessed March 12, 2012.

http://us.blackberry.com/business/leading/RIM_CIO_SocialComputingLeadership

.pdf.

Ferguson, Barbara. 2010. "Clapper and secret America in the spotlight". Arabnews.com,

July 21, 2010. Accessed February 9, 2012.

http://arabnews.com/world/article87375.ece.

Fingar, Thomas. 2011. Reducing uncertainty: intelligence analysis and national security.

Stanford, California: Stanford Security Studies.

Gabbard, C. Bryan, and Gregory F. Treverton. 2008. Assessing the tradecraft of

intelligence analysis. Vol. TR-293. Santa Monica, CA: RAND Corporation.

Gehres, Peter, George Louthan, Nathan Singleton, and John Hale. 2010. Toward

Sensitive Information Redaction in a Collaborative, Multilevel Security

Environment. Paper read at WikiSym 2010, July 8, 2010, at Gdańsk, Poland.

Available from http://georgerloutha.nthefourth.com/wp-

content/uploads/2009/07/louthan_securewiki.pdf.

George, Roger Z. 2010. "Beyond Analytic Tradecraft." International Journal of

Intelligence and CounterIntelligence no. 23 (2):296-306. Accessed April 9, 2012. doi:

10.1080/08850600903566124. http://dx.doi.org/10.1080/08850600903566124.

———. 2011. "Reflections on CIA Analysis: Is it Finished?" Intelligence and National

Security no. 26 (1):72-81. Accessed April 9, 2012. doi: 10.1080/02684527.2011.556360.

http://dx.doi.org/10.1080/02684527.2011.556360.

Harfoush, Rahaf. 2009. Yes we did: an inside look at how social media built the Obama

brand. Berkeley, CA: New Riders.

Harknett, Richard J., and James A. Stever. 2011. "The Struggle to Reform Intelligence

after 9/11." Public Administration Review no. 71 (5):700-6. doi: 10.1111/j.1540-

6210.2011.02409.x.

http://vnweb.hwwilsonweb.com/hww/jumpstart.jhtml?recid=0bc05f7a67b1790e7

70910268787c47359e8c368bca5b3ee4504acf3bca9d8867689609d594aba57&fmt

=P.

Harris, Brian. 2008. "Intellipedia: Leading a Transformation in Knowledge Management

within the Intelligence Community." Military Intelligence no. 34 (1):47-53.

February 21, 2012. http://www.fas.org/irp/agency/army/mipb/2008_01.pdf.


Havenstein, Heather. 2008. CIA, Pfizer, Wachovia and Sony execs suggest options for

adopting Web 2.0. Computerworld, June 11, 2008.

http://www.computerworld.com/s/article/9096658/CIA_Pfizer_Wachovia_and_S

ony_execs_suggest_options_for_adopting_Web_2.0?pageNumber=1.

Hepburn, Aden. 2011. "Facebook Statistics, Stats & Facts For 2011." Digital Buzz Blog,

(blog). January 18, 2011. Accessed March 7, 2012.

http://www.digitalbuzzblog.com/facebook-statistics-stats-facts-2011/.

Hickman, Gill Robinson. 2010. Leading change in multiple contexts: concepts and

practices in organizational, community, political, social, and global change

settings. Los Angeles: SAGE.

Hinchcliffe, Dion. 2006. "The shift to Social Computing." Enterprise 2.0, (blog). March

12, 2006. Accessed March 12, 2012. http://www.zdnet.com/blog/hinchcliffe/the-

shift-to-social-computing/21.

Hinck, Robert. n.d. "Intellipedia." Rhinck's Blog, (blog). Accessed October 13, 2011.

http://csis.org/blog/intellipedia.

Hoover, Nicholas. 2009. CIA, NSA Adopting Web 2.0 Strategies; In addition to

Intellipedia, social and Web-inspired software is becoming the next great tool for

the intelligence community. InformationWeek, March 10, 2009. Accessed August 30,

2011.

http://go.galegroup.com/ps/i.do?id=GALE%7CA195301482&v=2.1&u=ucalgary

&it=r&p=AONE&sw=w.

———. 2011. Feds Set Tougher Data Center Consolidation Goals. InformationWeek,

December 22, 2011. Accessed April 8, 2012.

http://www.informationweek.com/news/government/enterprise-

architecture/232300936.

Howard, Alex. 2010. "10 Lessons for Gov 2.0 from Web 2.0." O'Reilly Radar, (blog).

October 6, 2010. http://radar.oreilly.com/2010/10/top-10-lessons-for-gov-20-

from.html.

IARPA. n.d. The "P" in IARPA is Very Important. Intelligence Advanced Research

Projects Activity [cited October 11 2011]. Available from

http://www.iarpa.gov/offices.html.

iNews Justification. 2007. Leon's Wiki [cited October 13 2011]. Available from

http://leonbloom.pbworks.com/w/page/16375716/iNews%20Justification.

Information Sharing Environment. 2011. Scope of the ISE [cited October 19 2011].

Available from http://www.ise.gov/scope-ise.


U.S. Department of State. 2011. Statement Before the Senate Committee on Homeland

Security and Governmental Affairs. Information Sharing in the Era of WikiLeaks:

Balancing Security and Collaboration (Patrick F. Kennedy). March 10, 2011.

Accessed February 20, 2012.

http://www.state.gov/m/rls/remarks/2011/158400.htm.

Intelligence Community Chief Information Officer. 2010a. Intelink. National Defense

Industrial Association. Available from

http://www.ndia.org/Divisions/Divisions/C4ISR/Documents/Breakfast%20Presen

tations/2010%20Presentations/Intelink%20Basic%20presentation.pdf.

Intelligence Community Chief Information Officer. 2010b. XML DATA ENCODING

SPECIFICATION FOR INTELLIGENCE PUBLICATIONS VERSION 4

(PUBS.XML.V4): ICTechSpec 500.D.3-V4. Available from

http://www.dni.gov/ICEA/pubs/pubsxmlv4/DesPubsXml.pdf.

Interagency Threat Assessment and Coordination Group. 2011. Intelligence Guide for

First Responders. Information Sharing Environment. Available from

http://ise.gov/sites/default/files/ITACG_Guide_2ed.pdf.

Jacks, Jason. 2009. Updated global information grid would bring web 2.0 to the Defense

Department. 94 (669): 47.

Jackson, Joab. 2009a. Intellipedia suffers midlife crisis. Government Computer News,

February 18. http://gcn.com/Articles/2009/02/18/Intellipedia.aspx?Page=1.

———. 2009b. A-Space melds social media and intelligence gathering. Government

Computer News, November 20. Accessed October 15, 2011.

http://gcn.com/articles/2009/11/30/a-space-dia-intell-sharing-wiki.aspx.

Jackson, Kevin L. 2011. "It's Official! US Intelligence Community Is Moving To The

Cloud!" Kevin Jackson: Cloud Musings on Forbes, (blog). October 17, 2011.

Accessed February 27, 2012.

http://www.forbes.com/sites/kevinjackson/2011/10/17/its-official-us-intelligence-

community-is-moving-to-the-cloud/.

Joch, Alan. 2009. Intelligence community wrestles with Web 2.0 tools for information

sharing. Federal Computer Week, May 14, 2009.

http://fcw.com/Articles/2009/05/18/Data-sharings-new-mandate.aspx?Page=1.

Kaplan, Andreas M., and Michael Haenlein. 2010. "Users of the world, unite! The

challenges and opportunities of Social Media." Business Horizons no. 53 (1):59-

68. 2010/2//. doi: 10.1016/j.bushor.2009.09.003.

http://www.sciencedirect.com/science/article/pii/S0007681309001232.


Kash, Wyatt. 2010. Reduced funding might actually improve intelligence work. Federal

Computer Week, May 28, 2010. Accessed March 3, 2012.

http://fcw.com/articles/2010/05/28/reduced-funding-might-help-intelligence-

work.aspx.

Kostoff, Ronald N., Robert Boylan, and Gene R. Simons. 2004. "Disruptive technology

roadmaps." Technological Forecasting and Social Change no. 71 (1–2):141-159.

March 24, 2012. doi: 10.1016/s0040-1625(03)00048-9.

http://www.sciencedirect.com/science/article/pii/S0040162503000489.

Krzmarzick, Andy. 2010. The high-impact approach to knowledge sharing. Federal

Computer Week, May 19, 2010. Accessed February 7, 2012.

http://fcw.com/Articles/2010/05/24/BACK-TALK-GovLoop-Knowledge-

Management.aspx?sc_lang=en&Page=1.

Laipson, Ellen. 2005. The Robb-Silberman report, intelligence, and nonproliferation.

Arms Control Today, June 2005, 20+. Accessed April 6, 2012.

http://go.galegroup.com/ps/i.do?id=GALE%7CA133830754&v=2.1&u=ucalgary

&it=r&p=AONE&sw=w.

Lee, Dong-Joo, and Jae-Hyeon Ahn. 2007. "Reward systems for intra-organizational

knowledge sharing." European Journal of Operational Research no. 180 (2):938-

956. doi: 10.1016/j.ejor.2006.03.052.

http://www.sciencedirect.com/science/article/pii/S0377221706003468.

Lowenthal, Mark M. 2008a. Intelligence: from secrets to policy. 4th ed.

Washington, DC: CQ Press.

———. 2008b. "Towards a Reasonable Standard for Analysis: How Right, How Often

on Which Issues?" Intelligence & National Security no. 23 (3):303-315. doi:

10.1080/02684520802121190.

http://ezproxy.lib.ucalgary.ca:2048/login?url=http://search.ebscohost.com/login.a

spx?direct=true&db=hlh&AN=32771180&site=ehost-live.

Lynch, CG. 2008. Intelligence Community to Start Using Virtual Worlds. CIO.com,

March 21. Accessed October 11, 2011.

http://www.cio.com/article/201800/Intelligence_Community_to_Start_Using_Virt

ual_Worlds.

Markle Foundation. Authorized Use: An Authorized Use Standard for Information

Sharing Strengthens National Security and Respects Civil Liberties. New York,

NY: Markle Foundation, 2009a.

http://www.markle.org/sites/default/files/20090825_authusestndrd.pdf.


———. Improve Information Sharing, Create a Trusted System, Facilitate Access to

Critical Data. New York, NY: Markle Foundation, 2009b.

http://www.markle.org/sites/default/files/MTFBrief_Discoverability.pdf.

———. Nation At Risk: Policy Makers Need Better Information to Protect the Country.

New York, NY: Markle Foundation, 2009c.

http://www.markle.org/sites/default/files/20090304_mtf_report.pdf.

———. New Thinking on Information Sharing Critical to Strengthening National

Security. New York, NY: Markle Foundation, 2009d.

http://www.markle.org/sites/default/files/MTFBrief_CultureChange.pdf.

———. Protecting Privacy and Civil Liberties in a Networked Information Sharing

Environment. New York, NY: Markle Foundation, 2009e.

http://www.markle.org/sites/default/files/MTFBrief_Privacy.pdf.

Markova, Iveta. 2009. Web 2.0 technology adoption by Government Departments.

Available from http://www.proquest.com.ezproxy.lib.ucalgary.ca (publication

number AAT MR47517).

McAfee, Andrew P. 2006. "Enterprise 2.0: The Dawn of Emergent Collaboration." MIT

Sloan Management Review no. 47 (3):21. March 1, 2012.

http://ezproxy.lib.ucalgary.ca:2048/login?url=http://proquest.umi.com.ezproxy.lib

.ucalgary.ca/pqdweb?did=1020961391&Fmt=2&clientId=12303&RQT=309&VN

ame=PQD.

———. 2009. Enterprise 2.0: new collaborative tools for your organization's toughest

challenges. Boston, Mass: Harvard Business Press.

———. 2010. "Is Lightweight Workflow Here at Last?" Andrew McAfee's Blog, (blog).

March 30, 2010. Accessed December 17, 2011.

http://andrewmcafee.org/2010/03/is-lightweight-workflow-here-at-last/.

———. 2011. "Living Intelligence." Andrew McAfee's Blog, (blog). January 28, 2011.

Accessed February 6, 2012. http://andrewmcafee.org/2011/01/living-intelligence/.

McConnell, Mike. 2007. "Overhauling Intelligence." Foreign Affairs no. 86 (4):49-58.

http://go.galegroup.com/ps/i.do?id=GALE%7CA165827017&v=2.1&u=ucalgary

&it=r&p=AONE&sw=w.

McCracken, Jill. In the Shadow of 9/11. Falls Church, VA: Noblis Inc., 2011.

http://www.noblis.org/NewsPublications/Publications/Publications/Sigma_RE_In

TheShadowOf911.pdf.


McIntyre, Joan. 2009. A-Space: Changing the Way the Intelligence Community Does

Analysis. Paper read at gov 2.0 Expo, September 8, 2009, at Washington, D.C.

Available from

http://www.gov2expo.com/gov2expo2009/public/schedule/detail/10255.

McManus, Doyle. 2010. "Connecting the intelligence dots will require clout". Los

Angeles Times. Accessed March 7, 2012.

http://articles.latimes.com/2010/jan/10/opinion/la-oe-mcmanus10-2010jan10.

Meyer, Josh, Peter Nicholas, and Alana Semuels. 2009. "Obama cites intelligence

failures in Northwest airline attack". Los Angeles Times, December 30, 2009.

Accessed December 7, 2011. http://articles.latimes.com/2009/dec/30/nation/la-na-

terror-obama30-2009dec30.

Miller, Jason. 2010. "Intellipedia provides lessons for FedSpace initiative". 1500 AM -

Federal News Radio, May 5. Accessed October 12, 2011.

http://www.federalnewsradio.com/?nid=697&sid=1949950.

———. 2011. "Information Sharing boosted by recognizable tools". 1500 AM - Federal

News Radio, October 17, 2011. Accessed February 20, 2011.

http://www.federalnewsradio.com/?nid=&sid=1625863.

Nakano, Chelsi. 2011a. Forrester's 2011 Enterprise 2.0 User Profile: Social Business isn't

Essential. CMSWire, November 7, 2011. Accessed February 6, 2012.

http://www.cmswire.com/cms/social-business/forresters-2011-enterprise-20-user-

profile-social-business-isnt-essential-013405.php.

———. 2011b. New Twitter Statistics Reveal 100M Monthly Active Users & 250M

Daily Tweets #w2s. CMSWire, October 18, 2011. Accessed March 7, 2012.

http://www.cmswire.com/cms/social-business/new-twitter-statistics-reveal-100m-

monthly-active-users-250m-daily-tweets-w2s-013103.php.

National Commission on Terrorist Attacks upon the United States. 2004. The 9/11

Commission report: final report of the National Commission on Terrorist Attacks

Upon the United States. Edited by Lee Hamilton and Thomas H. Kean. New

York: W.W. Norton.

Negroponte, John D. 2006. Remarks by the Director of National Intelligence

Globalsecurity.org, September 25, 2006 [cited October 18 2011]. Available from

http://www.globalsecurity.org/intell/library/news/2006/intell-060925-dni01.htm.

O'Hara, Kelly, and Anthony Bergin. Information Sharing in Australia’s National Security

Community. Australian Strategic Policy Institute, 2009. Accessed October 23,

2011.

http://www.aspi.org.au/publications/publication_details.aspx?ContentID=232.


O'Reilly, Tim. 2005. What is Web 2.0 - Design Patterns and Business Models for the

Next Generation of Software. O'Reilly Media Inc., September 30, 2005 [cited

March 5 2012]. Available from http://oreilly.com/web2/archive/what-is-web-

20.html.

ODNI. 2007a. 100 Day Plan for Integration and Collaboration. Office of the Director of

National Intelligence. Available from http://www.fas.org/irp/dni/100-day-

plan.pdf.

ODNI. 2007b. 500 Day Plan: Integration and Collaboration. Office of the Director of

National Intelligence. Available from http://www.odni.gov/500-day-plan/500-

day-plan.pdf.

ODNI. 2007c. Intelligence Community Directive Number 651: PERFORMANCE

MANAGEMENT SYSTEM REQUIREMENTS FOR THE INTELLIGENCE

COMMUNITY CIVILIAN WORKFORCE. Washington, DC: Office of the Director

of National Intelligence. Available from

http://www.dni.gov/electronic_reading_room/ICD_651.pdf.

ODNI. 2007d. Intelligence Community Policy Memorandum Number 2007-200-2. In

2007 Intelligence Community Policy Memorandums, edited by Office of the

Director of National Intelligence: Federation of American Scientists. Available

from http://www.fas.org/irp/dni/icpm/2007-200-2.pdf.

ODNI. 2007e. Intellipedia for Diplomats. Office of the Director of National Intelligence.

Available from https://ronna-

afghan.harmonieweb.org/Lists/Submitted%20Content/Attachments/634/Intelliped

ia%20for%20Diplomats.pdf.

ODNI. 2008a. Analytic Transformation: Unleashing the Potential of a Community of

Analysts. Office of the Director of National Intelligence. Available from

http://www.dni.gov/content/AT_Digital%2020080923.pdf.

ODNI. 2008b. Critical Intelligence Community Management Challenges. Federation of

American Scientists. Available from http://www.fas.org/irp/news/2009/04/odni-

ig-1108.pdf.

ODNI. 2008c. Intelligence Community Directive Number 656: Performance Management

System Requirements for Intelligence Community Senior Civilian Officers.

Washington, DC: Office of the Director of National Intelligence. Available from

http://www.dni.gov/electronic_reading_room/ICD_656.pdf.

ODNI. 2008d. United States Intelligence Community Information Sharing Strategy.

Office of the Director of National Intelligence. Available from

http://dni.gov/reports/IC_Information_Sharing_Strategy.pdf.


ODNI. 2009a. 2009 National Intelligence: A Consumer's Guide. Washington, D.C.:

Office of the Director of National Intelligence. Available from

http://www.dni.gov/reports/IC_Consumers_Guide_2009.pdf.

ODNI. 2009b. Intelligence Community Directive Number 501: Discovery and

Dissemination or Retrieval of Information Within the Intelligence Community.

Washington, DC: Office of the Director of National Intelligence. Available from

http://www.dni.gov/electronic_reading_room/ICD_501.pdf.

ODNI. 2009c. The National Intelligence Strategy of the United States of America. Office

of the Director of National Intelligence. Available from

http://www.dni.gov/reports/2009_NIS.pdf.

ODNI. 2010. National Security Strategy. Office of the Director of National Intelligence.

Available from

http://www.whitehouse.gov/sites/default/files/rss_viewer/national_security_strate

gy.pdf.

ODNI. 2011. Strategic Intent for Information Sharing 2011-2015. Office of the Director

of National Intelligence. Available from

http://www.dni.gov/reports/11152526_strategic_intent_info_sharing.pdf.

Office of the Auditor General of Canada. 2009. Status Report of the Auditor General of

Canada to the House of Commons - National Security: Intelligence and

Information Sharing (Chapter 1). Ottawa, ON. Available from http://www.oag-

bvg.gc.ca/internet/docs/parl_oag_200903_01_e.pdf.

Olsen, Florence. 2007. Intell's wiki pied piper. Federal Computer Week, December 13,

2007. Accessed February 23, 2012. http://fcw.com/Articles/2007/12/13/Intells-

wiki-pied-piper.aspx?Page=1.

Osimo, David. Web 2.0 in Government: Why and How? Seville, Spain: Institute for

Prospective Technological Studies, 2008. Accessed March 4, 2012.

http://ftp.jrc.es/EURdoc/JRC45269.pdf.

Priest, Dana. 2006. "Top-Secret World Loses Blogger; CIA Contractor Is Fired When

Internal Post Crosses the Line: FINAL Edition". The Washington Post, A.15.

http://www.washingtonpost.com/wp-

dyn/content/article/2006/07/20/AR2006072001816.html.

Priest, Dana, and William M. Arkin. A hidden world, growing beyond control.

Washington, DC: The Washington Post, 2010.

http://projects.washingtonpost.com/top-secret-america/articles/a-hidden-world-

growing-beyond-control/print/.


Program Manager - Information Sharing Environment. ISE Information Sharing

Environment: Annual Report to The Congress. Washington, D.C.: PM-ISE, 2011.

http://ise.gov/sites/default/files/ISE_Annual_Report_to_Congress_2011.pdf.

Rasmussen, Chris. 2010. "Increasing “Jointness” and Reducing Duplication in DoD

Intelligence." CTOVision.com, (blog). October 6, 2010. Accessed March 3, 2012.

http://ctovision.com/2010/10/increasing-%E2%80%9Cjointness%E2%80%9D-

and-reducing-duplication-in-dod-intelligence/.

———. 2011. "The Limits of Informal "2.0" Collaboration and Why Changing the

Official Process Matters." Hacks, (blog). July 28, 2011. Accessed January 30,

2012. http://www.managementexchange.com/hack/limits-informal-%2220%22-

collaboration-and-why-changing-official-process-matters.

Federal Register. 2008. The President: Executive Order 13470—Further Amendments to

Executive Order 12333, United States Intelligence Activities. National Archives

and Records Administration. Available from

http://edocket.access.gpo.gov/2008/pdf/E8-17940.pdf.

Rein, Lisa. 2010. "A new batch of younger employees finding their place in federal

workforce". The Washington Post. Accessed April 5, 2012.

http://www.washingtonpost.com/wp-

dyn/content/article/2010/08/06/AR2010080606461.html?sid=ST2010080700240.

Rice, Condoleezza. 2011. Transformational Diplomacy: Shaping US Diplomatic Posture

in the 21st Century (January 18, 2006). Council on Foreign Relations [cited April

1 2011]. Available from http://www.cfr.org/us-strategy-and-

politics/transformational-diplomacy-shaping-us-diplomatic-posture-21st-

century/p9637.

Rosenberg, Barry. 2011. Agile intelligence is the name of the game in the DIA. Defense

Systems, March 29, 2011. Accessed February 27, 2012.

http://defensesystems.com/articles/2011/03/29/interview-grant-schneider-dia-

initiatives.aspx.

Schroeder, David A. 2011. Efficacy and adoption of central Web 2.0 and social software

tools in the U.S. intelligence community, Department of Security and Global

Studies, American Military University, Charles Town, WV. Available from

http://das.doit.wisc.edu/amu/Schroeder_Thesis_MAR11_Redacted.pdf.

Shane, Scott. 2007. "Logged In and Sharing Gossip, er, Intelligence". New York Times,

4.1. Accessed October 13, 2011.

http://www.nytimes.com/2007/09/02/weekinreview/02shane.html?pagewanted=al

l.


Shaughnessy, Larry. 2008. "CIA, FBI push 'Facebook for spies'". CNN.com. Accessed

October 7, 2011.

http://edition.cnn.com/2008/TECH/ptech/09/05/facebook.spies/index.html.

Singh, Timon. 2009. How social media was used during the Mumbai attacks. Next

Generation Online, November 26, 2009. Accessed March 7, 2012.

http://www.ngonlinenews.com/news/mumbai-attacks-and-social-media/.

"Social Networks/Blogs Now Account for One in Every Four and a Half Minutes

Online." 2010. nielsenwire, (blog). June 15, 2010. Accessed March 5, 2012.

http://blog.nielsen.com/nielsenwire/global/social-media-accounts-for-22-percent-

of-time-online/.

Surowiecki, James. 2004. The wisdom of crowds: why the many are smarter than the few

and how collective wisdom shapes business, economies, societies, and nations.

New York: Doubleday.

Swamy, Nikhil. 2008. Language-based enforcement of user-defined security policies: As

applied to multi-tier Web programs. Available from

http://www.proquest.com.ezproxy.lib.ucalgary.ca (publication number AAT

3324926).

Senate Committee on Homeland Security & Governmental Affairs. 2011. Ten Years After 9/11:

A Status Report On Information Sharing (Statement of Zoe Baird Budinger and

Jeffrey H. Smith). October 12, 2011. Accessed December 13, 2011.

http://www.markle.org/sites/default/files/MTFSenateTestimony_10-12-11_1.pdf.

Thompson, Clive. 2006. Open-Source Spying. The New York Times Magazine, Dec. 3,

2006, 54(L). Accessed August 28, 2011.

http://go.galegroup.com/ps/i.do?id=GALE%7CA155359817&v=2.1&u=ucalgary

&it=r&p=AONE&sw=w.

TIME. 2008. Facebook for Spies - Best Inventions of 2008 (TIME Specials). TIME.

Available from

http://www.time.com/time/specials/packages/article/0,28804,1852747_1854195_

1854171,00.html.

Tonkin, Emma, Heather D. Pfeiffer, and Greg Tourte. 2012. "Twitter, information

sharing and the London riots?" Bulletin of the American Society for Information

Science and Technology no. 38 (2):49-57. doi: 10.1002/bult.2012.1720380212.

http://dx.doi.org/10.1002/bult.2012.1720380212.


Turner, Stansfield. 1991. "INTELLIGENCE FOR A NEW WORLD ORDER." Foreign

Affairs no. 70 (4):150-166.

http://ezproxy.lib.ucalgary.ca:2048/login?url=http://search.ebscohost.com/login.a

spx?direct=true&db=afh&AN=9110142318&site=ehost-live.

U.S. Senate. 2010. Senate Committee on Homeland Security and Governmental Affairs.

The Lessons and Implications of the Christmas Day Attack: Watchlisting and Pre-

Screening. March 10. http://www.fbi.gov/news/testimony/the-lessons-and-

implications-of-the-christmas-day-attack-watchlisting-and-pre-screening.

United States Senate Committee on Homeland Security and Governmental Affairs. 2011. A Ticking Time Bomb: Counterterrorism

Lessons from the U.S. Government's Failure to Prevent the Fort Hood Attack.

Available from

http://www.hsgac.senate.gov//imo/media/doc/Fort_Hood/FortHoodReport.pdf?att

empt=2.

Valero, Rafael Enrique. 2008. Intelligence community places premium on collaboration

and job satisfaction, survey says. Government Executive, March 27, 2008.

Accessed February 23, 2012. http://www.govexec.com/pay-

benefits/2008/03/intelligence-community-places-premium-on-collaboration-and-

job-satisfaction-survey-says/26570/.

Walcott, John. 2011. Intelligence Budget Cuts Mean U.S. Will Have More Blind Spots.

BusinessWeek, November 14, 2011. Accessed March 4, 2012.

http://www.businessweek.com/news/2011-11-14/intelligence-budget-cuts-mean-

u-s-will-have-more-blind-spots.html.

"Web 2.0." 2009. In Web 2.0: From P2P and Grids to Services on the Web, edited by Ian

J. Taylor and Andrew B. Harrison, 313-337. Springer London.

Weinberger, Sharon. 2010. What is SIPRNet? Popular Mechanics, December 1, 2010.

Accessed October 23, 2011. http://www.popularmechanics.com/technology/how-

to/computer-security/what-is-siprnet-and-wikileaks-4085507.

Weinrod, W. Bruce. 1996. "U.S. Intelligence priorities in the post-Cold War era." World

Affairs no. 159 (1):3.

http://ezproxy.lib.ucalgary.ca:2048/login?url=http://search.ebscohost.com/login.a

spx?direct=true&db=afh&AN=9609281352&site=ehost-live.

Wertheimer, Michael. 2008. "Arming intelligence with Web 2.0." Gazette no. 70 (3).

April 8, 2012. http://www.rcmp-grc.gc.ca/gazette/vol70n3/2-0-eng.htm.

Witzel, Morgen. 2003. "Clayton Christensen: Lessons of an innovator." FT.com (Journal

Article):1. http://ezproxy.lib.ucalgary.ca:2048/login?url=http://proquest.umi.com


.ezproxy.lib.ucalgary.ca/pqdweb?did=389040721&Fmt=3&clientId=12303

&RQT=309&VName=PQD.

Zakaria, Tabassum. 2012. "U.S. spies to rely more on allies due to budget austerity".

Reuters, January 26, 2012. Accessed April 10, 2012.

http://www.reuters.com/article/2012/01/26/us-usa-intelligence-

idUSTRE80P24U20120126.

Zegart, Amy B. 2005. "September 11 and the Adaptation Failure of U.S. Intelligence

Agencies." International Security no. 29 (4):78-111.

http://www.jstor.org/stable/4137498.

Zerbisias, Antonia. 2010. "Coverage of the G20 proved Twitter's news edge".

thestar.com, July 11, 2010. Accessed March 27, 2012.

http://www.thestar.com/news/insight/article/834367--coverage-of-the-g20-

proved-twitter-s-news-edge.

Zetter, Kim. 2011. Army: Manning Snuck ‘Data-Mining’ Software Onto Secret Network.

Wired, April 4, 2011. http://www.wired.com/threatlevel/2011/04/manning-data-

mining/.


APPENDIX A: THE U.S. INTELLIGENCE CYCLE

Credit:

ODNI. 2011. U.S. National Intelligence: An Overview 2011. Pentagon Library. Available

from http://www.whs.mil/library/IC_Consumers_Guide_2011.pdf, pp. 10-12.

The Six Steps in the Intelligence Cycle

The Intelligence Cycle is the process of developing raw information into finished

intelligence for use by policymakers, military commanders, and other consumers in

decisionmaking. This six-step cyclical process is highly dynamic, continuous, and never-

ending. The sixth step, evaluation (which includes soliciting feedback from users), is

conducted for each of the other five steps individually and for the Intelligence Cycle as a

whole.

1. PLANNING AND DIRECTION: Establish the consumer’s intelligence

requirements and plan intelligence activities accordingly.

The planning and direction step sets the stage for the Intelligence Cycle. It is the

springboard from which all Intelligence Cycle activities are launched. Oftentimes, the

direction part of the step precedes the planning part. Generally, in such cases, the

consumer has a requirement for a specific product. That product may be a full report, a

graphic image, or raw information that is collected, processed, and disseminated, but

skips the analysis and production step. Given the customer’s requirement, the intelligence

organization tasked with generating the product will then plan its Intelligence Cycle

activities.

2. COLLECTION: Gather the raw data required to produce the finished product.

Data collection is performed to gather raw data related to the five basic intelligence

sources (Geospatial Intelligence [GEOINT], Human Intelligence [HUMINT],

Measurement and Signature Intelligence [MASINT], Open-Source Intelligence [OSINT],

and Signals Intelligence [SIGINT]). The sources of the raw data may include, but are not

limited to, news reports, aerial imagery, satellite imagery, and government and public

documents.

3. PROCESSING AND EXPLOITATION: Convert the raw data into a

comprehensible format that is usable for production of the finished product.

The processing and exploitation step (see the Glossary of Terms for a definition of

“exploitation”) involves the use of highly trained and specialized personnel and

technologically sophisticated equipment to turn the raw data into usable and

understandable information. Data translation, data decryption, and interpretation of

filmed images and other imagery are only a few of the processes used for converting data


stored on film, magnetic, or other media into information ready for analysis and

production.

4. ANALYSIS AND PRODUCTION: Integrate, evaluate, analyze, and prepare the

processed information for inclusion in the finished product.

The analysis and production step also requires highly trained and specialized personnel

(in this case, analysts) to give meaning to the processed information and to prioritize it

against known requirements. Synthesizing the processed information into a finished,

actionable (see the Glossary of Terms for a definition of “actionable”) intelligence

product enables the information to be useful to the customer. Note that, in some cases, the

Intelligence Cycle may skip this step (for example, when the consumer needs only

specific reported information or products such as raw imagery). This was the case during

the Cuban Missile Crisis (October 1962) when President Kennedy needed only the actual

number of pieces of Soviet equipment in Cuba and facts concerning reports on observed

Soviet activity with no analysis of that information.

5. DISSEMINATION: Deliver the finished product to the consumer that requested it

and to others as applicable.

The consumer that requested the information receives the finished product, usually via

electronic transmission. Dissemination of the information typically is accomplished

through such means as websites, email, Web 2.0 collaboration tools, and hardcopy

distribution. The final, finished product is referred to as “finished intelligence.” After the

product is disseminated, further gaps in the intelligence may be identified, and the

Intelligence Cycle begins all over again.

6. EVALUATION: Continually acquire feedback during the Intelligence Cycle and

evaluate that feedback to refine each individual step and the cycle as a whole.

Constant evaluation and feedback from consumers are extremely important to enabling

those involved in the Intelligence Cycle to adjust and refine their activities and analysis to

better meet consumers’ changing and evolving information needs.
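
For readers who approach the cycle from a systems perspective, the six steps can also be sketched as a simple feedback loop. The short Python sketch below is illustrative only: the Step enum, run_cycle function, and evaluate callback are hypothetical names introduced here for illustration, not part of any USIC system. It simply mirrors the description above, in which evaluation is applied to each step individually and to the cycle as a whole.

```python
from enum import Enum, auto

class Step(Enum):
    """The first five steps of the cycle; evaluation (step 6) is the hook below."""
    PLANNING_AND_DIRECTION = auto()
    COLLECTION = auto()
    PROCESSING_AND_EXPLOITATION = auto()
    ANALYSIS_AND_PRODUCTION = auto()
    DISSEMINATION = auto()

def run_cycle(requirement, evaluate):
    """Run one pass of the cycle for a consumer requirement.

    `evaluate` models step 6: it is called after every individual step and
    once more for the cycle as a whole (passed step=None).
    """
    product = requirement
    for step in Step:
        # Placeholder for the real work performed at each step.
        product = {"step": step.name, "input": product}
        evaluate(step, product)      # per-step feedback
    evaluate(None, product)          # whole-cycle feedback
    return product

if __name__ == "__main__":
    run_cycle(
        "consumer requirement",
        lambda step, product: print("evaluate:", step.name if step else "FULL CYCLE"),
    )
```

Because the loop returns the finished product and the cycle "begins all over again" once new gaps are identified, the function would in practice be called repeatedly with refined requirements.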


APPENDIX B: STRUCTURE OF THE USIC

“The structure of the USIC” (2010)

Credit: Johnson, Loch K. 2010. The Oxford handbook of national security intelligence.

Oxford: Oxford University Press. 8.


APPENDIX C: GOOGLE LIVING STORIES

Screenshot of Google Living Stories software

(Credit: Iris Pro Services, 2009: http://www.irisproservices.com/img/living_stories.gif).


APPENDIX D: USIC SOCIAL COMPUTING TOOLS

Each tool is assessed against five criteria: discoverable information, grassroots content provisioning, workflow integration, champions, and performance and incentives.

Intellipedia
  Discoverable information: Information is searchable and widely available for all users; content is both neutral and partisan.
  Grassroots content provisioning: All registered users are able to contribute to topics of interest.
  Workflow integration: None, although Intellipublia is an upgrade intended to include workflow integration.
  Champions: Management supports user contributions; veteran users serve as examples for new users of the wiki.
  Performance and incentives: Regular users who contribute high-quality content are recognized and rewarded.

A-Space
  Discoverable information: Databases and social circles are searchable; contact information, skills, and interests are findable across participating agencies (operates only on JWICS; will expand to other Intelink networks).
  Grassroots content provisioning: Users create conversations and collaborate on documents specific to their group participation; users can join multiple social circles.
  Workflow integration: No workflow integration; lack of managerial or organizational expectations to use the tool. Usage is optional.
  Champions: Power users are identified; fruitful discussions and documents are usually championed by veteran users or managers with specific intent to use A-Space as the platform for collaboration.
  Performance and incentives: Managerial recognition for contributions is known; performance measurements are absent for users, as use is not regularly expected or mandatory.

Inteldocs
  Discoverable information: Documents are searchable via Intelink; links are posted from A-Space, Intellipedia, and collaborative workspace sites.
  Grassroots content provisioning: Users can create their own documents.
  Workflow integration: No known workflow integration in the intelligence production cycle.
  Champions: N/A
  Performance and incentives: N/A

iNews
  Discoverable information: Links to documents and articles are found via searchable metadata.
  Grassroots content provisioning: RSS feeds are the most popular method of distributing intelligence around the IC; over 5,000 feeds exist.
  Workflow integration: N/A
  Champions: N/A
  Performance and incentives: N/A

Blogs
  Discoverable information: Blogs are available to browse and search on all three network levels; WordPress functionality is in use among other blog services.
  Grassroots content provisioning: IC users are able to start their own blogs via Intelink and collaborative workspace functionality.
  Workflow integration: Blogs are peripheral in the IC; no workflow integration into the production cycle.
  Champions: Some blogs are popular among the IC community (e.g., CG-LIMS and the "Living Intelligence" blogs).
  Performance and incentives: N/A

Microblogging (eChirp)
  Discoverable information: Microblog posts are available for IC users to read; increases "situational awareness" on events and topics.
  Grassroots content provisioning: Users create their own microblog posts; the functionality is designed for user content creation.
  Workflow integration: Not applicable.
  Champions: N/A
  Performance and incentives: N/A

tag|Connect
  Discoverable information: Bookmarks can be viewed by other IC users; content is meta-tagged and visible via like-minded bookmark lists on Intelink.
  Grassroots content provisioning: Users create their own bookmarks.
  Workflow integration: N/A
  Champions: N/A
  Performance and incentives: N/A

IC Connect (web conferencing)
  Discoverable information: Web conferencing uses Adobe Connect; document content management is available to groups and conference participants.
  Grassroots content provisioning: Document management and collaboration can be done in real time.
  Workflow integration: Likely used during intelligence production, but not a mandatory workflow step.
  Champions: N/A
  Performance and incentives: N/A

Collaborative Workspaces
  Discoverable information: Shared-space web hosting (e.g., SharePoint) provides digital workspaces for members of the workspace; spaces are both public and private.
  Grassroots content provisioning: Users of each space can collaborate, communicate, and share information as necessary; workspaces may or may not be managerially directed.
  Workflow integration: May be used as a space to produce intelligence among project groups; direction may come from USIC management.
  Champions: N/A
  Performance and incentives: N/A

Gallery
  Discoverable information: A service similar to Flickr; content is searchable and metatagged for ease of discoverability.
  Grassroots content provisioning: Images can be uploaded and embedded in other social tools (A-Space, Intellipedia); cross-tool functionality.
  Workflow integration: Not integrated into the intelligence production cycle.
  Champions: Users can comment on and rate images; users with higher reputation are considered champions or reputable users.
  Performance and incentives: N/A

iVideo
  Discoverable information: A service similar to YouTube; content is searchable and metatagged for ease of discoverability.
  Grassroots content provisioning: Videos can be uploaded and embedded in other social tools (A-Space, Intellipedia); cross-tool functionality.
  Workflow integration: Not integrated into the intelligence production cycle.
  Champions: Users can comment on and rate videos; users with higher reputation are considered champions or reputable users.
  Performance and incentives: N/A
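
The five assessment criteria above can also be captured as a small data structure, which makes the comparison easier to reuse in further analysis. The sketch below is purely illustrative: the ToolAssessment class and its field names are hypothetical, introduced here only to show one way of encoding a row of this appendix; it is populated with the Intellipedia entry as an example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ToolAssessment:
    """One row of the Appendix D comparison of USIC social computing tools."""
    name: str
    discoverable_information: str
    grassroots_provisioning: str
    workflow_integration: Optional[str]   # None where the table records "N/A"
    champions: Optional[str]
    performance_incentives: Optional[str]

# Example instance, taken from the Intellipedia entry above.
intellipedia = ToolAssessment(
    name="Intellipedia",
    discoverable_information="Searchable and widely available; content is both neutral and partisan",
    grassroots_provisioning="All registered users can contribute to topics of interest",
    workflow_integration="None (Intellipublia upgrade intended to add workflow integration)",
    champions="Management supports contributions; veteran users model behaviour for new users",
    performance_incentives="High-quality contributors are recognized and rewarded",
)

# A quick way to list tools that lack workflow integration in the production cycle.
tools = [intellipedia]
print([t.name for t in tools
       if t.workflow_integration is None or t.workflow_integration.startswith("None")])
```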