CrossTalk: The Journal of Defense Software Engineering. Volume 21, Number 9


Report Documentation Page Form Approved OMB No. 0704-0188

Public reporting burden for the collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to Washington Headquarters Services, Directorate for Information Operations and Reports, 1215 Jefferson Davis Highway, Suite 1204, Arlington VA 22202-4302. Respondents should be aware that notwithstanding any other provision of law, no person shall be subject to a penalty for failing to comply with a collection of information if it does not display a currently valid OMB control number.

1. REPORT DATE JAN 2008 2. REPORT TYPE

3. DATES COVERED 00-00-2008 to 00-00-2008

4. TITLE AND SUBTITLE CrossTalk: The Journal of Defense Software Engineering. Volume 21, Number 1, January 2008

5a. CONTRACT NUMBER

5b. GRANT NUMBER

5c. PROGRAM ELEMENT NUMBER

6. AUTHOR(S) 5d. PROJECT NUMBER

5e. TASK NUMBER

5f. WORK UNIT NUMBER

7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES) OO-ALC/MASE, 6022 Fir Ave, Hill AFB, UT 84056-5820

8. PERFORMING ORGANIZATION REPORT NUMBER

9. SPONSORING/MONITORING AGENCY NAME(S) AND ADDRESS(ES) 10. SPONSOR/MONITOR’S ACRONYM(S)

11. SPONSOR/MONITOR’S REPORT NUMBER(S)

12. DISTRIBUTION/AVAILABILITY STATEMENT Approved for public release; distribution unlimited

13. SUPPLEMENTARY NOTES

14. ABSTRACT

15. SUBJECT TERMS

16. SECURITY CLASSIFICATION OF: 17. LIMITATION OF ABSTRACT Same as Report (SAR)

18. NUMBER OF PAGES

32

19a. NAME OF RESPONSIBLE PERSON

a. REPORT unclassified

b. ABSTRACT unclassified

c. THIS PAGE unclassified

Standard Form 298 (Rev. 8-98) Prescribed by ANSI Std Z39-18

2 CROSSTALK The Journal of Defense Software Engineering January 2008


Announcing CrossTalk’s Co-Sponsor Team for 2008
Meet CrossTalk’s 2008 co-sponsors.

The 2008 CrossTalk Editorial Board
Here is a list of CrossTalk’s article reviewers.

The Critical Need for Software Engineering Education
Long describes the need for more dedicated software engineering educational programs and professional software engineering certification programs in the United States.
by Dr. Lyle N. Long

Using Inspections to Teach Requirements Validation
This article describes an experiment conducted in a graduate-level requirements engineering course to provide students a real-world experience in requirements validation.
by Lulu He, Dr. Jeffrey C. Carver, and Dr. Rayford B. Vaughn

Integrating Software Assurance Knowledge Into Conventional Curricula
This article discusses the results of a comparison of the Common Body of Knowledge for Secure Software Assurance with traditional computing disciplines.
by Dr. Nancy R. Mead, Dr. Dan Shoemaker, and Jeffrey A. Ingalsbe

Software Engineering Continuing Education at a Price You Can Afford
Maj. Bohn gives detailed information about the Air Force Institute of Technology’s Software Professional Development Program.
by Maj Christopher Bohn, Ph.D.

How to Avoid Software Inspection Failure and Achieve Ongoing Benefits
This article describes the proven benefits of inspections and argues that those benefits are too significant to let inspections fall by the wayside.
by Roger Stewart and Lew Priven

Computer Science Education: Where Are the Software Engineers of Tomorrow?
This article briefly examines the set of programming skills that should be part of every software professional’s repertoire.
by Dr. Robert B.K. Dewar and Dr. Edmond Schonberg

Policies, News, and Updates

Software Engineering Technology

Training and Education


Departments

From the Sponsor

Coming Events
Letter to the Editor

SSTC 2008 Ad

BackTalk
Cover Design by Kent Bingham

ON THE COVER

Additional art services provided by Janna Jensen

Open Forum

CrossTalk CO-SPONSORS:

DOD-CIO: The Honorable John Grimes
OSD (AT&L): Kristen Baldwin
NAVAIR: Jeff Schwalb
76 SMXG: Kevin Stamey
309 SMXG: Norman LeClair
DHS: Joe Jarzombek

STAFF:

MANAGING DIRECTOR: Brent Baxter
PUBLISHER: Elizabeth Starrett
MANAGING EDITOR: Kase Johnstun
ASSOCIATE EDITOR: Chelene Fortier-Lozancich
ARTICLE COORDINATOR: Nicole Kentta
PHONE / E-MAIL: (801) [email protected]
CROSSTALK ONLINE: www.stsc.hill.af.mil/crosstalk

CrossTalk, The Journal of Defense Software Engineering is co-sponsored by the Department of Defense Chief Information Office (DoD-CIO); the Office of the Secretary of Defense (OSD) Acquisition, Technology and Logistics (AT&L); U.S. Navy (USN); U.S. Air Force (USAF); and the U.S. Department of Homeland Security (DHS). DoD-CIO co-sponsor: Assistant Secretary of Defense (Networks and Information Integration). OSD (AT&L) co-sponsor: Software Engineering and System Assurance. USN co-sponsor: Naval Air Systems Command. USAF co-sponsors: Oklahoma City-Air Logistics Center (ALC) 76 Software Maintenance Group (SMXG); and Ogden-ALC 309 SMXG. DHS co-sponsor: National Cyber Security Division of the Office of Infrastructure Protection.

The USAF Software Technology Support Center (STSC) is the publisher of CrossTalk, providing both editorial oversight and technical review of the journal. CrossTalk’s mission is to encourage the engineering development of software to improve the reliability, sustainability, and responsiveness of our warfighting capability.

Subscriptions: Send correspondence concerning subscriptions and changes of address to the following address. You may e-mail us or use the form on p. 30.

517 SMXS/MXDEA
6022 Fir Ave
Bldg 1238
Hill AFB, UT 84056-5820

Article Submissions: We welcome articles of interest to the defense software community. Articles must be approved by the CrossTalk editorial board prior to publication. Please follow the Author Guidelines, available at <www.stsc.hill.af.mil/crosstalk/xtlkguid.pdf>. CrossTalk does not pay for submissions. Articles published in CrossTalk remain the property of the authors and may be submitted to other publications.

Reprints: Permission to reprint or post articles must be requested from the author or the copyright holder and coordinated with CrossTalk.

Trademarks and Endorsements: This Department of Defense (DoD) journal is an authorized publication for members of the DoD. Contents of CrossTalk are not necessarily the official views of, or endorsed by, the U.S. government, the DoD, the co-sponsors, or the STSC. All product names referenced in this issue are trademarks of their companies.

CrossTalk Online Services: See <www.stsc.hill.af.mil/crosstalk>, call (801) 777-0857 or e-mail <[email protected]>.

Back Issues Available: Please phone or e-mail us to see if back issues are available free of charge.

January 2008 www.stsc.hill.af.mil 3

From the Sponsor

One of my favorite resources is Webster’s 1828 Dictionary. Unlike any of our modern dictionaries, the 1828 version provides a deeper understanding of words. I thought the 1828 definition of education was quite fitting for this month’s theme. Webster’s says that education comprehends all that series of instruction and discipline which is intended to enlighten understanding … and fit them for usefulness in their future stations.

A primary question this issue attempts to highlight is: do our educational institutions in fact prepare software professionals to be successful in their future stations? In October 2006, a study conducted by the Center for Strategic and International Studies revealed a significant shortfall in the number of software professionals that had been formally educated in software project management. The study indicated that our industry is lacking in program managers, software architects, systems engineers, and domain experts. Many industry experts agree that there is an adequate supply of programmers, but the pool of these critical senior managers and system experts is very limited.

As Department of Defense (DoD) systems become more and more software intensive, software developments get bigger, more complicated, and more dependent on senior software professionals to get the project on the right path and keep it there. Since studies seem to indicate that we are falling behind in our attempt to educate certain pockets of critical expertise, and our software projects are in greater need of these same professionals, our problem is getting worse. I hope this month’s articles will inspire you to think about how you will address this trend in your organization.

For a more thorough discussion of this trend from the perspective of aerospace engineering, I hope you will read The Critical Need for Software Engineering Education by Dr. Lyle N. Long. You will find more specific software educational issues addressed in the two articles that follow. In Using Inspections to Teach Requirements Validation by Lulu He, Dr. Jeffrey C. Carver, and Dr. Rayford B. Vaughn, the authors share an experiment that compares the use of both checklists and Perspective-Based Reading to aid in a requirements validation exercise. In Integrating Software Assurance Knowledge Into Conventional Curricula by Dr. Nancy R. Mead, Dr. Dan Shoemaker, and Jeffrey A. Ingalsbe, the authors compare the contents of the Common Body of Knowledge for Secure Software Assurance with the Computing Curricula 2005: The Overview Report. Their intent is to map our security needs with software-related curricula being recommended for our schools.

One of the options available to assist DoD readers with required software training is the software curricula available through the Air Force Institute of Technology (AFIT); you can read more about AFIT in Software Engineering Continuing Education at a Price You Can Afford by Maj Christopher Bohn, Ph.D. You will find another thought-provoking article from Roger Stewart and Lew Priven with their discussion of successfully implementing software inspections in How to Avoid Software Inspection Failure and Achieve Ongoing Quality, Cost, and Schedule Benefits. We conclude this month’s issue with an insightful article by Dr. Robert B.K. Dewar and Dr. Edmond Schonberg entitled Computer Science Education: Where Are the Software Engineers of Tomorrow?

It should be clear throughout the software community that education does not end once a graduate has a bachelor’s degree. The technology we work with continues to grow at an astounding rate. Even if additional degrees are not sought, all professionals need additional education to keep current with the needs of the software community. As a co-sponsor for CrossTalk, I am happy to provide one additional source for this education.

Training and Education

Kevin Stamey
Oklahoma City Air Logistics Center, Co-Sponsor


Policies, News, and Updates

The Honorable John Grimes, Department of Defense – Chief Information Officer

The Assistant Secretary of Defense for Networks and Information Integration for the Department of Defense Chief Information Officer (ASD[NII]/DoD-CIO) is the principal staff assistant and advisor to the Secretary on networks and net-centric policies and concepts, command and control, communications, non-intelligence space matters, enterprise-wide integration of DoD information matters, and Information Technology. Additionally, the DoD-CIO has responsibilities for integrating information and related activities and services across the DoD. The mission of the organization is to enable Net-Centric operations. NII/CIO is leading the Information Age transformation that will enhance the DoD’s efficiency and effectiveness by establishing an Information on Demand capability. See <www.dod.mil/cio-nii> for more information.

Kristen Baldwin, Office of the Secretary of Defense – Acquisition, Technology and Logistics

The Office of the Secretary of Defense for Acquisition, Technology and Logistics (OSD[AT&L]) Software Engineering and Systems Assurance is the staff agent responsible for all matters relating to DoD software engineering, systems assurance, system of systems (SoS) engineering, and Capability Maturity Model Integration. Organizational focus areas include policy, guidance, education and training, acquisition program support, software acquisition management and development techniques, software and systems engineering integration, SoS enablers and best practices, engineering for system assurance, and government-industry collaboration. See <www.acq.osd.mil/sse/ssa> for more information.

Terry Clark, NAVAIR, Systems Engineering Department – Director, Software Engineering

The Naval Air Systems Command (NAVAIR) has three Strategic Priorities through which it produces results for the Sailor and the Marine. First are its People, who are developed and provided the tools, infrastructure, and processes needed to do their work effectively. Next is Current Readiness, which delivers naval aviation units ready for tasking with the right capability, at the right time, and at the right cost. Finally, Future Capability is the delivery of new aircraft, weapons, and systems on time and within budget that meet Fleet needs, providing a technological edge over adversaries. See <www.navair.navy.mil> for more information.

Kevin Stamey, 76 SMXG Director

The 76th Software Maintenance Group at the Oklahoma City-Air Logistics Center is a leader in the avionics software industry that understands total system integration. The center has a proven record of producing software on time, on budget, and defect-free. Its people provide the expertise, software, weapons, interface, and aircraft systems that are fully integrated to ensure dependable war-winning capabilities. Its areas of expertise include navigation, radar, weapons and system integration, systems engineering, operational flight software, automatic test equipment, and more. For more information, see <www.bringittotinker.com>.

Norman LeClair, 309 SMXG Acting Director

The 309th Software Maintenance Group at the Ogden-Air Logistics Center is a recognized world leader in cradle-to-grave systems support, encompassing hardware engineering, software engineering, systems engineering, data management, consulting, and much more. The division is a Software Engineering Institute Software Capability Maturity Model® (CMM®) Integration Level 5 organization with Team Software ProcessSM engineers. Their accreditations also include AS 9100 and ISO 9000. See <www.mas.hill.af.mil> for more information.

Joe Jarzombek, Department of Homeland Security – Director of Software Assurance

The DHS National Cyber Security Division serves as a focal point for software assurance (SwA), facilitating national public-private efforts to promulgate best practices and methodologies that promote integrity, security, and reliability in software development and acquisition. Collaborative efforts of the SwA community have produced several publicly available online resources. For more information, see the Build Security In Web site <https://buildsecurityin.us-cert.gov>, which is expanding to become the SwA Community of Practice portal <www.us-cert.gov/swa> to provide coverage of topics relevant to the broader stakeholder community.

Announcing CrossTalk’s Co-Sponsor Team for 2008

Elizabeth Starrett
CrossTalk

For more information about becoming a CrossTalk co-sponsor, please contact Elizabeth Starrett at (801) 775-4158 or <[email protected]>.

® Capability Maturity Model and CMM are registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.

SM Team Software Process is a service mark of Carnegie Mellon University.

I wish to express my continuing gratitude for the support CrossTalk’s co-sponsors again provide in 2008. I know our readers and staff appreciate the benefits provided to the software community by the information made available as a result of their sponsorship. Co-sponsor team members are identified below with a description of their organization. Please look for their contributions each month in our From the Sponsor column found on page 3. Their organizations will also be highlighted on the back cover of each issue of CrossTalk.


National Defense University
Electronic Systems Center Engineering Directorate
Software Technology Support Center
Software Technology Support Center
Humans and Technology
Retired (formerly with Microsoft)
The AEgis Technologies Group, Inc.
Software Technology Support Center
Information Technology Standards and Solutions
Software Engineering Institute
Auburn University
Software Technology Support Center
Olin Institute for Strategic Studies, Harvard University
Software Technology Support Center
GAITS, Inc.
Priority Technologies, Inc.
Software Technology Support Center
Raytheon
Software Technology Support Center
U.S. Marine Corps
Northrop Grumman
519th Combat Sustainment Squadron
Software Technology Support Center
PEM Systems
Raytheon Integrated Defense Systems
Science Applications International Corporation
L-3 Communications, Inc.
Army PEO Simulation, Training and Instrumentation
Battle Control System-Fixed Sustainment Office
Arrowpoint Solutions, Inc.
SabiOso
309th Software Maintenance Group
Shim Enterprise, Inc.
Software Technology Support Center
Software Technology Support Center
Software Technology Support Center
309th Software Maintenance Group
Defense Advanced Research Projects Agency
Software Technology Support Center
Lockheed Martin Integrated Systems and Solutions
Charles Stark Draper Laboratory
Mississippi State University
309th Software Maintenance Group
Software Technology Support Center
Software Engineering Institute

The 2008 CrossTalk Editorial Board

Elizabeth Starrett
CrossTalk

CrossTalk proudly presents the 2008 CrossTalk Editorial Board. Each article submitted to CrossTalk is reviewed by two technical reviewers from the list below in addition to me, CrossTalk’s publisher. The insights from the board improve the readability and usefulness of the articles that are published in CrossTalk. Most reviewers listed have graciously offered their own time to support CrossTalk’s technical review process. We give very special thanks to all those participating in our 2008 CrossTalk Editorial Board.

COL Ken Alford, Ph.D.
Bruce Allgood
Brent Baxter
Jim Belford
Dr. Alistair Cockburn
Richard Conn
Dr. David A. Cook
Les Dupaix
Sally Edwards
Robert W. Ferguson
Dr. John A. “Drew” Hamilton Jr.
Tony Henderson
Lt. Col. Brian Hermann, Ph.D.
Thayne Hill
George Jackelen, PMP
Deb Jacobs
Dr. Randall Jensen
Alan C. Jost
Daniel Keth
Paul Kimmerly
Theron Leishman
Glen L. Luke
Gabriel Mata
Paul McMahon
Dr. Max H. Miller
Mike Olsem
Glenn Palmer
Doug J. Parsons
Tim Perkins
Gary A. Petersen
Vern Phipps
David Putman
Kevin Richins
Thom Rodgers
Larry Smith
Elizabeth Starrett
Tracy Stauder
COL John “Buck” Surdu, Ph.D.
Kasey Thompson
Dr. Will Tracz
Jim Van Buren
Dr. Rayford B. Vaughn Jr.
David R. Webb
Mark Woolsey
David Zubrow

Software is everywhere, from cell phones to large military systems. According to the National Academy of Science [1], “... software is not merely an essential market commodity but, in fact, embodies the economy’s production function itself.” The National Institute of Standards and Technology (NIST) estimates that software errors cost the U.S. economy $59.5 billion a year and software sales accounted for $180 billion [2]. Software engineering education does not get the attention it deserves, even though it is crucially important to our economy. The issues related to software engineering as a discipline, and the debates which have occurred over the years, are not new and are described in [3, 4, 5].

The list of software disasters grows each year. Some of the best-known include the following: the Ariane 5 rocket (Flight 501) [6, 7], the Federal Bureau of Investigation Virtual Case File system [8], the Federal Aviation Administration Advanced Automation System [7, 9], the California Department of Motor Vehicles system, the American Airlines reservation system, and many, many more [7, 10]. The F-22 aircraft also had problems initially due to its complex software systems. Software disasters cost the United States billions of dollars every year, and this may only get worse since future systems will be more complex. Boeing spent roughly $800 million on software for the 777, and they might need to spend five times that on the 787 [11]. Aerospace systems will also include some levels of autonomy, accompanied by an entirely new level of software complexity. To help prevent future disasters, we must have more software engineers trained in rigorous technical programs that are on par with other engineering programs. The United States should not wait until there is a disaster that causes large numbers of human casualties before it acts. We do not currently have enough software engineers. We need to educate many more in the near future, especially considering the large group of engineers that will be retiring in the next 10 years [12]. In 2005, the average age of an aerospace engineer was 54 [13]. In addition, more than 26 percent of the aerospace workers will be eligible for retirement in 2008 [14].
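Several of the failures listed above trace back to a single unguarded numeric operation; the Ariane 5 Flight 501 loss [6, 7], for example, is widely attributed to an unprotected conversion of a 64-bit floating-point value into a 16-bit signed integer. The actual flight code was written in Ada; the Python sketch below (with invented function names and illustrative values, not the real flight software) only demonstrates the class of error:

```python
# Sketch of the error class behind Ariane 5 Flight 501: a floating-point
# value too large for a 16-bit signed integer. Names and values here are
# hypothetical illustrations only.
from ctypes import c_int16

def unchecked_convert(x: float) -> int:
    """Mimics an unprotected float-to-int16 conversion (silently wraps)."""
    return c_int16(int(x)).value

def checked_convert(x: float) -> int:
    """A guarded conversion that fails loudly instead of corrupting data."""
    v = int(x)
    if not -32768 <= v <= 32767:
        raise OverflowError(f"{x} does not fit in a signed 16-bit integer")
    return v

print(unchecked_convert(20000.0))  # in range: 20000
print(unchecked_convert(40000.0))  # wraps to a nonsense negative value
```

The point is not the language but the discipline: a trained software engineer treats every such conversion as a hazard to be guarded or proven safe.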

Importance of Software in Aerospace Systems

The aerospace industry provides roughly $900 billion in economic activity, accounts for more than 15 percent of the gross domestic product, and supports more than 15 million high-quality jobs in the United States [14]. These aerospace systems rely heavily on software, which has been called the Achilles Heel of aerospace systems. There are numerous anecdotes and examples that illustrate the importance of computing and software in aerospace. For example:

• The Boeing 777 has 1,280 onboard processors that use more than four million lines of software; Ada accounts for 99.5 percent of this [15, 16].
• The F-22 has more than two million lines of software onboard; between 80 and 85 percent is in Ada [17].
• Some Blackhawk helicopters have almost 2,000 pounds of wire connecting all the computers and sensors.
• The wiring harness is often more complex and more difficult to design than the aircraft structure.
• Some aircraft cannot fly without their onboard computers (e.g., F-16 and F-117).
• The air traffic control system relies heavily on computers, software, and communications.
• Interplanetary robotics and spacecraft perform amazing feats, often in extreme environments.
• Autonomous, intelligent, unmanned vehicles will be even less deterministic than current systems.
• Computers are also important in the design and analysis of aerospace systems. Often this means using high-performance, massively parallel computer systems.
• Communication systems are critically important for aircraft and spacecraft; this now includes computer networking onboard, to the ground, and to other aerospace vehicles.
• Modern aircraft and spacecraft seldom work alone – they are usually part of a system of systems.

One way to measure the need for software engineers in the aerospace field is to research existing job opportunities. An Oct. 2006 review of the Lockheed Martin Corporation Web site showed they had 536 job openings for recent graduates, including 68 openings (13 percent) in software engineering and four in aerospace engineering. Most of the aerospace engineering job openings were for structural engineers capable of performing finite element analyses. It should be noted that they do hire aerospace engineers in other areas as well (for example, aerospace control experts are sometimes listed under embedded systems). The Boeing employment Web page gave similar results. They had 298 job openings related to software. There were only three jobs that mentioned aerodynamics (none of which were actually jobs for aerodynamicists). When searching the Boeing site for aerospace engineer, it returned a listing of six open positions in structural engineering. This is probably an indication that aerospace engineering educational programs are concentrating too much on the applied physics of aerospace engineering and not enough on computing and software. We need to work with industry and the government to redefine aerospace engineering. We need to educate students capable of designing and building the new aerospace systems that we will need in the future – which will be dominated by computing, networking, and information systems.

The Critical Need for Software Engineering Education

Dr. Lyle N. Long, The Pennsylvania State University

Software affects almost every aspect of our daily lives (manufacturing, banking, travel, communications, defense, medicine, research, government, education, entertainment, law, etc.). It is an essential part of our military systems, and it is used throughout the civilian sector, including safety-critical and mission-critical systems. In addition, the complexity of many of these systems has been growing exponentially. Unfortunately, the U.S. higher education system has not kept pace with these needs. Existing undergraduate and graduate science and engineering programs need to incorporate more material on software engineering. This is especially true for aerospace engineering, since those systems rely heavily on computation, information, communications, and software. In addition, the United States needs more dedicated software engineering educational programs and professional software engineering certification programs.

Software Engineering Defined

The Institute of Electrical and Electronics Engineers (IEEE) defines software engineering [3] as “the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software.” A good summary of software engineering can be found in [18].

Software systems are some of the most complicated things humans have ever created. To design and build them, one needs to follow processes and procedures typical of other engineering disciplines [6, 19]. First, the requirements need to be carefully defined. Then the architecture of the software system needs to be developed. Once the requirements and architecture are defined, one can begin code development. The code then needs verification, validation, and testing. There are many ways of accomplishing all of these steps, which are related to the type of life-cycle model used and the type of system developed. In addition, one needs to consider how to estimate costs, how to manage the people, and how to monitor the ethical responsibilities of the team. This is not unlike the steps required to design and build any complex system (e.g., bridges, aircraft, and computer hardware). The actual code development or programming can be a fairly small portion of the process [6].
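The verification step in that process can be made concrete with a minimal sketch. The requirement ID, threshold, and function below are invented for illustration; the point is only that a test should trace directly back to a stated requirement:

```python
# A deliberately tiny sketch of requirements-driven verification.
# REQ-001, the 300-knot limit, and the function are hypothetical.

MAX_DEPLOY_SPEED_KNOTS = 300  # from hypothetical requirement REQ-001

def gear_deploy_permitted(airspeed_knots: float) -> bool:
    """REQ-001: landing-gear deployment is permitted only below 300 knots."""
    return airspeed_knots < MAX_DEPLOY_SPEED_KNOTS

# Verification: each test case is traceable to the requirement it checks.
def test_req_001():
    assert gear_deploy_permitted(250.0)          # nominal case
    assert not gear_deploy_permitted(300.0)      # boundary: exactly at limit
    assert not gear_deploy_permitted(450.0)      # well above limit

test_req_001()
print("REQ-001 verification checks passed")
```

Even in this toy form, the discipline is visible: the requirement is written down first, the code implements it, and the tests exercise the nominal, boundary, and failure cases against it.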

Most engineers and scientists do not fully appreciate or understand software engineering. Even high school students think they can do software after they learn the basics of Java or C++ syntax. All too often, software engineering is equated with programming. This is like equating civil engineering with pouring concrete. Many people can pour concrete, but few are civil engineers who can build large, technically inspired masterpieces. Likewise, many people can program, but few can develop large software masterpieces. It is not uncommon to hear people arguing about the merits or drawbacks of the different computer languages, even though they are not well versed on the various languages. Often, they simply like the language that they grew up with and do not appreciate or understand the others. These misconceptions are especially apparent in discussions regarding Ada [20], which is still probably the best language to use for mission- or safety-critical systems. In reality, people who develop code without sound software engineering approaches are merely hackers. Of course, programmers are an essential part of software engineering, and talented programmers are quite rare and extremely valuable.

Software Engineering Education

Both the U.S. economy and national defense depend upon software, but many of these large software systems are being developed by people who have never been formally trained in software engineering. While there are some incredibly talented self-taught software engineers, we should not rely on the majority of our software engineers being self-taught. We would never build modern aircraft without aerospace engineers, and we would never build bridges or buildings without civil engineers. So why are we developing large software systems without teams of formally trained and professionally certified software engineers?

Recently, Dr. John Knight, a professor at the University of Virginia, contrasted software engineering with other engineering disciplines [21]. He spoke of how 1,000-year-old cathedrals were built using the best civil engineering technology of the time and how these buildings are still standing. Civil engineering has evolved tremendously over the ages, and now we have enormous skyscrapers and spectacular bridges. This would not be possible without a vibrant civil engineering educational system, research programs, and mentoring. Similar analogies could be drawn from other engineering disciplines. A thousand years from now, people will be marveling at aerospace engineering milestones such as the Wright Flyer, the SR-71 Blackbird, and the Apollo program. All were great engineering projects in their day and are now in museums. Will there be any software cathedrals to marvel at 1,000 years from now? Or will future generations view us as hackers?

Traditional Science and Engineering Educational Programs

Many students who graduate from U.S. science and engineering programs will eventually work in software development. Unfortunately, most of them will get little or no software education. For example, 24 percent of physics graduates will be working on software five to eight years after graduation [22]. Most of them will probably receive no training in software engineering in college. Other science graduates, even outside of engineering, may also eventually work in software development. It would be very beneficial for these students to know more about software engineering before they graduate. They need more than a freshman-level course in programming. This is true of almost all the traditional science and engineering degree programs.

It is also not valid to assume that computer science graduates are software engineers, either. It is fairly easy to graduate from a computer science program with very little education in software development. Knight and Leveson describe the need for more software education in computer science and computer engineering programs and advocate for more software engineering programs [23].

The need for software education is especially critical in aerospace engineering programs. Aerospace engineers have always prided themselves on being the system integrators, but to do this you must have some understanding of the complete aerospace system you are developing. In modern combat aircraft, the electronic components account for roughly 10 percent of the weight and 33 percent of the cost [24]. So if aerospace engineers are not well versed in computing, networking, sensors, and software, then they cannot understand the complete system (unless that system is 60 years old). Students need to be trained so that they can develop the next generation of aerospace systems, not old aircraft and old spacecraft. Aerospace systems have always used the latest technology to achieve amazing performance. Future and current systems rely heavily on computers and software, and students need to know that. Aerospace engineers are needed as system integrators, but this is only possible if they have some understanding of the complete system (including computing and software).

Today this goes beyond the onboard avionics, since modern aircraft and spacecraft are almost always tied to other systems, but avionics is a huge part of aerospace systems. Processing power and computer memory have been increasing exponentially in military aircraft since about 1960 [25]. The F-106 had less than 20 kilobytes of memory, while the Joint Strike Fighter (JSF) could have more than two gigabytes. Avionics could account for 40 percent of the cost of the JSF. The report also states that software content in these systems has increased dramatically, and that we need more software engineers.
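The scale of that growth can be made concrete with a little arithmetic. A rough sketch follows; the memory figures and the 1960 starting point come from the text above, while the JSF date of 2005 is an assumption chosen for illustration:

```python
import math

# Figures from the text: F-106 (circa 1960) vs. Joint Strike Fighter.
# The 2005 date for the JSF is an assumption for illustration.
f106_bytes = 20 * 1024            # "less than 20 kilobytes"
jsf_bytes = 2 * 1024**3           # "more than two gigabytes"
years = 2005 - 1960               # assumed interval

growth = jsf_bytes / f106_bytes   # overall growth factor (~100,000x)
doublings = math.log2(growth)     # number of doublings over the interval
doubling_time = years / doublings # implied doubling period, in years

print(f"growth factor: {growth:,.0f}x")
print(f"memory doubled roughly every {doubling_time:.1f} years")
```

Under those assumptions, the implied doubling period works out to roughly 2.7 years, broadly consistent with the exponential growth the report describes.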

Computing and software are integral parts of aerospace engineering. Computing is now one of the key disciplines in aerospace engineering. Traditionally, aerospace engineering [26] was built upon four technology pillars: aerodynamics, structures, propulsion, and dynamics and control, as shown in Figure 1. These pillars are reflected in aerospace engineering curricula. All these disciplines were important for the Wright brothers and for every aerospace system since then. However, modern aerospace engineering must include five pillars, as shown in Figure 2. In [27], the authors refer to the five areas as the following: aerodynamics, materials, avionics, propulsion, and controls. Current and future aerospace systems are and will be designed using computers. They will have onboard computers and will need to communicate with other vehicles and computers.

Computing (including processing, networking, and storing data) and software are essential elements of aerospace engineering, and they are the fifth pillar. In addition, this fifth pillar might be the most important pillar, and it is far less mature than the other four. Aerospace engineering educational programs have a strong emphasis on applied physics (e.g., fluid dynamics, structural dynamics, dynamics, combustion, and propulsion). Historically, there were good reasons for this, but we cannot continue to neglect the research and educational needs in aerospace computing and software.

While computing and software are crucial for aerospace systems, existing aerospace engineering educational programs usually do not reflect this fact. Most aerospace engineering programs require roughly 40 courses over a four-year period, but students often take only one course related to software (a freshman-level programming course). Also, there is usually no requirement to learn about avionics, embedded systems, networking, sensors, or computer hardware. This trend carries through to aerospace engineering graduate programs as well, where the entrance exams and curricula seldom include computing, software, or avionics. They are often primarily applied physics programs.

Pennsylvania State University has been working to modernize its aerospace engineering curricula [28]. The university now offers senior-level courses in advanced computer programming (object-oriented programming, Java, C++, Ada, etc.) and software engineering (using [6]), both for aerospace engineers. Penn State also has a new course on the Global Positioning System. As of 2006, aerospace engineering students are required to take either the software engineering course or an electronic circuits course. Ideally, they should take both and also be exposed to systems engineering, embedded systems, networking, information systems, sensors, and software. These additional topics could be covered in their technical electives or in graduate courses. Some of them could be covered in a minor as well. The university also hopes to establish an undergraduate minor in information sciences and technology for aerospace engineering in 2008.

It should also be noted that it is difficult to teach engineers all they need to know in four years. In fact, the U.S. National Academy of Engineering [29] recommends that the bachelor of science degree be recognized as a pre-engineering degree. Scientists and engineers need to continue learning throughout their lifetimes to be effective. In addition, many aerospace engineering positions require a master's degree, which allows the student to concentrate on a particular area. An excellent combination would be for a student to get a bachelor of science in aerospace engineering and then a master's degree or doctorate in software (or systems) engineering. These graduates would be extremely valuable. Another possibility is to offer an undergraduate or graduate minor in software engineering. Penn State has a popular graduate minor in computational science, which attracts students from a wide variety of science and engineering departments [30]. A similar program could be created for software engineering or systems engineering.

Dedicated Software Engineering Education Degree Programs

As previously stated, existing science and engineering education programs need to include more computing and software in their curricula, but the United States also needs more dedicated software engineering programs. These software engineering programs, however, need to include

8 CROSSTALK The Journal of Defense Software Engineering January 2008

Figure 1: Old Aerospace Engineering (four pillars: Aerodynamics, Structures, Propulsion, and Dynamics and Control, with Design)

Figure 2: Modern Aerospace Engineering (five pillars: Aerodynamics, Structures and Materials, Propulsion, Dynamics and Control, and Computers and Software, with Design and Systems Engineering)

“While computing and software are crucial for aerospace systems, existing aerospace engineering educational programs usually do not reflect this fact.”

The Critical Need for Software Engineering Education

plenty of science and engineering in their curricula (e.g., physics, mathematics, and embedded systems). They should not have an overemphasis on management, business, and processes. The Association for Computing Machinery (ACM), the IEEE, and the National Science Foundation have developed very good undergraduate curricula in software engineering [3].

Currently in the United States, there are only 10 accredited software engineering undergraduate programs [31], while there are 67 aerospace engineering programs. The United States needs many more software engineering programs. This needs to happen soon, since it takes years to start new programs and for students to graduate. In addition, the United States has an aging workforce. Some companies in the aerospace and defense business could see 40 percent of their workforce retire in the next five years [12]. According to the Wall Street Journal, organizations such as the National Aeronautics and Space Administration have more engineers over 60 than under 30.

In addition to the existing undergraduate software engineering programs, there are 109 software engineering master's degree programs and 40 software engineering doctorate programs in the United States. Few of the undergraduate or graduate programs, however, are at major research universities. In addition, few of them exist at universities included in the top 25 schools listed in the U.S. News and World Report rankings. Most of these programs are at relatively small schools, maybe because they are able to react more quickly to industry and society needs.

Unfortunately, change occurs extremely slowly in academia because there are few incentives to change. Government funding could and should be used to help expedite these changes. Industry leaders need to get involved and demand change as well. There need to be internships and mentoring available. There is also a need for continuing education. At the government labs and in industry, there is a huge need for software engineering training for the existing workforce.

We also need software engineering professional certification. The IEEE has developed an excellent Certified Software Development Professional (CSDP) program [32, 33]. This is a great program, but it is not quite a software engineering certification program. Unfortunately, there is no requirement for a science or engineering background for the certification, so it is not the same as other professional engineering certification programs. In addition, at the time this article was written, there were only 575 people in the world who held the CSDP certification. Beginning in 1999, Texas began certifying software engineers [34]. In addition, the Open Group has established an information technology architect certification program [35].

Conclusion

Higher education in the United States needs to be more responsive to the software engineering needs of its industry and government labs. The United States cannot be complacent about its technological lead in any field, especially software and aerospace, which are two of the most important industries in its economy. These two industries annually provide more than $180 billion and $900 billion, respectively, to the economy. Technology has been changing at an exponential rate, yet curricula too often change extremely slowly. Software engineering needs to be incorporated into existing science and engineering programs, especially aerospace engineering curricula. We also need to create more dedicated software engineering educational programs at all levels: short courses, bachelor's, master's, and doctorate. And finally, there also needs to be a national effort to develop professional certification of software engineers.

References
1. Jorgenson, D.W., and C.W. Wessner, Eds. Measuring and Sustaining the New Economy, Software, Growth, and the Future of the U.S. Economy. Washington, D.C.: National Academies Press, 2006.

2. NIST <www.nist.gov/public_affairs/releases/n02-10.htm>.

3. IEEE Computer Society and the ACM. "Curriculum Guidelines for Undergraduate Degree Programs in Software Engineering." 2004 <http://sites.computer.org/ccse/SE2004Volume.pdf>.

4. Vaughn, R. "Software Engineering Degree Programs." CrossTalk Mar. 2000.

5. Computer Science and Telecommunications Board. Expanding Information Technology Research to Meet Society's Needs. Washington, D.C.: The National Academies Press, 2000.

6. Sommerville, I. Software Engineering. Addison-Wesley, 2006.

7. Glass, Robert L. Software Runaways: Monumental Software Disasters. Prentice-Hall, 1997.

8. Eggen, D., and G. Witte. "The FBI's Upgrade That Wasn't." Washington Post 18 Aug. 2006.

9. U.S. House of Representatives. Proc. of the Aviation Subcommittee Meeting. Washington, D.C.: 2001.

10. Leveson, Nancy G. "Role of Software in Spacecraft Accidents." Journal of Spacecraft and Rockets 41.4 (2004).

11. Winter, D.C. Presentation at the National Workshop on Aviation Software Systems: Design for Certifiably Dependable Systems. Alexandria, VA: Oct. 5-6, 2006.

12. "The Aging Workforce: Turning Boomers Into Boomerangs." The Economist 16 Feb. 2006.

13. Albaugh, J.F. "Embracing Risk: A Vision for Aerospace in the 21st Century." Frank Whittle Lecture, Royal Aeronautical Society, Jan. 19, 2005.

14. Commission on the Future of the U.S. Aerospace Industry. "Final Report of the Commission on the Future of the U.S. Aerospace Industry." Washington, D.C.: National Academies Press, 2002.

15. Hafner, K. "Honey, I Programmed the Blanket." New York Times 27 May 1999.

16. Pehrson, R.J. "Software Development for the Boeing 777." CrossTalk Jan. 1996.

17. Moody, B.L. "F-22 Software Risk Reduction." CrossTalk May 2000.

18. U.S. Air Force. Software Technology Support Center (STSC). "A Gentle Introduction to Software Engineering." Hill Air Force Base, UT: STSC, 1999.

19. Glass, R.L. Facts and Fallacies of Software Engineering. Addison-Wesley, 2006.

20. Ada Home <www.adahome.com>.
21. Knight, J.C. Presentation at the National Workshop on Aviation Software Systems: Design for Certifiably Dependable Systems. Alexandria, VA: Oct. 5-6, 2006.

22. American Institute of Physics. The Statistical Research Center <www.aip.org/statistics>.

23. Knight, J.C., and N.G. Leveson. "Software and Higher Education." Inside Risks column, Communications of the ACM 49.1 (2006).

24. Sanders, P. "Improving Software Engineering Practice." CrossTalk Jan. 1999.

25. Aging Avionics in Military Aircraft. Washington, D.C.: National Academies Press, 1993.

26. Long, L.N. "Computing, Information, and Communication: The Fifth Pillar of Aerospace Engineering." Journal of Aerospace Computing, Information, and Communication 1.1 (2004).

27. "The Future of Aerospace." Washington, D.C.: National Academies Press, 1993.

28. Pennsylvania State University Curriculum Guide 2006-2007 <www.aero.psu.edu/undergrads/UG_Curriculum_Guide_2006-07.pdf>.

29. "Educating the Engineer of 2020: Adapting Engineering Education to the New Century." Washington, D.C.: National Academies Press, 2005.

30. <www.csci.psu.edu/minor.html>.
31. ABET <www.abet.org/>.
32. IEEE Computer Society. IEEE Computer Society Certified Software Development Professional Program Web Center <www.computer.org/portal/pages/ieeecs/education/certification/>.

33. IEEE Computer Society. Guide to the Software Engineering Body of Knowledge. IEEE Computer Society, 2004.

34. Voas, J. "The Software Quality Certification Triangle." CrossTalk Nov. 1998.

35. The Open Group. “IT ArchitectCertification Program” <www.opengroup.org/itac>.


About the Author

Lyle N. Long, D.Sc., is a distinguished professor of aerospace engineering, bioengineering, and mathematics at the Pennsylvania State University. He is also director of the graduate minor in computational science. Long has a doctorate of science from George Washington University, a master's degree from Stanford University, and a bachelor's degree in mechanical engineering from the University of Minnesota. He is an IEEE Computer Society Certified Software Development Professional. Long received the 1993 IEEE Computer Society Gordon Bell Prize. He is a Fellow of the American Institute of Aeronautics and Astronautics, is a senior member of the IEEE, and has written more than 130 technical papers. In 2007-2008, he was a Moore Distinguished Scholar at the California Institute of Technology.

Pennsylvania State University
233 Hammond BLDG
University Park, PA 16802
Phone: (814) 865-1172
Fax: (814) 865-7092
E-mail: [email protected]

COMING EVENTS

February 4-8
28th Annual Texas Computer Education Association's Convention and Exposition
Austin, TX
www.tcea2008.org

February 13-15
2008 Conference for Industry and Education Collaboration
New Orleans, LA
www.asee.org/conferences/ciec/2008/index.cfm

February 18-22
2008 Working IEEE/IFIP Conference on Software Architecture
Vancouver, BC, Canada
www.wicsa.net

February 25-28
24th Annual National Test and Evaluation Conference
Palm Springs, CA
www.ndia.org/Template.cfm?Section=8910&Template=/ContentManagement/ContentDisplay.cfm&ContentID=18912

February 27-28
AFCEA 2008 Homeland Security Conference
Washington, D.C.
www.afcea.org/events/homeland/landing.asp

April 29-May 2
2008 Systems and Software Technology Conference
Las Vegas, NV
www.sstc-online.org

COMING EVENTS: Please submit coming events that are of interest to our readers at least 90 days before registration. E-mail announcements to: [email protected].

Dear CrossTalk Editor,
Once again, the cover for CrossTalk is a knockout! Congratulations to the staff and to the editor for a great edition. I was wondering when systems engineering would be a transitional topic about systems engineering and software inclusion.

When I stepped down from the principal investigator position on the B-1B program, I went into software because of the lack of understanding I found between systems engineering and software. Due to my relationship with the company's system engineering manager and the software manager (and the help of CrossTalk articles that I sent them) they finally decided to have meetings in their manager's office and talked together! What a milestone that was! I still send CrossTalk articles to these guys, which they are grateful for because it helps keep them current on industry thinking.

This edition should be given to every systems engineering manager and staff, as well as software managers. Let's get the communication going among our contractors!

I could not have been more pleased with Dr. John W. Fischer's introduction in the Sponsor's Note to this month's issue of CrossTalk. His remarks are dead-on regarding the issues and evolution of system engineering with software. Gee, can you imagine what the foundation for requirements would start to look like?

Thanks for another great issue!

– Melonee Moses
Software/Logistics Management Specialist
DCMA Boeing
Long Beach, CA

<[email protected]>

LETTER TO THE EDITOR

It can be argued that requirements engineering (RE) is one of the most important stages in a traditional software development life cycle because it helps to correctly determine the desired purpose of a system. The goal of an RE process is to identify and document stakeholders and their needs in a form amenable to analysis, communication, and implementation [1]. Correct, complete, and unambiguous requirements not only ensure that developers build the right system, but also reduce the effort and cost that would otherwise be needed to fix requirements problems later. Because of the importance of obtaining correct requirements, RE courses are fundamental elements of any software engineering curriculum. An RE course should teach students techniques for accurately eliciting, analyzing, and validating requirements. By acquiring these skills, future software engineers are better equipped to produce quality requirements, eliminate faults, and reduce development time. Educating students in RE, however, is not an easy task because it is a human-centered process that requires skills from a variety of disciplines (e.g., computer science, systems engineering, cognitive psychology, anthropology, and sociology) [1]. Moreover, RE educators must bridge the gap between academia and industry by exposing students to the real world as much as possible. One important real-world aspect of RE that has remained largely unstudied in the software engineering education community is requirements validation.

This article describes an exercise to aid in the teaching of requirements validation in a graduate-level RE course, along with an evaluation of its usefulness. In the exercise, the students validated a real software requirements specification document provided by an industrial partner using a meeting-based N-fold inspection method (described later) [2].

Background
Previous research has identified challenges faced during RE education. Because one goal of software engineering education is to help students develop the knowledge and skills necessary to be successful in industry, a challenge for RE educators is that the inherent complexity of industrial RE (i.e., the broad scope, multiple concerns, and deficient specifications) is difficult to replicate in a classroom environment [3, 4]. To be successful, a requirements engineer must possess both the technical skills needed to interact with the system and the social skills needed to interact with its human stakeholders [1]. These soft skills (e.g., communication and teamwork) are also difficult to teach in the classroom [5].

Most published work on RE education has focused on a small set of RE topics: elicitation, analysis, and the overall process. Validation is a complex task that is often not adequately covered in a traditional university education [6]. A requirements document can have many different types of deficiencies that may have disastrous effects on subsequent development stages and yield undesirable consequences [4, 7]. The requirements validation process checks for different types of problems (e.g., omissions, inconsistencies, and ambiguities) to help ensure that proper quality standards are fulfilled. Due to time limits and other considerations, validation is usually conducted informally, either on an ad hoc basis or simply as a peer review [8]. As a result, little has been reported concerning the educational challenges of validation topics like inspections. These challenges highlight the importance of improving education and the critical, technical, and social skills that a requirements engineer must possess. However, the topics covered in many software engineering courses do not meet these needs, posing a major challenge for developing RE skills [6, 9]. Furthermore, experience reports about RE education are rare in the software engineering and RE literature. To address the lack of focus on requirements validation, we developed a requirements validation exercise that we found to be successful and instructive. This exercise is a follow-on experiment from that reported at the Fourteenth Annual Systems and Software Technology Conference, held in Salt Lake City in April 2002, titled "Third Party Walkthrough Inspections: A Joint Navy/University Empirical Software Engineering Project." The most recent results of the experiment were also presented by invitation at the Canadian Air Force Software Engineering Symposium held at the Royal Military College of Canada in Kingston, Ontario, May 2007 [10, 11].

Description of the Requirements Validation Exercise
The exercise was conducted during a graduate-level RE course at Mississippi State University. The class included graduate students who were currently working for the Department of Defense (DoD), some who had previous government software development experience, and many who had no practical experience. The main goal of this course was to provide students with a comprehensive overview of requirements elicitation, analysis, specification, validation, and management. These activities were introduced in the context of systems engineering and various software development life-cycle models. For each activity, the students were exposed to specific methods, tools, and notations. The semi-weekly, 75-minute class sessions contained a mixture of lecture material, class discussions centered on outside readings, and student presentations. Near the end of the semester, the students participated in a two-week requirements validation exercise, the primary subject of

Using Inspections to Teach Requirements Validation

Requirements validation is often not adequately covered by a traditional software engineering curriculum in universities. This article describes an experiment conducted in a graduate-level requirements engineering course to provide students a real-world experience in requirements validation. The experiment made use of the N-fold inspection method, in which multiple teams of students inspect the same requirements document and then meet together to discuss their findings. This procedure allows the students not only to practice their reviewing skills, but also to strengthen their communication and collaboration skills. At the conclusion of the exercise, the students were given the opportunity to provide qualitative and quantitative feedback. The results of this study suggest that the techniques employed by this class and the resulting defect detection could be useful in general during the requirements validation process.

Lulu He, Dr. Jeffrey C. Carver, and Dr. Rayford B. Vaughn
Mississippi State University



this article. While the exercise reported here is based on a small number of students, we believe the results achieved suggest that such techniques could be considered in larger requirements validation exercises. The authors are amenable to cooperating with others to expand this research through additional empirical investigation, particularly in DoD software engineering endeavors.

Overview and Goals of the Requirements Validation Exercise
The overall goal of the exercise was to provide students with hands-on practice in requirements validation. The specific goals of the requirements validation exercise were the following:
1. To help students understand the course materials better by experiencing the format and presentation of a real requirements document, experiencing the impact of domain-specific language in understanding and reviewing a requirements document, and exposing the students to flaws commonly found in a requirements document.
2. To help students obtain an appreciation for the complexity and necessity of RE, especially requirements validation.
3. To give students hands-on experience validating a requirements document with specific inspection techniques.
4. To provide an opportunity for students to practice soft skills such as communication and teamwork.

Two important goals of the course were addressed through this exercise. First, to give the students an idea of the size, complexity, language, and flaws that can occur in software artifacts, they validated a real requirements document, which was actually used by a contractor to develop and implement a system. The requirements document described an upgrade to a case-tracking system for the U.S. National Labor Relations Board. The 43-page document was written in natural language (English) and contained the standard content that would be expected in a government requirements document.

Second, to expose students to a real-world requirements validation method, the N-fold inspection method was used. In this method, the same artifact is inspected by multiple teams in parallel with the goal of improving the overall review effectiveness [12]. From an educational point of view, the N-fold inspection method provides students with an opportunity to discuss the defects found with other students, thereby understanding how others had viewed the artifact. Furthermore, N-fold inspection meetings expose students to the importance of communication and teamwork, both important soft skills.

Requirements Validation Techniques
Studies have shown that inspections are an effective requirements validation technique because they greatly improve system quality by detecting many defects early in the software life cycle [13]. Within the N-fold method, individual reviewers can use different approaches to review the document. Our students used either a checklist approach or the Perspective-Based Reading (PBR) approach (each is described in more detail below).

In practice, most industrial inspections use an ad hoc or checklist-based approach for defect detection [14]. A checklist provides the inspector with concrete guidance that is not provided by an ad hoc approach. A checklist is a list of items that focuses an inspector's attention on specific topics, such as common defects or organizational rules, while reviewing a software document [15]. For this requirements validation exercise, an informal checklist was developed that focused on important quality concepts relevant to a requirements document. Checklist examples are readily available on the Web, for example, <software.gsfc.nasa.gov/AssetsApproved/PA2.2.1.5.doc>, <www.processimpact.com/process_assets/requirements_review_checklist.doc>, or <www.swqual.com/training/Require.pdf>. For the class exercise, we used a checklist based on the Volere Requirements Specification Template, which can be found at <www.systemsguild.com/GuildSite/Robs/Template.html>.
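To illustrate how a checklist focuses a reviewer's attention, the sketch below applies a few generic quality questions to a requirement. The questions, heuristics, and sample requirement are all invented for illustration; a real checklist, such as the Volere-based one used in this exercise, is far more detailed and is applied by human judgment rather than string matching:

```python
# Illustrative only: a tiny checklist-based review. The checklist items
# and the sample requirement below are invented, and the lambda-based
# "checks" are crude textual stand-ins for a human reviewer's judgment.
checklist = [
    ("unambiguous", lambda r: "as appropriate" not in r and "etc." not in r),
    ("testable",    lambda r: any(w in r for w in ("shall", "must"))),
    ("atomic",      lambda r: " and " not in r),  # crude single-concern check
]

def inspect(requirement: str) -> list[str]:
    """Return the names of the checklist items the requirement fails."""
    return [name for name, check in checklist if not check(requirement)]

req = "The system shall log events and notify users as appropriate."
print(inspect(req))  # ['unambiguous', 'atomic']
```

The value of a checklist, as the paragraph above notes, is that it turns a vague instruction ("find defects") into a concrete list of questions asked of every requirement.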

PBR is a systematic inspection technique that supports defect detection in software requirements through a role-playing exercise [13]. In PBR, each reviewer verifies the correctness of the requirements from the perspective of a specific stakeholder. The most common perspectives are the user, the designer, and the tester. PBR techniques provide reviewers with a set of steps to follow to build a high-level system abstraction and questions to help identify problems. For example, a reviewer assuming the user perspective may create use cases, while a reviewer assuming the tester perspective would create test plans. Studies have shown that PBR is a more effective, systematic, focused, goal-oriented, customizable, and transferable process than a standard checklist or ad hoc approach [16]. By reviewing the requirements document from the perspectives of different stakeholders (i.e., playing the role of each stakeholder), the reviewers are expected not only to detect more defects, but also to better comprehend the complexity of RE.

Details of the Requirements Validation Exercise and Summary of Research Results
Students enrolled in the course were divided into four three-person teams with the expertise level balanced across teams. The members of two teams used a checklist to inspect the requirements document, while members of the other two teams used the PBR technique. For the PBR teams, each student was instructed to use one of the three perspectives (user, designer, or tester) to guide their inspection. These two inspection techniques were chosen because research suggested that while PBR is often a more effective technique, checklists are more widely used in government and industry [14]. A second motivator was that no previous work had compared the effectiveness of a checklist-based approach to the effectiveness of PBR in the context of the N-fold inspection method. More importantly, instead of using a generic checklist, the checklist in this study was specifically developed for the type of requirements document to be inspected (e.g., from a government organization).

Before the exercise began, the students received one class meeting (75 minutes) of training in their assigned technique (checklist or PBR). The training for the checklist teams was done through a discussion with the course professor about attributes of requirements quality. During this discussion, the checklist students, heavily guided by the professor, developed a checklist to guide their inspection. The training for PBR was done by an expert in PBR and included a discussion of the theory behind the technique and a case study that illustrated an example of its use. After the training, the


“Studies have shown that inspections are an effective requirements validation technique ... by detecting many defects early in the software life cycle.”


PBR reviewers were given the detailed protocol for their assigned perspective. Then each student performed an individual inspection of the requirements document using their assigned technique. During this inspection, each student individually reviewed the document and recorded as many defects as he or she could find. The students were given two days to perform this task outside of class. Once all three members of a team completed the individual inspection step, they met together in the 1st Team Meeting. During this meeting, the students discussed the defects they found and agreed on a final team defect list. After all four teams had conducted the 1st Team Meeting, the two checklist teams (six students) met together and the two PBR teams (six students) met together for the 2nd Team Meeting. In these six-person meetings, the reviewers examined the two defect lists produced during the 1st Team Meeting and agreed on a final list of defects.

The data analysis indicated that PBR was more effective, both for individual inspectors and for the teams. Conversely, the data suggested that the checklist teams had more effective team meetings during the N-fold inspection process than the PBR teams. Here, the effectiveness of team meetings is defined by two measures: meeting gains, i.e., the number of defects identified during the meeting discussions that no individual inspector had found prior to the meeting, and meeting losses, i.e., defects found by an individual inspector that the inspection team determines are false positives and, hence, not recorded on the final team defect list. In the case of the PBR teams, the team meeting served little purpose because there were few meeting gains or meeting losses. The end result would have been similar had the individual lists simply been combined without spending time in a formal team meeting. Conversely, for the checklist teams, the meeting served a vital role in the process. During the team meetings, not only were there meeting gains, but a large percentage of false positives were also identified and removed as meeting losses. Identification of false positives saves time during the rework phase. One likely cause of this result is the different perspectives from which the PBR reviewers approached the Software Requirements Specification (SRS). Each PBR reviewer focused on his or her own perspective and was less concerned with the perspectives of others. For the checklist teams, the reviewers inspected the SRS using the same checklist, and there was more interaction among team members during the meetings.

The most important, and novel, conclusion drawn from these results is that the effectiveness and necessity of a team meeting depends greatly on the technique used during the individual preparation phase of the inspection. Additional data on this experiment can be obtained by contacting the second or third author of this article.
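The two meeting measures reduce to simple set operations on the reviewers' defect lists. The sketch below is illustrative only; the defect IDs and function names are hypothetical, not part of the study:

```python
def meeting_gains(individual_lists, team_list):
    """Defects on the final team list that no individual reviewer had found."""
    found_in_preparation = set().union(*individual_lists)
    return set(team_list) - found_in_preparation

def meeting_losses(individual_lists, team_list):
    """Defects an individual found that the team rejected as false positives."""
    found_in_preparation = set().union(*individual_lists)
    return found_in_preparation - set(team_list)

# Hypothetical example: three reviewers' lists and the agreed team list.
reviewers = [{"D1", "D2"}, {"D2", "D3"}, {"D3", "D4"}]
team = {"D1", "D2", "D3", "D5"}  # D4 rejected; D5 surfaced during discussion

print(meeting_gains(reviewers, team))   # {'D5'}
print(meeting_losses(reviewers, team))  # {'D4'}
```

A PBR-like meeting would show both result sets nearly empty; a checklist-like meeting would show a non-trivial losses set, as the study reports.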

Evaluation of the Educational Value of the Exercise
To evaluate the effectiveness of this exercise relative to the four educational goals listed earlier, quantitative and qualitative data were collected using a post-study survey. Goals 1-3 were specifically evaluated by the survey questions in Table 1 (see page 14). Goal 4 was addressed by using team meetings, but it was not specifically evaluated on the post-study questionnaire. The survey focused on the students’ opinions of the exercise and gave them an opportunity to provide feedback on how to improve the exercise. The first two questions were answered using a predetermined scale (explained with each question). The other questions were answered using free text.

Results
This section discusses the results from the post-exercise questionnaire, along with a brief explanation. The questionnaire collected both qualitative and quantitative data from the students.

Quantitative Data
Q1. Was the exercise a positive or negative experience?
The students answered this question on a scale of 1 (negative) to 5 (positive). Figure 1 shows that 91 percent of the students found the exercise to be a positive experience, with no students having a negative experience. In Figure 1, the ratings given by students using PBR and checklist are shown in different shades to evaluate any impact the technique had. While the opinion of the PBR reviewers is slightly more positive, there is little difference due to the technique used. Therefore, regardless of which technique was used, the students found the N-fold inspection exercise to be a positive experience.

Q2. Did the exercise help you better understand the course materials?
The students responded to this question on a scale of 1 (not at all) to 5 (very much). Figure 2 shows that all of the students found that the exercise was very helpful or a lot helpful.

Qualitative Data
The qualitative feedback, in which the students were able to express their own opinions, provides additional insight into the usefulness and effectiveness of the exercise. In this analysis, the specific answers given for each question are listed along with the number of students who gave that answer, shown in parentheses, where 3 checklist/2 PBR means that three students who used the checklist and two students who used PBR gave the answer. In some cases, a response was given by only PBR students or only checklist students, e.g., 3 PBR in bullet three of Q3. This does not imply that the answer is not applicable to the other technique – it only means that no students using the other technique provided that answer. In some cases, students provided more than one answer to a question, so the total number of responses may be greater than 12. Q3 and Q4 from the post-study survey were geared towards addressing Educational Goals 2 and 3. Q5

January 2008 www.stsc.hill.af.mil 13

[Figure 1: Positive or Negative Experience Results. Bar chart: count of student ratings from Negative to Positive, broken out by inspection technique (Checklist vs. PBR).]

Table 1: Survey Questions

Q1. Was the exercise a positive or negative experience?
Q2. Did the exercise help you better understand the course materials?
Q3. Did you see any benefits from the exercise? If so, what were they?
Q4. Did you learn anything by performing the exercise? If so, what?
Q5. Did you see any drawbacks to the exercise? If so, what were they?
Q6. How could the exercise have been improved?

[Figure 2: Understanding the Course Materials Results. Bar chart: count of student ratings from Not At All to Very Much, broken out by inspection technique (Checklist vs. PBR).]

Training and Education


and Q6 provide feedback on how to improve the exercise.

Q3. Did you see any benefits from the exercise? If so, what were they?
All 12 students indicated that they found some benefit from participation in this exercise:
1. It provided hands-on/practical experience (3 checklist/2 PBR).
2. It helped them better understand a real requirements document (3 checklist/3 PBR).
3. It helped them better understand defect detection in a requirements document (3 PBR).
Because the students obtained hands-on experience and indicated that they understood a real requirements document, this exercise helped address Educational Goals 2 and 3.

Q4. Did you learn anything by performing the exercise? If so, what?
From this exercise, the students learned the following:
1. The usefulness of inspections (2 checklist/1 PBR).
2. The difficulties involved in creating and using a real requirements document (3 checklist/1 PBR).
3. The benefits of using a method to focus a requirements inspection on certain aspects of the requirements document (2 checklist/5 PBR).
Similar to Q3, these responses indicate that the students learned the benefits of inspections and the difficulties in creating real requirements documents, helping address Educational Goals 2 and 3.

Q5. Did you see any drawbacks to the exercise? If so, what were they?
Only five (4 checklist/1 PBR) of the 12 students reported any drawbacks of the exercise:
1. Not enough training (3 checklist/1 PBR).
2. Not enough time (2 checklist).
On a positive note, these drawbacks all relate to the logistics of the exercise and not to its intrinsic value. None of these drawbacks concern the inspection procedure that was used during the exercise.

Q6. How could the exercise have been improved?
1. Expand the exercise (1 checklist/1 PBR).
2. Provide more domain knowledge (2 checklist).
3. Allow more time for various activities (2 checklist/2 PBR).
4. Provide more training (1 checklist/4 PBR).
These suggestions generally relate to the drawbacks cited in Q5. It was interesting that two students asked for a more extensive exercise that allowed them to obtain more practice and experience using the inspection techniques. This request suggests that the students believed that even more benefit would be obtained by expanding the scope of the exercise.

Summary and Conclusion
This article describes the use of a requirements validation exercise in a graduate-level RE course. In the exercise, the students validated a software requirements document using a meeting-based N-fold inspection. The students provided their opinions and feedback about the usefulness of the exercise in a post-exercise survey. These results showed that the exercise achieved its goals.

Overall, students were highly satisfied with the exercise content and found it to be helpful. The exercise helped students understand what a software requirements document looks like and gain insight into the difficulty and complexity involved in its correct development. They not only realized the importance of requirements validation but also gained hands-on experience using inspection techniques. The exercise provided the students with essential knowledge about requirements quality, enabling them to better understand other course material such as elicitation, analysis, and specification. Moreover, though not empirically evaluated, the team meetings in this exercise gave the students an opportunity to practice their communication and teamwork skills, which are essential for their future careers.

Based on the feedback provided by the students, the following modifications are being considered for future similar class exercises:
1. Provide more training on the inspection techniques by using case studies.
2. Give specific instructions on the structure and organization of the team meeting.
3. Extend the length of the exercise to allow additional time for training and the individual inspection.
4. To make the exercise more realistic, invite stakeholders of the software requirements document to class to help students obtain more domain knowledge.
5. When the N-fold inspection is finished, give the defect lists to the owner of the requirements document to understand the disposition of those defects.
This study was performed in a graduate-level university course; therefore, the next step is to perform additional validation in an industrial setting. Furthermore, the positive experience with the N-fold inspection process during RE suggests that it may have applicability in other phases of the software life cycle. In the future, we will seek out opportunities to replicate these experiences in other aspects of the software engineering curriculum.

References
1. Nuseibeh, B., and S. Easterbrook. “Requirements Engineering: A Road Map.” Proc. of International Conference on Software Engineering, Limerick, Ireland, June 2000: Association of Computing Machinery (ACM) Press, 2000: 37-46.
2. He, L., and J.C. Carver. “PBR vs. Checklist: A Replication in the N-Fold Inspection Context.” Proc. of 5th ACM/Institute of Electrical and Electronics Engineers (IEEE) International Symposium on Empirical Software Engineering (ISESE 2006), Rio de Janeiro, Brazil, Sept. 21-22, 2006.
3. Armarego, J., and S. Clarke. “Preparing Students for the Future: Learning Creative Software Development – Setting the Stage.” Proc. of the Annual International Higher Education Research and Development Society of Australasia Conference, Christchurch, New Zealand, July 6-9, 2003.

4. Van Lamsweerde, A. “Requirements Engineering in the Year 00: A Research Perspective.” Proc. of 22nd International Conference on Software Engineering, Limerick, Ireland: IEEE Computer Society Press, 2000.

5. Conn, R. “Developing Software Engineers at the C-130J Software Factory.” IEEE Software 19.5 (2002): 25-29.
6. Bubenko, J.A. “Challenges in Requirements Engineering.” Proc. of 2nd IEEE International Symposium on Requirements Engineering, Los Alamitos, CA: IEEE Computer Society, 1995: 160-162.
7. Meyer, B. “On Formalism in Specifications.” IEEE Software 2.1 (1985): 6-26.
8. Rosca, D. “An Active/Collaborative Approach in Teaching Requirements Engineering.” Proc. of 30th Annual Frontiers in Education Conference, Kansas City, MO, Oct. 2000: 9-12.
9. Lethbridge, T.C. “What Knowledge Is Important to a Software Professional?” Computer 33.5 (2000): 44-50.
10. Vaughn, R.B., and J. Lever. “Third Party Walkthrough Inspections: A Joint Navy/University Empirical Software Engineering Project.” Proc. of Fourteenth Annual Systems and Software Technology Conference, Salt Lake City, UT, Apr. 29-May 2, 2002.
11. Vaughn, R.B., and J.C. Carver. “Experiences in N-Fold Structured Walkthroughs of Requirements Documents.” Proc. of Canadian Air Force Software Engineering Symposium, Royal Military College of Canada, Kingston, Ontario, May 24-25, 2007.
12. Martin, J., and W. Tsai. “N-Fold Inspection: A Requirements Analysis Technique.” Communications of the ACM 33.2 (1990): 223-232.
13. Basili, V.R., et al. “The Empirical Investigation of Perspective-Based Reading.” Empirical Software Engineering: An International Journal 1.2 (1996): 133-164.
14. Laitenberger, O., K.E. Emam, and T.G. Harbich. “An Internally Replicated Quasi-Experimental Comparison of Checklist and Perspective-Based Reading of Code Documents.” IEEE Transactions on Software Engineering 27.5 (2001): 387-421.
15. Fagan, M. “Design and Code Inspections to Reduce Errors in Program Development.” IBM Systems Journal 15.3 (1976): 182-211.
16. Shull, F., I. Rus, and V.R. Basili. “How Perspective-Based Reading Can Improve Requirements Inspections.” Computer 33.7 (2000): 73-79.

About the Authors

Jeffrey Carver, Ph.D., is an Assistant Professor in the Computer Science and Engineering Department at Mississippi State University. He received his doctorate degree from the University of Maryland in 2003. His research interests include software process improvement, software quality, software inspections, and software engineering for scientific and engineering computing. He has more than 30 refereed publications in these areas. His research has been funded by the U.S. Army Corps of Engineers, the U.S. Air Force, and the National Science Foundation.

Computer Science and Engineering
PO Box 9637
Mississippi State University
Mississippi State, MS 39762
Phone: (662) 325-0004
Fax: (662) 325-8997
E-mail: [email protected]

Lulu He is a doctoral student in the Computer Science and Engineering Department at Mississippi State University. She received her bachelor’s and master’s degrees in computer science from Wuhan University, China, in 2001 and 2004, respectively. She also received a master’s degree in computer science from Mississippi State University in August 2007. Her research interests include software quality, software inspections, software architecture, and software engineering for scientific and engineering computing.

Computer Science and Engineering
PO Box 9637
Mississippi State University
Mississippi State, MS 39762
Phone: (662) 325-8798
E-mail: [email protected]

Rayford B. Vaughn, Ph.D., received his doctorate from Kansas State University in 1988. He is a William L. Giles Distinguished Professor and the Billie J. Ball Professor of Computer Science and Engineering at Mississippi State University, and teaches and conducts research in the areas of software engineering and information security. Prior to joining the university, he completed a 26-year career in the U.S. Army, retiring as a Colonel, and three years as a Vice President of Defense Information Systems Agency Integration Services and EDS Government Systems. Vaughn has more than 100 publications to his credit and is an active contributor to software engineering and information security conferences and journals. In 2004, he was named a Mississippi State University Eminent Scholar. Vaughn is the current Director of the Mississippi State University Center for Critical Infrastructure Protection and the Center for Computer Security Research.

Computer Science and Engineering
PO Box 9637
Mississippi State University
Mississippi State, MS 39762
Phone: (662) 325-7450
Fax: (662) 325-8997
E-mail: [email protected]

Defects are not an option in today’s world. Much of our national well-being depends on software. So the one thing that America’s citizens should be able to expect is that that software will be free of bugs. Sadly, that is not the case. Instead, commonly used software engineering practices permit dangerous defects that let attackers compromise millions of computers every year. That happens because commercial software engineering lacks the rigorous controls needed to ensure defect-free products at acceptable cost [1].

Most defects arise from program or design flaws, and they do not have to be actively exploited to be considered a threat [2, 3]. In fiscal terms, the exploitation of such defects costs the American economy an average of $60 billion a year [4]. Worse, it is estimated that in the future, the nation may face even more challenging problems as adversaries – both foreign and domestic – become increasingly sophisticated in their ability to insert malicious code into critical software systems [3].

Given that situation, the most important concern of all might be that the exploitation of a software flaw in a basic infrastructure component such as power or communication could lead to a significant national disaster [5]. The Critical Infrastructure Taskforce sums up the likelihood of just such an event in the following:

The nation’s economy is increasingly dependent on cyberspace. This has introduced unknown interdependencies and single points of failure. A digital disaster strikes some enterprise every day, [and] infrastructure disruptions have cascading impacts, multiplying their cyber and physical effects. [5]

Predictions such as this are what motivated the National Strategy to Secure Cyberspace, which mandates the Department of Homeland Security (DHS) to do the following:

... promulgate best practices and methodologies that promote integrity, security, and reliability in software code development, including processes and procedures that diminish the possibilities of erroneous code, malicious code, or trap doors that could be introduced during development. [5]

Given the scope of that directive, one obvious solution is to ensure that secure software practices are embedded in workforce education, training, and development programs nationwide. The problem is that there is currently no authoritative point of reference to define what should be taught [3]. For that reason, in 2005 DHS created a working group to define a CBK for Secure Software Assurance <https://buildsecurityin.us-cert.gov/daisy/bsi/resources/dhs/95.html>. The goal of the CBK is to itemize all of the activities that might be involved in producing secure code. DHS does not intend the CBK to be used as a general standard, directive, or policy [3]. Instead, its sole purpose is to catalog secure practices that might be appropriate to currently existing academic disciplines. Thus, the CBK is an inventory of potential knowledge areas within each contributing discipline.

The CBK assumes the following:

... software assurance is not a separate profession. What is not clear, however, is the precise relationship between the elements of the CBK and the curricula of each potentially relevant field. [6]

So, the challenge is to correctly integrate secure software assurance practices into each contributing discipline [3, 6].

Several disciplines could conceivably benefit from the CBK, such as software engineering, systems engineering, information systems security engineering, safety, security, testing, information assurance, and project management [3]. Consequently, in order to ensure that the right content is taught in each, it is necessary to understand the proper relationship between the CBK and the curricula of each relevant discipline [6].

Finding Where the CBK Fits Into Current Curricula
The overall goal of the CBK is to ensure adequate coverage of requisite knowledge areas in each contributing discipline [3]. Accordingly, the working group sought to understand the exact relationship of CBK elements to each traditional curriculum. Once that relationship was better understood, it was felt that it should be possible to recommend the right way to incorporate CBK content into each of the established disciplines.

That comparison was materially aided by the fact that the sponsoring societies of the three most influential academic studies had just finished their own survey of curricular models for computing curricula. This was reported in “Computing Curricula 2005: The Overview Report,” commonly called CC2005 [7]. CC2005 merges the recommendations for the content and focus of computer engineering, computer science, information systems, information technology, and software engineering curricula into a single authoritative summary, which is fully endorsed by the Association for Computing Machinery (ACM), the Institute of Electrical and Electronics Engineers (IEEE) Computer Society, and the Association for Information Systems.

CC2005 specifies 40 topic areas. These 40 topics represent the entire range of subject matter for all five major computing disciplines. The report specifically states the following:

Each one of the five discipline-specific curricula represents the best judgment of the relevant professional, scientific, and educational associations and serves as a definition of what these degree programs should be and do. [7]

In addition to the 40 topic areas, which in effect capture all of the knowledge requirements for computing curricula

Integrating Software Assurance Knowledge Into Conventional Curricula

One of our challenges is deciding how best to address software assurance in university curricula. One approach is to incorporate software assurance knowledge areas into conventional computing curricula. In this article, we discuss the results of a comparison of the Common Body of Knowledge (CBK) for Secure Software Assurance with traditional computing disciplines. The comparison indicates that software engineering is probably the best fit for such knowledge areas, although there is overlap with other computing curricula as well.

Jeffrey A. Ingalsbe, Ford Motor Co.
Dr. Nancy R. Mead, Software Engineering Institute
Dr. Dan Shoemaker, University of Detroit Mercy



along with a ranking of their relative emphasis in each specific discipline, CC2005 also summarizes the expectations for the student after graduation [7]. This summary identifies 60 competencies that should be expected for each graduate. By referencing those identified competency outcomes, it is relatively easy to see the relationship between CBK knowledge elements and the CC2005 curricular requirements. It is also easier to see the places where there is a misalignment between the CBK and each discipline’s curricular goals.

The CBK was mapped to the CC2005 recommendations for only three of the five disciplines. The two disciplines at opposite ends of the CC2005 continuum, computer engineering and information technology, were omitted because the former overlaps too much with electrical engineering and the latter overlaps too much with business.

Because these three curricula (computer science, information systems, and software engineering) have differing focuses, the content of CC2005 was examined one discipline at a time. First, a topic-by-topic analysis of depth of coverage was done. Depth of coverage was defined as the quantity of material in the CBK that provides specific advice about how to execute a given activity in CC2005 in a more secure fashion.

To obtain a metric, the assessment of quantity of material was based on a count of the textual references in the CBK that could be associated with each of the 40 topics. The assumption was that the more references to a topic in the CBK, the greater the importance of integrating secure software assurance content into the teaching of that topic (see Table 1 for the degree to which CC2005 knowledge areas are reflected in the CBK).

What Does This Mean?
The following eight CC2005 topic areas had a significant degree of coverage in the CBK (greater than 100 references): 1) requirements, 2) architecture, 3) design, 4) verification and validation (V&V), 5) evolution (e.g., maintenance), 6) processes, 7) quality, and 8) information systems project management. The following three CC2005 topic areas had moderate coverage in the CBK (fewer than 100 but more than 10 references): 1) legal/professional/ethics/society, 2) risk management, and 3) theory of programming languages.

This mapping shows that the main focus of the CBK is on generic software work rather than on the specific curricular aspects that characterize the study itself, such as algorithms (for computer science) or information systems management (for information systems). That indicates that CBK content would be best integrated into the places where the practical elements of the life cycle are introduced, such as a software design project course.

Fit Between the CBK and Desired Outcomes for the Profession
There were six priorities in CC2005. These range from highest possible expectations through highest expectations to moderate expectations, low expectations, little expectations, and no expectations. One of the more interesting aspects of CC2005 is the 60 expected competencies. Because there is a difference in focus for each discipline, there is a difference in what should be expected for each of them. For instance, there is a different set of presumed competencies for a computer scientist than for a software engineer.

The 60 expected competencies were taken directly from the 40 learning topics. Each competency was examined to determine which of the 40 topics could be assigned to it. For instance, if the competency was to design a user-friendly interface, there are 255 references in the CBK to design and five references in the CBK to human/computer interfaces, so the number of CBK references for this outcome was assigned as 260 (Tables 2-4, pages 18-19).
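In other words, a competency's score is just the sum of the CBK reference counts of the topics mapped to it. A minimal sketch of that bookkeeping (the dictionary layout and function name are ours; the counts come from Table 1):

```python
# CBK textual-reference counts for a few CC2005 topics (from Table 1).
TOPIC_CITES = {
    "Design": 255,
    "HCI": 5,
    "Analysis": 274,
    "Architecture": 150,
    "Evolution": 438,
}

def competency_references(topics):
    """Total CBK references assigned to a competency's mapped topics."""
    return sum(TOPIC_CITES[topic] for topic in topics)

# "Design a user-friendly interface" maps to design and HCI: 255 + 5 = 260.
print(competency_references(["Design", "HCI"]))  # 260
```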

For computer science there is a weak match between the CBK and the highest possible competency expectations in that only one of the eight outcomes (12.5 percent) had any degree of coverage. There is a slightly better match for the high expectations category, three of 10 (30 percent). However, there is an excellent match with moderate expectations, nine of 12 (75 percent).

For information systems there is a reasonable match between the CBK and the highest possible competency expectations in that nine of the 22 competencies (40.9 percent) specified for that discipline are covered. There is no match for the high expectations category. However, there is a good match with moderate expectations, five of nine (55.5 percent).

For software engineering there is a strong match between the CBK and the highest possible set of competency expectations in that four of the seven outcomes (57.1 percent) are covered. There is a reasonable match for the high expectations category in that three of 12 competencies are covered (25 percent). There is also a good match with moderate expectations in that five of nine areas are covered (55.5 percent).

Integrating the CBK Into the World of Practical Education
One of the main inferences that can be drawn from this comparison is that the current CBK is less focused on theory than it is on application of the knowledge in practice. In essence, the results demonstrate that the CBK is built around and encapsulates knowledge about practical processes that are universally applicable to securing software, rather than discipline-specific concepts, theories, or activities.

This is best illustrated by the matches themselves. Outcomes such as solve programming

Table 1: Degree to Which CC2005 Knowledge Areas Are Reflected in the CBK

Knowledge Areas (All Disciplines) – Cites:
Integrative Programming (integrated): 9
Algorithms: 1
Complexity: 6
Architecture: 150
Operating Systems Principles and Design: 5
Operating Systems Configuration and Use: 5
Platform Technologies: 3
Theory of Programming Languages: 10
Human-Computer Interaction (HCI): 5
Graphics and Visualization: 1
Information Management (DB) Practice: 1
Legal/Professional/Ethics/Society: 93
Information Systems Development: 7
Analysis of Requirements: 1
Technical Requirements Analysis: 274
Engineering Economics for Software: 1
Software Modeling and Analysis: 2
Software Design: 255
Software V&V: 401
Software Evolution (Maintenance): 438
Software Process: 296
Software Quality: 163

Non-Computing Topics – Cites:
Risk Management: 86
Project Management: 156



problems, do large scale programming, and design and develop new software and/or IS are high priorities in all of the disciplines. At the same time, each of these also has a significant degree of coverage in the CBK.

High-priority items in each of these disciplines that were not good matches with the CBK tended to be such competencies as prove theoretical results (computer science), develop proof-of-concept programs (computer science), select database products (information systems), use spreadsheet features well (information systems), do small scale programming (software engineering), and produce graphics or game software (software engineering). Others, such as create a software user interface, were a mixed bag, with a good match to design but a poor match to HCI.

So, while these competencies might be individually important to their specific disciplines, they are not essential elements of


Table 2: Match Between CBK and CC2005 Expected Competencies for Computer Science

Computer Science
Highest Possible Expectation (5) – Depth of Coverage – Conclusion:
Solve programming problems (algorithms) – Analysis (274), Design (255) – strong

High Expectation (4):
Do large-scale programming (programming) – Analysis (274), Design (255), Architecture (150) – strong
Develop new software systems (programming) – Analysis (274), Design (255), Architecture (150) – strong
Create a software user interface (HCI) – HCI (5) – weak

Moderate Expectation (3):
Create safety-critical systems (programming) – Analysis (274), Design (255), Architecture (150), Process (296), V&V (401), PM (156) – strong
Design information systems (IS) – Analysis (274), Design (255), Architecture (150) – strong
Implement information systems (IS) – Process (296), V&V (401), PM (156) – strong
Maintain and modify information systems (IS) – Evolution (438) – strong
Install/upgrade computers (planning) – Process (296), V&V (401), PM (156) – strong
Install/upgrade computer software (planning) – Process (296), V&V (401), PM (156) – strong
Design network configuration (networks) – Analysis (274), Design (255), Architecture (150) – strong
Manage computer networks (networks) – Evolution (438) – strong
Implement mobile computing system (networks) – Analysis (274), Design (255), Architecture (150) – strong

Information Systems/Information Technology
Highest Possible Expectation (5) – Depth of Coverage – Conclusion:
Create a software user interface (IS) – HCI (5) – weak
Define information system requirements (IS) – IS Dev. (7), Bus. Req. (1), Analysis (274) – strong
Design information systems (IS) – Design (255), Modeling (5), Architecture (150) – strong
Maintain and modify information systems (IS) – Evolution (438) – strong
Model and design a database (DB) – DB (1), Design (255) – strong
Manage databases (DB) – Evolution (438) – strong
Develop corporate information plan (planning) – Process (296) – strong
Develop computer resource plan (planning) – PM (156) – strong
Schedule/budget resource upgrades (planning) – PM (156) – strong
Develop business solutions (integration) – Bus. Req. (1), Modeling (5), Design (255) – strong

High Expectation (4):
None

Moderate Expectation (3):
Develop new software systems (programming) – Analysis (274), Design (255), Architecture (150) – strong
Install/upgrade computer software (planning) – PM (156) – strong
Manage computer networks (networks) – Evolution (438) – strong
Manage communication resources (networks) – PM (156), Economics (1) – strong

Computer Science

Highest Possible Expectation (5) Depth of Coverage

Solve programming problems (algorithms) Analysis (274), Design (255) strong

High Expectation (4) Depth of Coverage

Do large-scale programming (programming) Analysis (274), Design (255), Architecture (150) strong

Develop new software systems (programming) Analysis (274), Design (255), Architecture (150) strong

Create a software user interface (HCI) HCI (5) weak

Moderate Expectation (3) Depth of Coverage

Create safety-critical systems (programming) Analysis (274), Design (255), Architecture (150), Process (296), V&V (401), PM (156) strong

Design information systems (IS) Analysis (274), Design (255), Architecture (150) strong

Implement information systems (IS) Process (296), V&V (401), PM (156) strong

Maintain and modify information systems (IS) Evolution (438) strong

Install/upgrade computers (planning) Process (296), V&V (401), PM (156) strong

Install/upgrade computer software (planning) Process (296), V&V (401), PM (156) strong

Design network configuration (networks) Analysis (274), Design (255), Architecture (150) strong

Manage computer networks (networks) Evolution (438) strong

Implement mobile computing system (networks) Analysis (274), Design (255), Architecture (150) strong

Information Systems/Information Technology

Highest Possible Expectation (5) Depth of Coverage

Create a software user interface (IS) HCI (5) weak

Define information system requirements (IS) IS Dev. (7), Bus. Req. (1), Analysis (274) strong

Design information systems (IS) Design (255), Modeling (5), Architecture (150) strong

Maintain and modify information systems (IS) Evolution (438) strong

Model and design a database (DB) DB (1), Design (255) strong

Manage databases (DB) Evolution (438) strong

Develop corporate information plan (planning) Process (296) strong

Develop computer resource plan (planning) PM (156) strong

Schedule/budget resource upgrades (planning) PM (156) strong

Develop business solutions (integration) Bus. Req (1), Modeling (5), Design (255) strong

High Expectation (4) Depth of Coverage

None n/a

Moderate Expectation (3) Depth of Coverage

Develop new software systems (programming) Analysis (274), Design (255), Architecture (150) strong

Install/upgrade computer software (planning) PM (156) strong

Manage computer networks (networks) Evolution (438) strong

Manage communication resources (networks) PM (156), Economics (1) strong


Table 3: Match Between CBK and CC2005 Expected Competencies for Information Systems

Integrating Software Assurance Knowledge Into Conventional Curricula

January 2008 www.stsc.hill.af.mil 19

secure software assurance as defined by the CBK. In view of that finding, it would appear to be easier to introduce CBK content into curricula that are focused on teaching pragmatic software processes and methods. And given its historic involvement with those areas, the discipline of software engineering might be the place to start.

Therefore, one additional suggestion might be that a similar study should be done based strictly on Software Engineering 2004, “Curricular Guidelines for Undergraduate Programs in Software Engineering,” particularly Table 1: Software Engineering Education Knowledge Elements [8]. That itemizes a set of knowledge areas and knowledge units that are similar in focus and purpose to the 40 knowledge areas contained in CC2005. Thus, it should be possible to better understand the actual relationship between standard software engineering curricular content and the contents of the CBK through that comparison.

There is another distinct observation arising out of this study. Although the moderate expectations category does not reflect priority areas, it is overwhelmingly the best-aligned category for each discipline. What that might indicate is that, although secure software assurance is a legitimate area of study for all of these fields, it is not the highest priority in any of them. In terms of disciplinary implementations, the practitioner orientation and the fact that security content is not the point of these fields indicate that courses that cover practical life-cycle functions might be the place to introduce secure software assurance content within any given discipline.

As a final note, the measurement process used in this study (i.e., a raw count) is inherently less accurate than expert contextual analysis of the meaning of each knowledge element. Therefore, a more rigorous comparison should be undertaken to better characterize the functional relationship between the items in the CBK and the various curricular standards. This would be particularly justified for the study of software engineering curricular content mentioned above. Once a means of comparison that everybody can agree on is used, it should be relatively simple to work out the nuts and bolts of specific implementations within each individual program.

References
1. President’s Information Technology Advisory Committee. Cybersecurity: A Crisis of Prioritization. Arlington: Executive Office of the President, National Coordination Office for Information Technology Research and Development, 2005.
2. Jones, Capers. Software Quality in 2005: A Survey of the State of the Art. Marlborough: Software Productivity Research, 2005.
3. Redwine, Samuel T., ed. Software Assurance: A Guide to the Common Body of Knowledge to Produce, Acquire and Sustain Secure Software, Version 1.1. Washington: U.S. DHS, 2006.
4. Newman, Michael. Software Errors Cost U.S. Economy $59.5 Billion Annually. Gaithersburg: National Institute of Standards and Technology (NIST), 2002.
5. Clark, Richard A., and Howard A. Schmidt. A National Strategy to Secure Cyberspace. Washington: The President’s Critical Infrastructure Protection Board, 2002 <www.us-cert.gov/reading_room/cyberspace_strategy.pdf>.
6. Shoemaker, D., A. Drommi, J. Ingalsbe, and N.R. Mead. “A Comparison of the Software Assurance Common Body of Knowledge to Common Curricular Standards.” Dublin: 20th Conference on Software Engineering Education and Training, 2007.
7. Joint Taskforce for Computing Curricula. Computing Curricula 2005: The Overview Report. ACM/AIS/IEEE, 2005.
8. Joint Taskforce for Computing Curricula. Software Engineering 2004: Curricular Guidelines for Undergraduate Programs in Software Engineering. ACM/IEEE, 2004.

Software Engineering

Highest Possible Expectation (5) Depth of Coverage

Do large-scale programming (programming) Analysis (274), Design (255), Architecture (150) strong

Develop new software systems (programming) Analysis (274), Design (255), Architecture (150) strong

Create safety-critical systems (programming) Analysis (274), Design (255), Architecture (150), Process (296), V&V (401), PM (156) strong

Manage safety-critical projects (programming) V&V (401), PM (156), Evolution (438) strong

High Expectation (4) Depth of Coverage

Develop new software systems (programming) Analysis (274), Design (255), Architecture (150) strong

Create a software user interface (HCI) HCI (5) weak

Define information system requirements (IS) IS Development (7), Bus. Req. (1), Analysis (274) strong

Moderate Expectation (3) Depth of Coverage

Design a human-friendly device (HCI) HCI (5), Design (255) strong

Design information systems (IS) Design (255), Modeling (5), Architecture (150) strong

Maintain and modify information systems (IS) Evolution (438) strong

Install/upgrade computers (planning) Process (296), V&V (401), PM (156) strong

Install/upgrade computer software (planning) Process (296), V&V (401), PM (156) strong

Manage computer networks (networks) Evolution (438) strong

Implement mobile computing system (networks) Analysis (274), Design (255), Architecture (150) strong


Table 4: Match Between CBK and CC2005 Expected Competencies for Software Engineering

Training and Education


About the Authors

Dan Shoemaker, Ph.D., is the director of the Centre for Assurance Studies. He has been professor and chair of computer and information systems at the University of Detroit Mercy for 24 years, and co-authored the textbook, “Information Assurance for the Enterprise.” His research interests are in the areas of secure software assurance, information assurance and enterprise security architectures, and information technology governance and control. Shoemaker has both a bachelor’s and a doctorate degree from the University of Michigan, and master’s degrees from Eastern Michigan University.

Computer and Information Systems – College of Business Administration
University of Detroit Mercy
Detroit, MI 48221
Phone: (313) 993-1202
E-mail: [email protected]

Nancy R. Mead, Ph.D., is a senior member of the technical staff in the Networked Systems Survivability Program at the SEI. She is also a faculty member at Carnegie Mellon University. Mead’s research interests are in the areas of information security, software requirements engineering, and software architectures. She is a Fellow of the IEEE and the IEEE Computer Society and is also a member of the Association for Computing Machinery. Mead received her doctorate in mathematics from the Polytechnic Institute of New York, and received bachelor’s and master’s degrees in mathematics from New York University.

SEI
4500 5th AVE
Pittsburgh, PA 15213
E-mail: [email protected]

Jeffrey A. Ingalsbe is a senior security and controls engineer with Ford Motor Company where he is involved in information security solutions for the enterprise, threat modeling efforts, and strategic security research. He has a bachelor’s degree in electrical engineering and a master of science degree in computer information systems from Michigan Technological University and the University of Detroit Mercy, respectively. He is currently working on a doctorate in software engineering at Oakland University. Ingalsbe serves as an expert industry panelist on two national working groups within the DHS’s Cybersecurity Division.

17475 Federal DR
STE 800-D04
Allen Park, MI 48101
Phone: (313) 390-9278
E-mail: [email protected]


Software is everywhere. It is in our kitchen appliances; it is in our cars. It allows us to communicate worldwide, and it helps us manage our personnel systems. It is in our weapons systems. Famously, 80 percent of the F-22’s functionality is performed in software; some have said that taking a picture of an F-22 is the only thing you can do with it that does not require software [1]. Software is so integral to the F-35 that Lockheed Martin spearheaded the effort to create a safety-critical C++ standard [2]. There is no project in today’s military that is not affected by software. Software, like any other technology, has its limitations; as dependent as we have become on software, those limitations become our limitations. Many wonder how we can add armor to our programs’ Achilles heels.

Perhaps you were motivated by Secretary Wynne’s emphasis on making education a priority in your career [3, 4]. Perhaps you read the National Defense Industrial Association’s report on the top defense software engineering issues and are wondering how you can overcome these issues in your program [5]. Perhaps you are simply looking for some job-relevant education to satisfy the Acquisition Professional Development Program continuing education requirements without taxing your unit’s travel budget. We are here to help you. We can bring education to your office or home, and we will not charge you or your organization a dime.

The AFIT’s SPDP is a distance-learning, professional continuing education program designed to benefit Department of Defense (DoD) organizations and individuals with varying levels of experience and responsibility. SPDP is well into its second decade, but it has been anything but stagnant; we have constantly been adapting to meet the needs of the defense software engineering community. You may have read about SPDP when it transitioned from a resident program to a satellite-delivered distance learning program [6, 7]. You may have read about SPDP when it transitioned from satellite delivery to Internet streaming and from quarter-long courses to month-long courses [8]. Since then, we have been working to improve the education we provide to our students [9, 10].

A typical SPDP course is four weeks long; the lectures are available through Internet streaming or they can be downloaded to view offline. This format has permitted our students to complete courses in a high-paced environment; we have even had students take SPDP courses while deployed to Southwest Asia and while at sea. SPDP courses differ from asynchronous Web-based training in many ways. The most significant is that they are instructor-led courses – real, human instructors provide the lectures (see Figure 1, page 22), complemented with reading assignments from a textbook. We also make use of online discussion boards and sometimes teleconferences to provide continuous instructor-student and student-student interaction. Finally, our students are evaluated with homework assignments and exams.

Available Courses

We currently offer 12 SPDP courses. We have two courses focused on project management:
• CSE 479, Software Project Initiating and Planning.
• CSE 480, Software Project Monitoring and Control.
The software engineering life cycle is covered in six courses. CSE 481, Introduction to Software Engineering, provides an overview, and each major activity of the software life cycle is covered in greater detail in its own course:
• CSE 482, Software Requirements.
• CSE 483, Software Design.
• CSE 484, Software Implementation.
• CSE 485, Software Systems Maintenance.
• CSE 486, Verification, Validation and Testing.
Our next three courses address object-oriented development:
• CSE 487, Fundamentals of Object-Oriented Systems.
• CSE 488, Modeling Object-Oriented Systems using UML.
• CSE 489, Advanced Analysis and Design of Object-Oriented Systems.
Our final course, CSE 496, Software Engineering Practicum, is a three-week resident course held at our campus near Wright-Patterson AFB, Ohio.

With the exception of CSE 489 and CSE 496, none of these courses have prerequisites. You can choose to take them all or only the ones you are interested in, and you can take them in any order. We have committed to offering each of these courses at least once per year, though when demand and faculty resources are in accord, we will provide more frequent offerings.

As stated earlier, a typical SPDP course is four weeks long. Each week, the instructor will provide two lectures and perhaps will hold an optional teleconference. There will be reading assignments and homework assignments. There may be a mid-term exam, and the class will conclude with a final exam.

The atypical courses are CSE 489 and CSE 496, as these are project-based courses. CSE 489 is still a four-week, instructor-led course, but there is only one lecture per week and one mandatory teleconference in which the students present project progress. Enrollment in CSE 489 requires completion of CSE 487 and CSE 488. CSE 496 is a three-week resident course in which students form a development team and have the opportunity to apply what they have learned; before enrolling in CSE 496, a student must have completed at least seven other SPDP courses.

Software Engineering Continuing Education at a Price You Can Afford

Software is so critical to today’s military programs that failure of the software generally results in failure of the program. Hoping for success is the surest way to avoid it. Avoiding software failure is not accidental; it requires careful application of sound software engineering. Whether you are an engineer, a program manager, a contracting officer, or anyone else involved in the development, acquisition, or sustainment of a software-intensive system, the Air Force Institute of Technology’s (AFIT) Software Professional Development Program (SPDP) can bring you the education you need to achieve software success.1

Maj Christopher Bohn, Ph.D.
Air Force Institute of Technology

Certifications

AFIT offers four certifications to help you mark your progress through SPDP and to help you demonstrate that you have taken a breadth of courses. The Software Engineering Management Certificate is awarded in recognition of successful completion of topics related to software project management and the individual components of the software life-cycle model; it will be awarded for the completion of CSE 479, CSE 480, and CSE 481. The Software Lifecycle Development Certificate is awarded in recognition of a more in-depth study of each of the phases of the software life cycle; it will be awarded after the successful completion of CSE 482, CSE 483, CSE 484, and CSE 485. The Advanced Software Development Certificate is awarded after the successful completion of CSE 486, CSE 487, and CSE 488, in recognition of these analysis, modeling, and testing topics. Finally, the Technical Software Development Certificate is awarded after the successful completion of CSE 489 and CSE 496, in recognition of completing the project-centered courses in the program.
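The certificate rules above are simple set-containment checks over completed courses, so they can be sketched in a few lines. This is only an illustrative helper built from the requirements listed in this article, not an official AFIT tool:

```python
# Illustrative mapping of SPDP certificates to their required courses,
# per the paragraph above (an assumption-free restatement of the rules,
# but not an official AFIT tool).
CERTIFICATES = {
    "Software Engineering Management": {"CSE 479", "CSE 480", "CSE 481"},
    "Software Lifecycle Development": {"CSE 482", "CSE 483", "CSE 484", "CSE 485"},
    "Advanced Software Development": {"CSE 486", "CSE 487", "CSE 488"},
    "Technical Software Development": {"CSE 489", "CSE 496"},
}

def earned(completed_courses):
    """Return the certificates whose full course set has been completed."""
    done = set(completed_courses)
    return [name for name, required in CERTIFICATES.items() if required <= done]

print(earned(["CSE 479", "CSE 480", "CSE 481", "CSE 482"]))
# -> ['Software Engineering Management']
```

The `<=` operator is Python's subset test, which keeps each rule a one-line comparison.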

Besides the AFIT certifications, SPDP can help you toward another credential. AFIT’s School of Systems and Logistics is one of seven registered educational providers worldwide for the Institute of Electrical and Electronics Engineers (IEEE) Computer Society’s Certified Software Development Professional (CSDP) program [11]. The CSDP certification is the only software development certification that has all of the components of a professional certification: an exam demonstrating mastery of the Software Engineering Body of Knowledge (SWEBOK), an experience base, and continuing education. While taking the SPDP courses is neither sufficient nor necessary to earn the CSDP, completing the curriculum will fully immerse you in the SWEBOK, preparing you for the CSDP exam.

Eligibility

SPDP classes are funded through AFIT for all DoD employees (active duty and reserve component service members and government civilians). Contractors and employees of other U.S. government agencies may also enroll in SPDP courses on a space-available basis. There is no tuition charged for SPDP courses, and AFIT provides textbooks free to DoD employees. Contractors and non-DoD government employees are responsible for procuring their own textbooks prior to the beginning of a course offering.

Nobody wants to be another statistic for the next study of defense software crises. You can mitigate this risk by arming yourself with software engineering knowledge and skills. Fortunately, with the SPDP, you will not need to dig into your tight travel funds, and you will not need to figure out how to pay for expensive continuing education courses. Our faculty is here to help. The feedback we have received from our students and their supervisors is that they usually see improvement during their current projects.

If you are interested in taking SPDP courses, or at least curious, please visit the SPDP Web site at <www.afit.edu/ls/spdp/>, e-mail the faculty at <[email protected]>, or contact Candace Barker at (937) 255-7777 ext. 3319, or DSN 785-7777 ext. 3319.

Note
1. The views expressed in this article are those of the author and do not reflect the official policy or position of the Air Force, DoD, or the U.S. Government.

References
1. Ferguson, Jack. “Crouching Dragon, Hidden Software: Software in DoD Weapon Systems.” IEEE Software July/Aug. (2001): 105-107.
2. Carroll, Kevin. “Deploying C++ for Use in International Safety-Critical Applications.” Proc. from Systems and Software Technology Conference, 2007.
3. Wynne, Michael W. “Letter to Airmen: Education and the Airman.” 13 Apr. 2006 <www.af.mil/library/viewpoints/secaf.asp?id=229>.
4. “Airman’s Roll Call: Education Benefits Essential to Professional, Personal Development.” 8 Aug. 2007 <www.af.mil/shared/media/document/AFD-070807-058.pdf>.
5. National Defense Industrial Association. “Top Software Engineering Issues within Department of Defense and Defense Industry.” 2006 <www.ndia.org/Content/ContentGroups/Divisions1/Systems_Engineering/PDFs18/NDIA_Top_SW_Issues_2006_Report_v5a_final.pdf>.
6. “The Air Force Software Professional Development Program.” CrossTalk Dec. 1994.
7. “Distance Learning in the Air Force Professional Development Program: An Update.” CrossTalk Feb. 1995.
8. Hermann, Brian. “The AFIT Offers Software Continuing Education at Your Location at No Cost.” CrossTalk Dec. 2002.
9. Reisner, John. “Bridging the Gap: Peer-to-Peer Learning in a Distance Environment.” Proc. Interservice/Industry Training, Simulation and Education Conference, 2004.
10. Bohn, Christopher A. “Watch Mr. Software.” Proc. International Conference on Frontiers in Education: Computer Science and Computer Engineering.
11. IEEE Computer Society <www.computer.org/certification/>.

Figure 1: The author teaches CSE 489 while attending the 2006 Systems and Software Technology Conference.

About the Author

Maj Christopher Bohn, Ph.D., is a Software Engineering Course Director at AFIT’s School of Systems and Logistics, where he teaches a series of distance-learning short courses. Over the past 14 years, he has served in various Air Force operational and research assignments. He is an IEEE-certified software development professional. Bohn has a bachelor’s degree in electrical engineering from Purdue University, a master’s degree in computer engineering from AFIT, and a doctorate from The Ohio State University.

AFIT/LS Research Park Campus
3100 Research BLVD
Kettering, OH 45420-4022
Phone: (937) 255-7777 ext. 3415
Fax: (937) 656-4654
E-mail: [email protected]



For the purpose of this article, an inspection is defined as a preemptive peer review of work products – by trained individuals using a well-defined process – to detect and eliminate defects as early as possible in the Software Development Life Cycle (SDLC) or closest to the points of defect injection.

Background

According to a National Institute of Standards and Technology (NIST) study, the problem of continued delivery of bug-ridden software is costing the U.S. economy an estimated $59.5 billion each year. The study also found the following:

…although all errors cannot be removed, more than a third of these costs, or an estimated $22.2 billion, could be eliminated by an improved testing infrastructure [reviews, inspections, etc.] that enables earlier and more effective identification and removal of software defects. These are the savings associated with finding an increased percentage [but not 100 percent] of errors closer to the development stages in which they were introduced. Currently, over half of all errors are not found until ‘downstream’ in the development process (testing) or during post-sales software use. [1]

Figure 1 shows a typical relationship between the cost of repairing a defect in a given phase of the development cycle versus the phase in which the defect was introduced. This relationship gives rise to the development costs described in the NIST report.
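The escape-cost relationship behind Figure 1 can be made concrete with a small sketch. The multipliers below are illustrative assumptions loosely following the ratios quoted later in the article (roughly 5x for a requirements defect fixed in design, 10x in code, and 26x for a design defect fixed in system test); they are not the figure's exact data.

```python
# Illustrative sketch of the phase/cost relationship shown in Figure 1.
# The multipliers are assumptions, not the article's exact data: a defect
# costs 1 unit to fix in the phase that introduced it, and progressively
# more the later it is repaired.
COST = {
    ("requirements", "requirements"): 1,
    ("requirements", "design"): 5,
    ("requirements", "code"): 10,
    ("requirements", "system_test"): 40,
    ("design", "design"): 1,
    ("design", "code"): 5,
    ("design", "system_test"): 26,
    ("code", "code"): 1,
    ("code", "system_test"): 13,
}

def rework_cost(defects):
    """Total relative rework for (introduced, repaired, count) triples."""
    return sum(COST[(introduced, repaired)] * n
               for introduced, repaired, n in defects)

# Ten requirements defects caught by a design inspection cost 50 units of
# rework; the same ten defects caught in system test cost 400 units.
print(rework_cost([("requirements", "design", 10)]))       # 50
print(rework_cost([("requirements", "system_test", 10)]))  # 400
```

The eight-fold gap in this toy comparison is the whole economic argument for inspecting early work products rather than relying on test.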

The following testimonials answer the question: What is the evidence that inspections address the cost and quality issues described earlier but are not widely used correctly to maximize defect detection and removal?

• The data in support of the quality, cost, and schedule impact of inspections is overwhelming. They are an indispensable part of engineering high-quality software. [3]

• Inspections are surely a key topic, and with the right instrumentation and training they are one of the most powerful techniques for defect detection. They are both effective and efficient, especially for up-front activities. In addition to large-scale applications, we are applying them to smaller applications and incremental development (Chris Ebert). [3]

• Inspection repeatedly has been demonstrated to yield up to a 10-to-1 return on investment … depressingly few practitioners know about the 30-year-old technique of software inspection. Even fewer routinely perform effective inspections or other types of peer reviews. [4]

• Formal inspections can raise the [defect] removal efficiency to over 95 percent. But part of the problem here is that not a lot of companies know how to use these things. [5]

• The software community has used inspections for almost 28 years. During this timeframe, inspections have consistently added value for many software organizations. Yet for others, inspections never succeeded as well as expected, primarily because these organizations did not learn how to make inspections both effective and low cost. [6]

• I continue to be amazed at the number of software development organizations that do not use this powerful method [inspections] to improve quality and productivity. [7]
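The "over 95 percent" figure in the testimonials refers to defect removal efficiency, the standard metric popularized by Capers Jones: the fraction of all known defects that were removed before delivery. A minimal sketch, with illustrative numbers that are not data from the article:

```python
def removal_efficiency(found_before_release, found_after_release):
    """Defect removal efficiency (DRE): the share of all known defects
    that were caught before the product reached customers."""
    total = found_before_release + found_after_release
    return found_before_release / total

# If inspections and test together remove 190 defects and 10 escape to
# the field, removal efficiency is 95 percent.
print(removal_efficiency(190, 10))  # 0.95
```

In practice the post-release count is accumulated over some fixed window (often the first 90 days of use), so DRE can only be computed retrospectively.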

It is clear from these testimonials that inspections are the most effective way to improve the quality, schedule, and cost of developing software. But after all the years since their introduction, why are they not an integral part of all software development life cycles?

The authors of this article, Roger Stewart and Lew Priven, each spent more than 20 years developing projects that used inspections and, for the past eight years, each has trained a wide variety of companies in the use of Fagan inspections. They consistently observed that soon after inspection training completes, malicious compliance sets in: critical deviations from inspection execution are introduced and/or ineffective shortcuts are employed. This results in inspection benefits being compromised, leads to limited use or discontinuation, and allows too many defects to escape to later, more costly phases of test and customer use.

Back to Basics

In order to deal with the problem of inspections not being widely used (or not used correctly for the maximum benefit), we need to go back and look at the original approach. Inspections were an outgrowth of the quality message from gurus W. Edwards Deming and J.M. Juran to design in quality at the beginning of the development process, instead of testing in

How to Avoid Software Inspection Failure and Achieve Ongoing Benefits

Roger Stewart and Lew Priven
Stewart-Priven Group

The objectives of this article are to examine why software inspections are not used more widely, identify the issues contributing to their lack of use, identify why inspection benefits deteriorate for many companies, and recommend what can be done to address and solve these issues. The proven benefits of inspections are too significant to let them fall by the wayside!

Software Engineering Technology

Figure 1: Cost of Fixing a Defect [2] (chart: “Relative Cost of Software Fault Propagation” – relative cost to repair a defect, by the phase in which it is repaired versus the phase in which it was introduced, across Requirements, Design, Code, Integration Test, Test, and Customer; “1” identifies the phase in which the defect was introduced)



pseudo-quality at the end of the production line.

What naturally followed was the idea of applying sampling quality control techniques to the software development life cycle as if it were a production line. Specifically, this involves sampling the product periodically (detect defects), making adjustments as defects are found (fix defects and improve the development process), and predicting the shipped product quality based on the results of the sampling.
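That sampling loop can be sketched as a simple projection. One hedge up front: the defect-density model below is a simplified illustration of "predict the shipped product quality," not the actual IBM/Fagan estimation procedure.

```python
# Simplified sketch of sampling-based quality prediction (an illustration,
# not the IBM/Fagan procedure): inspect part of the product, measure the
# defect density found in the sample, and project the defects remaining
# in the portion that was not inspected.
def predict_remaining(inspected_kloc, defects_found, total_kloc):
    """Project defects latent in the uninspected portion from the
    defect density observed in the inspected sample."""
    density = defects_found / inspected_kloc          # defects per KLOC
    return density * (total_kloc - inspected_kloc)

# Inspecting 10 of 50 KLOC and finding 30 defects (3 per KLOC) projects
# about 120 defects still latent in the remaining 40 KLOC.
print(predict_remaining(10, 30, 50))  # 120.0
```

The assumption doing all the work here is that the sample is representative; in practice inspection data is usually segmented by component or author before projecting.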

Application of the sampling quality control techniques to the software development cycle led to the development of the software inspection process. The most widely known and practiced inspection process was introduced to the IBM software community in 1972 by a team led by Michael Fagan and managed by Lew Priven (co-author of this article) [6].

In the case of software, the development life cycle is the production line, and inspections are the sampling and prediction technique. Inspections are the vehicle to sample the product in the earlier phases of the development life cycle to detect and fix defects closest to the point of injection, and the data collected from inspections can be used as the basis for predicting the quality of the delivered product.

How Have Inspections Evolved?

In 1972, Priven published an IBM Technical Report which described a software development management system, including points of management control using process monitors that evolved into inspections [8]. The management system was based on a well-defined development process – which satisfied the need for a production line as described earlier. With the production line in place, Priven hired Michael Fagan, a quality engineer with a hardware and manufacturing background, to work with the development team to find a way to improve the quality of delivered software [6-10]. The IBM (Fagan) Inspection Process then evolved as a critical component of the end-to-end software development life cycle. Over the years, the integration of inspections into the software development life cycle has been lost as the inspection process came to be viewed as a standalone quality process, with inspection execution becoming the prime focus. However, the supporting infrastructure of a software development life cycle is still critical to successfully implementing inspections.

Why Is the Supporting Development Infrastructure Important?

The supporting infrastructure of a well-defined development process is important because it requires management at all levels – and during all development phases – to actively support the inspection process. A life-cycle view is needed because the cost and schedule impact are primarily borne by the requirements, design, and implementation components of the organization. However, while these components also realize some of the reduced cost, higher quality, and improved schedule benefits, the majority of these benefits are primarily realized in testing and maintenance.

Theory Is Good, but Why Are Inspections Not Embraced?

In addition to being viewed as a stand-alone process, which lacks a life-cycle view of investment and associated savings, inspections have also been characterized by a number of myths. These myths discourage implementation. While there is a kernel of truth in each myth, each can be turned into a positive. Some examples follow:
• Inspections are time consuming. Yes, they add up-front development time to requirements, design, and code. However, rather than being viewed as a problem, this additional up-front time for inspections should be viewed as an investment in obtaining the overall quality, cost, and schedule benefits over the project’s life cycle.

• Inspections are bureaucratic and one size fits all. System engineers and software engineers, with support from management, need to have the flexibility to adjust their inspection process to the needs of the product under development. Consider, for example, the difference between inspecting software that controls a jet fighter (where a defect could be a matter of life and death) and software that displays a Web form (where the impact of a defect may be an inconvenience). The former may require a broader, comprehensive set of inspections, while the latter could employ other visual analysis techniques to supplement a base set of inspections.

• All work products must be inspected. There is a lack of guidance on when, where, and how to start an inspection process. An approach to prioritizing what work products to

(Sidebar graph, reproduced from the December 2005 CrossTalk, page 16 sidebar titled “Economics of Fault Finding”: defects inserted (“1” on the graph) and not discovered and fixed at their point of insertion are much costlier to fix later in the Software Development Life Cycle. For example, defects inserted during Requirements specification could be five times more costly to fix during Design, or 10 times more costly to fix during Coding; defects inserted during Design could be 26 times more costly to fix during System Test.)

Figure 2: Prioritizing What to Inspect (diagram: selection criteria across the Requirement, Design, Code/Implementation, Testing, and Maintenance phases – all requirements and all design; quality-critical areas such as security, error handling, algorithms, and interfaces; complex and defect-prone areas; defect fixes; feature enhancements; and remaining areas, if economic payoff)

1. Review &Assessment ofDevelopmentInfrastructure 2. Tuning

Recommendation;& any PrerequisiteInfrastructureImplementation 3. Inspection

Methodology:

Tools 4. Performance,Consultingand Coaching

Assessment Methodology

Development Infrastructure

ToolsImplementation

1. Review andAssessment ofDevelopmentInfrastructure 2. Tuning

Recommendation;and any PrerequisiteInfrastructureImplementation 3. Inspection

Methodology:

Tools and Training 4. Performance,Consultingand Coaching

Assessment Methodology

Development Infrastructure

Tools and TrainingImplementation

Figure 3: Assessment Methodology

Table 1: Risks Associated With Inspection Pitfalls

Lack of supportiveSDLC infrastructure

Poor management understandingof the Inspection Process, itsbenefits, and their responsibilities

No computerizedmanagement-planning tools

Inadequate monitoring ofinspection execution andtracking of results

Slow inspection implementationby project teams

10

1

2

3

4

5

6

7

8

9

No inspection facilitator/project champion

Too little schedule timefor inspections

No computerized inspector tools

No post-classpractitioner refresher

No inspection process capture

• Immature practices for planning, data collection, reporting, monitoring, and tracking• Leads to Pitfalls #3, 4, 6, 10

• Assessment methodology ° Step 1: Assess client SDLC ° Step 2: Recommend any changes

• Leads to Pitfalls #4, 6, 8, 9 ° Inadequate inspection: schedules, tools facilitation, implementation

• Upper management overview• Management performance class• Student feedback from inspection class

• Inadequate schedule time (Pitfall #4)• No savings appreciation, leads to no inspections or too few inspections

• Planning counter tool• Savings/cost-estimator tool

• Defects escape to more costly phases to fix• Inspections not correctly executed• Leads to malicious compliance

• Inspection planning counter tool• Management performance class• Criteria for prioritizing what to inspect

• Inconsistency, compromising shortcuts• Defects, escape to more costly phases to fix

• Preparation tool• Inspection meeting tool• Data analysis tool, team analysis tool

• Inspection process execution deteriorates• Defects escape to more costly phases to fix• Employees lose interest when savings summaries are not periodically shared

• ROI calculators for text and code• Data analysis tool, team analysis tool• Aggregate results calculator tool

• Process misunderstood, compromising shortcuts introduced, defects escape

• Seminar for previous students• Inspection role-reference card• Inspection product checklist kit

• Inspection process issues not addressed, coordinated, or resolution disseminated• Inconsistent or incorrect inspection execution• Little useful feedback to management

• Upper management overview• Management performance class

• Ineffective start or no-start occurs• Inspection training forgotten; incorrect execution

• Inspector training accommodating multiple classes, of 2 days or less, per week

• Process misunderstood, inconsistent execution, defects escape• No repository for project lessons learned

• Course material tailoring• Inspection process capture tool

Pitfall SolutionsPitfall RisksNo. Pitfall

Figure 3: Assessment Methodology

January 2008 www.stsc.hill.af.mil 25

inspect needs to be intelligentlyapplied.While these are myths that we typical-

ly hear about inspections, upon furtherexamination they are symptoms of amuch larger underlying set of issues. Theremainder of the article will focus on deal-ing with those issues which we will laterrefer to as inspection pitfalls.

A Realistic Approach

Although complete inspection coverage may be ideal, a realistic approach to inspections is to formulate a set of selection criteria (see Figure 2). These criteria guide the identification of those areas of the product most critical to success or where problems are most likely to occur. At the least, these areas should be inspected. This addresses the common complaint that there is not enough time to integrate inspections into tight schedules, yet allows for using inspections to find defects where they are most likely to cause problems.

Figure 2 addresses this no-time issue by showing the prioritization of what to inspect, related to the development-cycle phases of the project. There should be a strong focus on requirements and design, whose defects are the most costly to fix when discovered later in the development cycle (see Figure 1). The focus on requirements and design is particularly important because our experience has shown that the largest numbers of defects are injected during these two phases of development. One example from a TRW study shows that about 52 percent of defects are injected in requirements and 28 percent are injected in design [11].

The most successful implementations of inspections have been in organizations that have multi-level, active management support of inspections and a well-defined development life cycle with pre-existing emphasis on planning, monitoring, and measurements use.

How to Avoid Software Inspection Failure and Achieve Ongoing Benefits

Table 2 shows how an inspection methodology can solve and prevent the 10 inspection pitfalls.

Table 2: How an Inspection Methodology Can Solve and Prevent the Pitfalls

1. Lack of supportive SDLC infrastructure. Solutions: assessment methodology (Step 1: assess the client SDLC; Step 2: recommend any changes).
2. Poor management understanding of the inspection process, its benefits, and their responsibilities. Solutions: upper management overview; management performance class; student feedback from the inspection class.
3. No computerized management-planning tools. Solutions: planning counter tool; savings/cost-estimator tool.
4. Too little schedule time for inspections. Solutions: inspection planning counter tool; management performance class; criteria for prioritizing what to inspect.
5. No computerized inspector tools. Solutions: preparation tool; inspection meeting tool; data analysis tool and team analysis tool.
6. Inadequate monitoring of inspection execution and tracking of results. Solutions: ROI calculators for text and code; data analysis tool and team analysis tool; aggregate results calculator tool.
7. No post-class practitioner refresher. Solutions: seminar for previous students; inspection role-reference card; inspection product checklist kit.
8. No inspection facilitator/project champion. Solutions: upper management overview; management performance class.
9. Slow inspection implementation by project teams. Solutions: inspector training accommodating multiple classes, of two days or less, per week.
10. No inspection process capture. Solutions: course material tailoring; inspection process capture tool.

[Figure 4: Computerized Inspection Tool Overview. Product/project inspection planning tools; inspection execution analysis and reporting tools; inspection monitoring and tracking tools.]

Software Engineering Technology

Development Infrastructure to Support Inspections

There is a lot of guidance on the structure of inspections, such as the Institute of Electrical and Electronics Engineers (IEEE) Standard 1028-1997 [12], and on how to conduct an inspection, but little guidance on the following:
1. How to select what to inspect (see Figure 2).
2. How to develop an appropriate software development life-cycle infrastructure that provides the necessary framework for successful implementation of inspections.
3. How to determine what computerized tools are needed to ensure proper inspection execution and management visibility into results, project savings, and return on investment (ROI). For example, because of the lack of inspection tools, data collection (too often left to the discretion of the inspection teams) is often not performed or is performed inconsistently. Therefore, the data needed to evaluate inspection effectiveness is not easily available to management.

These three items will be discussed further in the next section.

Successful inspection implementation requires a software development life-cycle infrastructure that demands planning, data collection, reporting, monitoring, and tracking. Introducing inspections into a project culture that does not believe in, and does not have a development infrastructure that actively supports, these activities is fraught with risk.

Developing an appropriate infrastructure begins with selecting a framework upon which to build your development life cycle. A widely accepted framework is the Capability Maturity Model® (CMM®) and its successor, CMM Integration (CMMI®). However, as Watts Humphrey points out in [13], "Although the CMM [and CMMI] provides a powerful improvement framework, its focus is necessarily on what organizations should do – not how they should do it."

There are four key steps to filling out the development framework:
1. Select a development model (e.g., iterative, incremental, waterfall, spiral, agile).
2. Clearly define the development life cycle by identifying and recording, for each process within the life cycle, its required inputs, the inputs' entrance criteria, the what and how of the process, the expected outputs, and the outputs' exit criteria.
3. Get process agreement from all components of the development organization (e.g., requirements generators, designers, coders/implementers, testers, etc.).
4. Determine which project tools will be used for planning, data collection, reporting, monitoring, and tracking (tool examples are critical path, earned value, etc.).

When these steps are completed, the introduction of inspections has the necessary framework (i.e., development infrastructure) for ongoing success, and inspections can be accepted as an integral part of the end-to-end development life cycle.

How an Inspection Methodology Can Reinvigorate Inspections

Inspections will only be successful long term if they are integral to a well-defined development process that has active management support in each phase of development. The methodology shown in Figure 3 (see page 25) starts with an assessment to ensure an adequate development life-cycle infrastructure is in place prior to inspection training. Steps 1 and 2 in Figure 3 are the assessment steps.

Once the development infrastructure is in place, what else needs to be done? Based on our experience training more than 5,000 inspectors at companies in more than 50 locations, evaluating the data collected, and observing the ongoing implementation or lack thereof, we have identified 10 inspection pitfalls, each of which inhibits inspection implementation. These inspection pitfalls must be resolved to achieve lasting benefits from inspections.

The 10 inspection pitfalls and associated risks are shown in Table 1 (see page 25). Note: The lack of a well-defined SDLC infrastructure, discussed earlier, is the first pitfall.

Table 1 identifies how each inspection pitfall leads to findable defects not being discovered with inspections, resulting in the following:
A. Development cost savings are not fully realized.
B. Quality improvements are not fully achieved.
C. Maintenance and support savings are not realized.
D. Inspections could become a total cost, not a savings.

We distinguish between the development life-cycle infrastructure, within which inspections fit (previous section), and the inspection infrastructure, which enables proper inspection execution for achieving the maximum benefit. An enabling inspection infrastructure must address all 10 pitfalls and would consist of the following:
1. Computerized management tools for use in planning inspections and predicting the overall project costs and savings from applying inspections (see Figure 4 on page 25 for an inspection tool overview example).

[Figure 5: Inspection Infrastructure. Road map to successful inspection implementation: all inspection pitfalls must be solved. The inspection methodology includes practitioner training and pitfall prevention, built on 1. SDLC infrastructure; 2. Management understanding; 3. Management planning tools; 4. Time for inspections in the schedule; 5. Inspector tools; 6. Execution monitoring and tracking tools; 7. Training refresher; 8. Inspection champion; 9. Rapid project training; 10. Inspection process capture tool.]

® CMMI is registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.

2. Computerized tools to aid inspectors in correctly and consistently performing inspections, gathering inspection-related data, and performing analysis to identify how future inspections can be improved.

3. Monitoring and analysis computerized tools for management's post-inspection evaluation of individual inspections.
4. Computerized management tools for analyzing inspection ROI, and an aggregate calculator for assessing and tracking the resulting project savings from multiple inspections.
5. An inspection process that provides flexibility for prioritizing what to inspect, as shown in Figure 2.
6. The ability to have customized training material that incorporates your terminology and is based on your needs.
7. Rapid training (two days or less) of project personnel in a comprehensive training course with significant focus on requirements and design.
8. An overview briefing for upper management, along with a more rigorous management performance course, so upper managers and project leaders can fully understand the inspection process, its benefits, and their responsibilities.
9. Follow-up practitioner refreshers to deal with any implementation problems, focused on making inspection implementation successful both initially and long term.
10. An inspection process capture tool to enable inspections to quickly become an integral part of a company's SDLC infrastructure.

Table 1 shows how an inspection methodology can solve and prevent the 10 inspection pitfalls.

The Road Map to Success

The pitfall-solution road map in Figure 5 shows the solution relationships that provide for successful software inspection implementation that will endure over the long term. The pitfall solutions provide the inspection infrastructure that, together with a comprehensive inspector training program, forms an inspection methodology for achieving a lasting and successful inspection program.

Summary

Our experience has shown us that inspections can live up to their potential and be embraced by the development community if the following happens:
• Inspections are integral to a well-defined software development life-cycle infrastructure supported by management in each phase of development.
• Inspections are flexible in determining what to inspect.
• Computerized tools are available to assist management in planning inspections and estimating project savings before commitment.
• Computerized tools are available to guide the inspection teams.
• Management tools are available for monitoring inspection process conformance (not an individual's performance) and tracking resulting inspection benefits.
• Project personnel are provided with the proper training and follow-up support.

References
1. NIST. "The Economic Impacts of Inadequate Infrastructure for Software Testing." NIST Planning Report 02-3, May 2002.
2. Bennett, Ted L., and Paul W. Wennberg. "Eliminating Embedded Software Defects Prior to Integration Test." CrossTalk, Dec. 2005.
3. McConnell, Steve. "Best Influences on Software Engineering Over the Past 50 Years." IEEE Software, Jan./Feb. 2000.
4. Wiegers, Karl. "The More Things Change." Better Software, Oct. 2006.
5. Jones, Capers. "Interview." Computer Aid Inc., July 2005.
6. Radice, Ron. High Quality Low Cost Software Inspections. Andover, MA: Paradoxicon Publishing, 2002.
7. Weller, Ed. "Calculating the Economics of Inspections." StickyMinds, Jan. 2002.
8. Priven, L.D. "Managing the Programming Development Cycle." IBM Technical Report 21.463, Mar. 1972.
9. Fagan, Michael E. "Design and Code Inspections and Process Control in the Development of Programs." IBM Technical Report 21.572, Dec. 1974.
10. Priven, L., and F. Tsui. "Implementation of Quality Control in Software Development." AFIPS Conference Proceedings, 1976 National Computer Conference 45 (1976): 443-449.
11. McGraw, Gary. Making Essential Software Work. Cigital, Inc., Mar. 2003 <http://citigal.com/whitepapers>.
12. Software Engineering Standards Committee of the IEEE Computer Society. "IEEE Standard for Software Reviews, Section 6: Inspections." IEEE Std. 1028-1997, 1997.
13. Humphrey, Watts. "Three Dimensions of Process Maturity." CrossTalk, Feb. 1998.

About the Authors

Lew Priven is co-founder of the Stewart-Priven Group. He is an experienced executive with a systems and software management and technical background. Priven was vice president of engineering and application development at General Electric Information Services and vice president of application development for IBM's Application Systems Division. He has a bachelor's degree in electrical engineering from Tufts University and a master's degree in management from Rensselaer Polytechnic Institute.

The Stewart-Priven Group
7962 Old Georgetown RD, STE B
Bethesda, MD 20814
Phone: (865) 458-6685
Fax: (865) 458-9139
E-mail: [email protected]

Roger Stewart is co-founder of the Stewart-Priven Group. Previously, he spent 30 years with IBM's Federal Systems Division managing and developing systems for air traffic control, satellite command and control, the on-board space shuttle, and LAMPS helicopters, and in commercial banking, telecommunication, and networking systems. Stewart has a bachelor's degree in mathematics from Cortland University.

The Stewart-Priven Group
7962 Old Georgetown RD, STE B
Bethesda, MD 20814
Phone: (865) 458-6685
Fax: (865) 458-9139
E-mail: [email protected]

It is all about programming! Over the last few years we have noticed worrisome trends in CS education. The following represents a summary of those trends:

1. Mathematics requirements in CS programs are shrinking.

2. The development of programming skills in several languages is giving way to cookbook approaches using large libraries and special-purpose packages.

3. The resulting set of skills is insufficient for today's software industry (in particular for safety and security purposes) and, unfortunately, matches well what the outsourcing industry can offer. We are training easily replaceable professionals.

These trends are visible in the latest curriculum recommendations from the Association for Computing Machinery (ACM). Curriculum 2005 does not mention mathematical prerequisites at all, and it mentions only one course in the theory of programming languages [1].

We have seen these developments from both sides: As faculty members at New York University for decades, we have regretted the introduction of Java as a first language of instruction for most computer science majors. We have seen how this choice has weakened the formation of our students, as reflected in their performance in systems and architecture courses. As founders of a company that specializes in Ada programming tools for mission-critical systems, we find it harder to recruit qualified applicants who have the right foundational skills. We want to advocate a more rigorous formation, in which formal methods are introduced early on and programming languages play a central role in CS education.

Formal Methods and Software Construction

Formal techniques for proving the correctness of programs were an extremely active subject of research 20 years ago. However, the methods (and the hardware) of the time prevented these techniques from becoming widespread, and as a result they are more or less ignored by most CS programs. This is unfortunate because the techniques have evolved to the point that they can be used in large-scale systems and can contribute substantially to the reliability of those systems. A case in point is the use of SPARK in the re-engineering of the ground-based air traffic control system in the United Kingdom (see a description of iFACTS, Interim Future Area Control Tools Support, at <www.nats.co.uk/article/90>). SPARK is a subset of Ada augmented with assertions that allow the designer to prove important properties of a program: termination, absence of run-time exceptions, finite memory usage, etc. [2]. It is obvious that this kind of design and analysis methodology (dubbed Correctness by Construction) will add substantially to the reliability of a system whose design has involved SPARK from the beginning. However, Praxis, the company that developed SPARK and which is designing iFACTS, finds it hard to recruit people with the required mathematical competence (and this is the case even in the United Kingdom, where formal methods are more widely taught and used than in the United States).

Another formal approach to which CS students need exposure is model checking and linear temporal logic for the design of concurrent systems. For a modern discussion of the topic, which is central to mission-critical software, see [3].

Another area of computer science that we find neglected is the study of floating-point computation. At New York University, a course in numerical methods and floating-point computing used to be required, but this requirement was dropped many years ago, and now very few students take the course. The topic is vital to all scientific and engineering software and is semantically delicate. One would imagine that it would be a required part of all courses in scientific computing, but these often take MATLAB to be the universal programming tool and ignore the topic altogether.

The Pitfalls of Java as a First Programming Language

Because of its popularity in the context of Web applications and the ease with which beginners can produce graphical programs, Java has become the most widely used language in introductory programming courses. We consider this a misguided attempt to make programming more fun, perhaps in reaction to the drop in CS enrollments that followed the dot-com bust. What we observed at New York University is that the Java programming courses did not prepare our students for the first course in systems, much less for more advanced ones. Students found it hard to write programs that did not have a graphical interface, had no feeling for the relationship between the source program and what the hardware would actually do, and (most damaging) did not understand the semantics of pointers at all, which made the use of C in systems programming very challenging.

Let us propose the following principle: The irresistible beauty of programming consists in the reduction of complex formal processes to a very small set of primitive operations. Java, instead of exposing this beauty, encourages the programmer to approach problem-solving like a plumber in a hardware store: by rummaging through a multitude of drawers (i.e., packages) we will end up finding some gadget (i.e., class) that does roughly what we want. How it does it is not interesting! The result is a student who knows how to put a simple program together but does not know how to program. A further pitfall of the early use of Java libraries and frameworks is that it is impossible for the student to develop a sense of the run-time cost of what is written because it is extremely hard to know what any method call will eventually execute. A lucid analysis of the problem is presented in [4].

We are seeing some backlash to this

Computer Science Education: Where Are the Software Engineers of Tomorrow?

By Dr. Robert B.K. Dewar and Dr. Edmond Schonberg, AdaCore Inc.

It is our view that Computer Science (CS) education is neglecting basic skills, in particular in the areas of programming and formal methods. We consider that the general adoption of Java as a first programming language is in part responsible for this decline. We examine briefly the set of programming skills that should be part of every software professional's repertoire.

Open Forum


approach. For example, Bjarne Stroustrup reports from Texas A&M University that the industry is showing increasing unhappiness with the results of this approach. Specifically, he notes the following:

I have had a lot of complaints about that [the use of Java as a first programming language] from industry, specifically from AT&T, IBM, Intel, Bloomberg, NI, Microsoft, Lockheed-Martin, and more. [5]

He noted the following in a private discussion on this topic:

It [Texas A&M] did [teach Java as the first language]. Then I started teaching C++ to the electrical engineers, and when the EE students started to out-program the CS students, the CS department switched to C++. [5]

It will be interesting to see how many departments follow this trend. At AdaCore, we are certainly aware of many universities that have adopted Ada as a first language because of similar concerns.

A Real Programmer Can Write in Any Language (C, Java, Lisp, Ada)

Software professionals of a certain age will remember the slogan of old-timers from two generations ago, when structured programming became the rage: Real programmers can write Fortran in any language. The slogan is a reminder of how the thinking habits of programmers are influenced by the first language they learn and how hard it is to shake these habits if you do all your programming in a single language. Conversely, we want to say that a competent programmer is comfortable with a number of different languages and must be able to use the mental tools favored by one of them even when programming in another. For example, the user of an imperative language such as Ada or C++ must be able to write in a functional style, acquired through practice with Lisp and ML1, when manipulating recursive structures. This is one indication of the importance of learning in depth a number of different programming languages. What follows summarizes what we think are the critical contributions that well-established languages make to the mental tool-set of real programmers. For example, a real programmer should be able to program inheritance and dynamic dispatching in C, information hiding in Lisp, tree manipulation libraries in Ada, and garbage collection in anything but Java. The study of a wide variety of languages is, thus, indispensable to the well-rounded programmer.

Why C Matters

C is the low-level language that everyone must know. It can be seen as a portable assembly language, and as such it exposes the underlying machine and forces the student to understand clearly the relationship between software and hardware. Performance analysis is more straightforward because the cost of every software statement is clear. Finally, compilers (GCC, for example) make it easy to examine the generated assembly code, which is an excellent tool for understanding machine language and architecture.

Why C++ Matters

C++ brings to C the fundamental concepts of modern software engineering: encapsulation with classes and namespaces, information hiding through protected and private data and operations, programming by extension through virtual methods and derived classes, etc. C++ also pushes storage management as far as it can go without full-blown garbage collection, with constructors and destructors.

Why Lisp Matters

Every programmer must be comfortable with functional programming and with the important notion of referential transparency. Even though most programmers find imperative programming more intuitive, they must recognize that in many contexts a functional, stateless style is clear, natural, easy to understand, and efficient to boot.

An additional benefit of the practice of Lisp is that the program is written in what amounts to abstract syntax, namely the internal representation that most compilers use between parsing and code generation. Knowing Lisp is thus an excellent preparation for any software work that involves language processing.

Finally, Lisp (at least in its lean Scheme incarnation) is amenable to a very compact self-definition. Seeing a complete Lisp interpreter written in Lisp is an intellectual revelation that all computer scientists should experience.

Why Java Matters

Despite our comments on Java as a first or only language, we think that Java has an important role to play in CS instruction. We will mention only two aspects of the language that must be part of the real programmer’s skill set:

1. An understanding of concurrent programming (for which threads provide a basic low-level model).

2. Reflection, namely the understanding that a program can be instrumented to examine its own state and to determine its own behavior in a dynamically changing environment.

Why Ada Matters

Ada is the language of software engineering par excellence. Even when it is not the language of instruction in programming courses, it is the language chosen to teach courses in software engineering. This is because the notions of strong typing, encapsulation, information hiding, concurrency, generic programming, inheritance, and so on, are embodied in specific features of the language. From our experience and that of our customers, we can say that a real programmer writes Ada in any language. For example, an Ada programmer accustomed to Ada’s package model, which strongly separates specification from implementation, will tend to write C in a style where well-commented header files act in somewhat the same way as package specs in Ada. The programmer will include bounds checking and consistency checks when passing mutable structures between subprograms to mimic the strong-typing checks that Ada mandates [6]. She will organize concurrent programs into tasks and protected objects, with well-defined synchronization and communication mechanisms.
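As a hypothetical illustration of that header-as-spec style (all names are ours), here is a small C header written the way an Ada programmer might write a package spec: the interface, its stated preconditions, and explicit checks that mimic the ones Ada would perform automatically.

```c
/* stack.h -- a C header written like an Ada package spec: the interface,
   its preconditions, and nothing else.  (Invented example.) */
#ifndef STACK_H
#define STACK_H

#include <stddef.h>
#include <assert.h>

#define STACK_CAPACITY 16

typedef struct {
    int items[STACK_CAPACITY];
    size_t top;               /* invariant: top <= STACK_CAPACITY */
} stack;

/* stack_push: requires s != NULL and s->top < STACK_CAPACITY. */
static inline void stack_push(stack *s, int v) {
    assert(s != NULL && s->top < STACK_CAPACITY);  /* mimic Ada's checks */
    s->items[s->top++] = v;
}

/* stack_pop: requires s != NULL and s->top > 0. */
static inline int stack_pop(stack *s) {
    assert(s != NULL && s->top > 0);
    return s->items[--s->top];
}

#endif /* STACK_H */
```

In Ada the range check on `top` would be mandated by the type system; here the programmer writes it in by hand, which is exactly the habit the article describes.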

The concurrency features of Ada are particularly important in our age of multicore architectures. We find it surprising that these architectures should be presented as a novel challenge to software design, when Ada had well-designed mechanisms for writing safe, concurrent software 30 years ago.

Computer Science Education: Where Are the Software Engineers of Tomorrow?
January 2008 www.stsc.hill.af.mil 29

Programming Languages Are Not the Whole Story

A well-rounded CS curriculum will include an advanced course in programming languages that covers a wide variety of languages, chosen to broaden the understanding of the programming process rather than to build a résumé in perceived hot languages. We are somewhat dismayed to see the popularity of scripting languages in introductory programming courses. Such languages (JavaScript, PHP, Atlas) are indeed popular tools of today for Web applications, but they have all the pedagogical defects that we ascribe to Java and provide no opportunity to learn algorithms and performance analysis. Their absence of strong typing leads to a trial-and-error programming style and prevents students from acquiring the discipline of separating design of interfaces from specifications.

However, teaching the right languages alone is not enough. Students need to be exposed to the tools used to construct large-scale reliable programs, as we discussed at the start of this article. Topics of relevance are formal specification methods and formal proof methodologies, as well as an understanding of how high-reliability code is certified in the real world. When you step into a plane, you are putting your life in the hands of software which had better be totally reliable. As a computer scientist, you should have some knowledge of how this level of reliability is achieved. In this day and age, the fear of terrorist cyber attacks has given a new urgency to the building of software that is not only bug free, but also immune from malicious attack. Such high-security software relies even more extensively on formal methodologies, and our students need to be prepared for this new world.

References

1. Joint Taskforce for Computing Curricula. “Computing Curricula 2005: The Overview Report.” ACM/AIS/IEEE, 2005 <www.acm.org/education/curric_vols/CC2005-March06Final.pdf>.

2. Barnes, John. High Integrity Ada: The SPARK Approach. Addison-Wesley, 2003.

3. Ben-Ari, M. Principles of Concurrent and Distributed Programming. 2nd ed. Addison-Wesley, 2006.

4. Mitchell, Nick, Gary Sevitsky, and Harini Srinivasan. “The Diary of a Datum: An Approach to Analyzing Runtime Complexity in Framework-Based Applications.” Workshop on Library-Centric Software Design, Object-Oriented Programming, Systems, Languages, and Applications, San Diego, CA, 2005.

5. Stroustrup, Bjarne. Private communication. Aug. 2007.

6. Holzmann, Gerard J. “The Power of Ten – Rules for Developing Safety-Critical Code.” IEEE Computer June 2006: 93-95.

Note

1. Several programming language and system names have evolved from acronyms whose formal spellings are no longer considered applicable to the current names by which they are readily known. ML, Lisp, GCC, PHP, and SPARK fall under this category.

Open Forum



About the Authors

Edmond Schonberg, Ph.D., is vice-president of AdaCore and a professor emeritus of computer science at New York University. He has been involved in the implementation of Ada since 1981. With Robert Dewar and other collaborators, he created the first validated implementation of Ada 83, the first prototype compiler for Ada 9X, and the first full implementation of Ada 2005. Schonberg has a doctorate in physics from the University of Chicago.

AdaCore
104 Fifth Ave
15th Fl
New York, NY 10011
E-mail: [email protected]

Robert B.K. Dewar, Ph.D., is president of AdaCore and a professor emeritus of computer science at New York University. He has been involved in the design and implementation of Ada since 1980 as a distinguished reviewer, a member of the Ada Rapporteur Group, and the chief architect of the GNU Ada Translator. He was a member of the Algol 68 committee and is the designer and implementor of SPITBOL. Dewar lectures widely on programming languages, software methodologies, safety and security, and intellectual property rights. He has a doctorate in chemistry from the University of Chicago.

AdaCore
104 Fifth Ave
15th Fl
New York, NY 10011
Phone: (212) 620-7300 ext. 100
Fax: (212) 807-0162
E-mail: [email protected]

BACKTALK


I’ve been doing some networking lately. Not the kind of networking that gets one a new job, but computer networking. Since job-hunting networking is now done on a computer, some folks may not see the difference, so just take my word for it: networking isn’t necessarily networking. I’ve been doing the type of networking that allows computers to talk to each other. This is what some people would call network administration or network engineering.

There are many different types of people in the networking field, with various backgrounds. Some of them don’t have technical backgrounds, and technical fields have their own language, or vocabulary.

While working on an infrared project some years ago, I discovered that there existed three different definitions for infrared: one for engineers, defined in an Institute of Electrical and Electronics Engineers (IEEE) standard; one for physicists; and one for astronomers. The different definitions evolved separately, which is understandable from a historical perspective. Astronomers have been around far longer than engineers.

Now, in the networking literature, I find a particular word, the byte, that seems to be taking on different meanings. As an engineer, I learned that a byte is eight bits. Always. A nibble is four bits. A word is the size of the processor data line, or the number of data bits that the processor takes in on a clock cycle. Sometimes this is the size of the internal data bus, but not always. When I talk about a byte, I mean eight bits. When someone else talks about a byte, I think they mean eight bits. As the March Hare told Alice while she was visiting Wonderland, “Then you should say what you mean” [1]. We have to use words with denotations that are shared.

In several books that I have been reading on computer networking, the authors don’t seem to know that a byte is eight bits – always. Let’s review some computer science. A nibble is four bits, a byte is eight bits, and a computer word varies, depending on the computer architecture. Some early computers were six-bit machines, but most of us are familiar with the eight-bit, 16-bit, 32-bit, and now 64-bit machines. I think that some of these networking authors are confusing bytes with computer words. Could that be because of the different backgrounds making up the network engineering field? Maybe we will have different definitions of a byte: one from the engineering field, one from the computer scientists, and one from certified network engineers. Is it too late to stop this madness?

One book gives the correct definition for byte on page 83, then, on page 87, defines a byte as seven or eight bits, depending on whether parity is used [2]. This book, which is very good, has a glossary where a byte is correctly defined as eight bits. Perhaps, on page 87, the author is confusing bytes with the American Standard Code for Information Interchange (ASCII) character set. ASCII is a seven-bit code, or eight if parity is used. A byte is always eight bits.

So, what does dictionary.com say? To my horror, some of the definitions given online differ. But my Webster’s has it right [3]. It defines a byte as “a group of eight binary digits processed as a unit by a computer and used especially to represent an alphanumeric character.” And Webster’s definition for word, under 2c, has “a number of bytes processed as a unit and conveying a quantum of information in communication and computer work.” Webster’s does not have a technical definition of a nibble.

Again, to my horror, I found that the IEEE Standard Glossary of Software Engineering Terminology did not commit a byte to always be eight bits. The definitions given for byte and for word were very similar, and a definition for nibble was not given at all [4].

I think that because a byte is often used to represent an alphanumeric character, and the ASCII code is seven or eight bits, depending on whether parity is used, some confusion is creeping in here. ASCII is being replaced by Unicode, which can use as many as four bytes (or 32 bits). See <http://www.unicode.org>. Now, if four bytes are 32 bits, how large is a byte?

Some of these networking book authors don’t think that a byte is always eight bits, and they want to use the word octet to convey an eight-bit quantity [5]. I was just wondering if the definition of byte was being changed because of the differing backgrounds of people in the information technology workplace. Before reading these books, it never occurred to me that anyone thought that a byte was anything other than eight bits.

In Salinger’s book, The Catcher in the Rye, Holden Caulfield tells his little sister Phoebe what he would like to do with his life [6]. He tells of picturing children playing in a big field of rye next to a cliff. He wants to stand on the edge of the cliff and catch any kids who fall over. In my version of this fantasy, I see information technology professionals wandering about in a big field of rye, wondering if a byte has six, seven, or eight bits in it. I would be there to assure them that every byte has eight bits and only eight bits. It’s the size of a computer word that varies.

— Dennis
[email protected]

References

1. Carroll, Lewis. Alice in Wonderland. Grosset and Dunlap Publishers, 1980.

2. Lammle, Todd. CCNA: Cisco Certified Network Associate. Wiley Publishing, Inc., 2005.

3. Merriam-Webster’s Collegiate Dictionary, 10th ed. (2001).

4. IEEE. IEEE Std. 610.12-1990. IEEE Standard Glossary of Software Engineering Terminology.

5. Comer, Douglas E. Internetworking with TCP/IP: Principles, Protocols, and Architecture. Pearson Prentice Hall, 2006.

6. Salinger, J.D. The Catcher in the Rye. Little, Brown, and Company, 2001.

Bite My Bytes

