
Work Priority Scheme for EDP Audit and Computer Security Review

Zella G. Ruthberg

U.S. DEPARTMENT OF COMMERCE
National Bureau of Standards

Institute for Computer Sciences and Technology

Gaithersburg, MD 20899

and

Bonnie T. Fisher

Department of Health & Human Services

Office of Inspector General

Washington, DC 20201

March 1986

Issued August 1986


NBSIR 86-3386

WORK PRIORITY SCHEME FOR EDP AUDIT AND COMPUTER SECURITY REVIEW


Zella G. Ruthberg

U.S. DEPARTMENT OF COMMERCE
National Bureau of Standards

Institute for Computer Sciences and Technology

Gaithersburg, MD 20899

and

Bonnie T. Fisher

Department of Health & Human Services

Office of Inspector General

Washington, DC 20201

March 1986

Issued August 1986

U.S. DEPARTMENT OF COMMERCE, Malcolm Baldrige, Secretary

NATIONAL BUREAU OF STANDARDS, Ernest Ambler, Director

WORK PRIORITY SCHEME FOR EDP AUDIT AND

COMPUTER SECURITY REVIEW

Editors:

Zella G. Ruthberg
Institute for Computer Sciences and Technology

National Bureau of Standards

and

Bonnie T. Fisher
Department of Health & Human Services

Office of Inspector General

March 1986

TABLE OF CONTENTS

1. INTRODUCTION 1

1.1 The Work Priority Scheme in Perspective 1

1.2 Internal Controls and Security Safeguards .... 1

1.3 Brief Overview of the Scheme 1

2. THE NEED FOR THE SCHEME 3

2.1 Dependence on Computers 3

2.2 EDP Audits/Security Reviews - A Form of Control . 3

2.3 Formal Requirements for Audits and Reviews .... 3

2.4 Size of Review Task 4

3. BACKGROUND ON THE METHODOLOGY 4

3.1 The Invitational Workshop 4

3.2 Workshop Points of Agreement 5

4. A WORK PRIORITY SCHEME FOR THE EDP AUDITOR 6

4.1 Assumptions and Caveats 6

4.2 Audit Planning/Prioritization Process 6

4.3 Non-Discretionary Audits 6

4.4 Risk Evaluation Levels and Dimensions 9

4.5 Two Level Work Priority Dimensions/Characteristics 10
4.5.1 Level I: 10
4.5.2 Level II: 11

4.6 Risk Scoring — Application of the Work Priority Scheme 15
4.6.1 Implementation of the Scheme 15
4.6.2 A Simple Scoring Approach 15
4.6.3 A Detailed Scoring Approach 15

5. Discretionary Audits 16

6. USES OF THE WORK PRIORITY SCHEME 16

7. PROBLEMS WITH AND SOLUTIONS TO USE OF SCHEME 18
7.1 Potential Difficulties in Utilization 18
7.2 Methods for Overcoming Difficulties 19

8. Recommendations 20

References 21

APPENDIX A PCIE WORK GROUP ON EDP SYSTEMS REVIEW AND SECURITY A-1

APPENDIX B STRAW MAN PRIORITIZING SCHEME FOR USE BY AUDITORS IN EVALUATING AUDIT RISK IN AUTOMATED INFORMATION SYSTEMS B-1

APPENDIX C PCIE/NBS INVITATIONAL WORKSHOP DISCUSSION GROUPS MEMBERSHIP C-1

APPENDIX D TWO RISK SCORING METHODS D-1


FIGURES

Figure 1 AUDIT PLANNING/PRIORITIZATION PROCESS 7

Figure 2 AUDIT AREAS OF CONCERN 17
Figure B.1 RISK DIMENSION CHARACTERISTICS B-10
Figure D.1 SYSTEM RISK SCORING - SIMPLIFIED METHOD D-2
Figure D.2 PRACTICE TEMPLATE FOR RISK SCORING OF AN AIS D-8

TABLES

Table D.1 SYSTEM RISK SCORING - SIMPLIFIED METHOD EXAMPLE D-3
Table D.2 DIMENSION RISK SCORES AND SYSTEM RISK SCORES FOR AIS 1 D-9
Table D.3 DIMENSION RISK SCORES AND SYSTEM RISK SCORES FOR AIS 2 D-10


ABSTRACT

This report describes a high level risk analysis for Automated Information Systems (AISs) that can be used by computer security reviewers and EDP auditors to prioritize their non-discretionary and discretionary review activities for these AISs. It divides the risk analysis problem into five areas of risk concern (called dimensions), with each area defined by a set of characteristics. The five dimensions are: Criticality/Mission Impact, Size/Scale/Complexity, Environment/Stability, Reliability/Integrity, and Technology Integration. The report presents two possible two-level risk scoring schemes, each of which calculates the level of risk for each dimension, uses the Criticality score as a first order system risk score, and then combines all five dimension risk scores for a second order system risk score. One scoring method is simple and intuitive; the other scoring method is more detailed. An approach for deriving an EDP audit or computer security review plan using these scores is outlined.


KEYWORDS

audit/review plan; automated information system risk analysis; computer security review; Criticality/Mission Impact; discretionary audit/review; EDP audit; Environment/Stability; non-discretionary audit/review; Reliability/Integrity; risk score; Size/Scale/Complexity; Technology Integration


1. INTRODUCTION

1.1 The Work Priority Scheme in Perspective

This report describes a methodology for prioritizing the work to be performed by EDP Auditors and Computer Security Reviewers. It is based largely on the results of a Spring 1985 public/private sector workshop of EDP auditors and systems developers who explored the criteria for assessing risk in computer systems. The workshop was co-sponsored by NBS and the EDP Systems Review and Security Work Group of the President's Council on Integrity and Efficiency (PCIE). The Work Group was established in October of 1983 under the auspices of the PCIE Computer Security Project, chaired by Richard P. Kusserow, Inspector General (IG) of the Department of Health and Human Services (HHS). (See Appendix A for membership in and further description of the Work Group.) The methodology described in this report is to be included in the EDP system development audit guide currently being developed by this Work Group for joint publication by the PCIE and the National Bureau of Standards (NBS).

1.2 Internal Controls and Security Safeguards

Although it may at first appear strange to have the same methodology applicable to both EDP audit and computer security review, further analysis of these two activities reveals the similarity of their focus. EDP audit is concerned with the review of internal controls in an Automated Information System (AIS), while computer security review examines the security safeguards in an AIS. Security must be recognized as only one, albeit a major, category of internal controls. A study performed by Arthur Young for the Department of Energy [1] recognized that computer security controls are a subset of the internal controls to be found in an AIS. The major difference between these two sets of controls is that internal controls address efficiency and effectiveness in addition to security issues. The Office of Management and Budget, in OMB Circular A-130 (a re-write of OMB Circular A-71 TM1 (Computer Security)), acknowledges the interrelationship between internal control and security concerns in both its definition of key terms and its acceptance of internal control reviews and documentation in lieu of security reviews. OMB Circular A-123 also reflects this correlation.

1.3 Brief Overview of the Scheme

The Scheme described in this report enables its user to systematically perform a risk-based evaluation of the subjects for EDP audit/security review within an organization (i.e., the universe of its AISs), and to arrive at a risk measurement for each AIS. This final risk measure (or score) is based on an analysis of risk in key areas of concern (dimensions for describing risk) in that system. These scores enable the user to rank the systems by determining which AISs offer the highest levels of risk to the organization and which dimensions within each AIS contribute most to this high level of risk. Based on this analysis, the user can then draw up an EDP audit or security review work plan for the organization in question. The work plan would include annual coverage along with a basis for formulating the scope of specific AIS reviews. Considering the generality of the dimensions and their associated characteristics, the scheme is equally appropriate for public and private sector review subjects.

The scheme employs a two-level review and the characteristics associated with the five dimensions. The levels for the dimensions are:

Level I
    Criticality/Mission Impact

Level II
    Size/Scale/Complexity
    Environment/Stability
    Reliability/Integrity
    Technology Integration

Each dimension is defined by a related set of characteristics which are used to estimate or calculate the amount of risk posed by that dimension to the failure of the system. A Level I review looks at the Criticality/Mission Impact of the system to the organization and develops a risk score for each AIS with respect to this dimension. Since this dimension is the most important of the five risk areas, it can be used as a first approximation to a system risk score. The AISs can then be placed in sequence from high to low risk and the low risk systems eliminated from further review consideration. Organizations with very limited resources could stop at a Level I review and plan their work based on these results.

To refine the risk scores further, the high criticality risk AISs are reviewed at Level II. Risk scores are obtained for the four remaining dimensions for each high criticality risk AIS. These four dimension risk scores are summed and added to the Level I risk score to yield the system risk score for that AIS. The AISs reviewed at Level II can then again be placed in sequence from high to low risk and thus enable the reviewer to prioritize his work.

Two possible risk scoring methods are suggested and described briefly in Section 4 and in detail in Appendix D. The first is a simple intuitive approach based on a minimal collection of information on the AIS; the second is more elaborate and is based on more detailed information on the AIS. Organizations with limited resources could use the simple scoring method to obtain system risk scores, while those with more resources could use the more elaborate approach.


2. THE NEED FOR THE SCHEME

2.1 Dependence on Computers

As part of the Fiscal Year 1986 budget, the President highlighted systems management as one of the Federal Government's major initiatives for FY 1986 and beyond [2]. The Federal Government continues to develop 90 percent of its software, which constitutes the controlling mechanism for the approximately $14 billion spent annually on information technology. More than 120,000 federal employees are involved in programming and managing the resultant systems, which will ultimately control and distribute the almost $1 trillion in outlays projected for 1986. Obviously, the Federal commitment to the computer hardware, software, and management arenas has reached gigantic proportions, with no tapering off in sight of either the size or the growth rate.

2.2 EDP Audits/Security Reviews - A Form of Control

In the past ten years there has been a slowly growing recognition of the need for controls in the Federal Government's automated systems. Although there often is resistance among program sponsors or user management to employing internal controls within AISs because of the cost, time, and overhead that such controls can introduce, the interest in and use of controls in AISs is continuing to grow. This growth is augmented by the increasing emphasis OMB has placed on internal controls since the passage of PL 97-255, the Federal Managers' Financial Integrity Act of 1982 [3], and the completion and revision of its own Circular A-123. (See Section 2.3 for descriptions of these control requirements.) The General Accounting Office (GAO), at Congressional request, has closely followed the Federal agencies' implementation of A-123, and, thus far, has been dissatisfied with agencies' compliance, especially in the area of internal controls in AISs.

Internal audit organizations, whose activities existed long before the computer age, have long recognized and stressed the need for internal controls in manual (primarily financial) systems and the need for independent audits as a critical component of the oversight of an organization's systems. With the advent of computerized AISs in organizations, career fields specializing in EDP audit (generally found in audit organizations) and security review (often found in audit or management) have developed. Recognition and revision of their role in the review of automated systems is continuing, and increasing rapidly.

2.3 Formal Requirements for Audits and Reviews

The major legal requirements for EDP audits and security reviews within Federal agencies are found in three OMB circulars: A-130, A-123, and A-127. Circular A-130 (the follow-on to A-71 TM1) outlines specific requirements for establishing agency security programs, and specifies the use of (1) design reviews and system tests for security during development of applications (to be used for certification of the application), (2) periodic security audits or reviews of applications for recertification, and (3) periodic risk analyses of installations. OMB Circular A-123, issued in 1981 and revised in 1983, outlines for Federal agencies specific policies and standards for establishing and maintaining internal controls in their programs and administrative activities. This includes requirements for vulnerability assessments and internal control reviews. The main provisions of A-123 were made into law through the enactment of the Federal Managers' Financial Integrity Act of 1982. OMB Circular A-127, issued in 1984, outlines for Federal agencies specific policies and standards for establishing and maintaining internal controls in financial management systems. This includes requirements for annual reviews of agency financial systems which build on reviews required by OMB Circular A-123. In addition to these three key legal directives, internal audit and security are subject to departmental requirements, audit organization recommendations, and GAO audit standards for computer based systems [4].

2.4 Size of Review Task

A major implication of the enormous numbers of computers and our dependence on them, found today in government (see Section 2.1) as well as the private sector, is that the universe of AISs that need reviewing is also enormous. However, the number of trained EDP auditors and security reviewers to do this job has not kept pace with the size of this problem. A consistent methodology for obtaining a risk score for an AIS is seen as a major tool for culling through the review work that needs to be done and assigning relevant as well as realistic workloads to the review staff available within an organization.

3. BACKGROUND ON THE METHODOLOGY

3.1 The Invitational Workshop

The PCIE Work Group, in the course of its activities, decided that an essential component of its final product, the Guide to Auditing for Controls and Security Throughout the Systems Development Life Cycle, was a methodology for prioritizing the EDP auditor's work. Rather than rely exclusively on the experience and background of the Work Group members, it was decided to hold an invitational workshop on the subject and use the ideas generated during the course of the workshop to develop a work priority scheme.

The 2 1/2 day workshop was held in March of 1985. A "strawman" scheme (see Appendix B), used as a starting point for discussions, was provided by William Perry, based on a Harvard Business Review article [5] by F. Warren McFarlan on predicting the failure of systems under development. The 62 attendees included EDP auditors, senior ADP managers, and computer security specialists from both the Federal Government and the private sector. (See Appendix C for a list of attendees.) Presentations, to set the stage, were given on the first morning by attendees from Coopers & Lybrand, Touche Ross & Co., General Motors, International Security Technology Inc., and Management & Computer Services, Inc. The attendees were then divided into five discussion groups, each of which had 1 1/2 days to analyze the "strawman" and come up with their own version of a work priority scheme, based on the "strawman" framework of providing critical risk dimensions with associated characteristics. Each group presented its scheme at the closing session of the workshop.

3.2 Workshop Points of Agreement

Although each group came up with a somewhat different set of major audit/security concerns (dimensions) for the scheme, there was universal agreement on four underlying premises:

1. The entire EDP audit plan[1] must first give consideration to non-discretionary audits (mandated by law, regulation, and/or the agency/organization management). These are reflected in the front end qualifiers. Only if there are remaining resources for EDP audit would the scheme be used as originally intended.

2. The risk based prioritizing evaluation needs to be performed at two levels, Level I and Level II.

3. The first level of inquiry (for its Level I dimension) should concern itself with the criticality of the AIS to the agency/organization mission. Only critical systems should be reviewed further (for its Level II dimensions[2]) and given a more detailed risk score.

4. The ranking and rating of the risk characteristics of each dimension is program and agency/organization specific. Only the risk scoring method is applicable across the board.

1) It should be understood that the terms EDP audit and security review may be used interchangeably throughout the scheme and the surrounding discussion.

2) The four major concerns or dimensions to be addressed in a Level II review (presented in Section 4.4) are a synthesis of the conclusions drawn by the five workshop discussion groups. Two analysis approaches for risk measurement (or scoring) are discussed briefly in Section 4.6 and in detail in Appendix D.


4. A WORK PRIORITY SCHEME FOR THE EDP AUDITOR

4.1 Assumptions and Caveats

The use of the proposed work prioritizing scheme is based on certain ideal assumptions and caveats. These include:

o An inventory of all computer systems (AISs)—operational, under development, or undergoing major change—is maintained, to establish the audit universe.

o The above inventory may not be complete due to user development or system changes made outside the system development process.

o To use the priority scheme, certain minimal information is required, or the assessment of the system may not be valid.

o The full priority scheme would most easily be performed by EDP audit groups in order to enlist multiple perspectives, especially where resources are known to be a concern.

o Auditors in the organization must agree that risk can be evaluated by a standardized scheme.

o Users should always be consulted in the risk evaluation conducted by the auditor to ensure appropriate assumptions, and to assure maximum effectiveness.

o Auditor judgement is still needed!

Within this framework of assumptions and caveats the entire EDP audit work plan can then be developed. To the degree these assumptions differ from the reality of the organization's SDLC environment, the work planning methodology should be adjusted.

4.2 Audit Planning/Prioritization Process

The risk evaluation performed as part of the work priority scheme must be done within the context of the entire audit planning process. There are elements of the process that need to be considered prior to the risk evaluation (such as non-discretionary audit requirements) and other elements that require consideration afterwards (such as resource constraints). The following sections contain a suggested model for the entire prioritization process.

4.3 Non-Discretionary Audits

Figure 1. AUDIT PLANNING/PRIORITIZATION PROCESS: FRONT END QUALIFIERS -> RISK EVALUATION -> AUDIT IMPLEMENTATION -> BACK END QUALIFIERS

As can be seen from the model in Figure 1, the audit planning and prioritization process starts with front end qualifiers that must be considered by the auditor prior to making decisions with respect to which system(s) should be audited. These front end qualifiers consist of nondiscretionary factors which are beyond the auditor's control. These nondiscretionary factors include, but are not limited to, the following:

o External directives (e.g., laws, regulations, OMB circulars, and audit standards);

o Internal directives and priorities (e.g., contractual requirements; requirements, standards, and policies of audit and data processing organizations; upper management directives);

o Business/organizational environment unique to the organization (e.g., effect of economy on organization, budget of organization, and technology available to or used by organization);

o Organizational unique factors (e.g., presence and strength of quality assurance and security functions, management and control philosophy, structure, and policies);

o Geo-political environment (e.g., public concern and politics);

o Resource constraints/economic health (e.g., dollars, time, expertise, training, tools, and techniques);

o Known problems with the system, from current logs or previous evaluations and audits (e.g., nature and magnitude of problems);

o Evaluations and audits planned by management;

o Auditor's institutional knowledge of organization's universe of systems.

After all of the front end qualifiers have been considered, it may be that the entire audit plan is dictated by the nondiscretionary work. That is, external directives, internal directives, business environment, unique organization/responsibilities, and/or resource constraints may require that certain audits be performed, and these required audits may use up the limited audit resources available. In this case, the priority scheme may still be useful for determining audit approaches and where to focus efforts.

If, on the other hand, additional audit resources are available for discretionary audits, the risk evaluation of the work priority scheme can be used to identify and rank the systems in greatest need of audit coverage. Ultimately, back end qualifiers may need to be considered for the discretionary audits, as described in Section 5.


4.4 Risk Evaluation Levels and Dimensions

The work priority scheme expresses the risk concerns in terms of two levels and five dimensions. The risk concerns in Level I are reviewed first and those in Level II are reviewed second. Level I has one dimension and Level II has four dimensions. Each dimension is defined as a related set of characteristics which can estimate or measure the amount of risk posed by that dimension to a failure of the system. The chief concern of each dimension can be stated in the form of a question as follows:

1. What is the impact/criticality of the system to the organization?

A poorly developed or controlled system that is mission critical could jeopardize an organization's basic operational or programmatic effectiveness; therefore, a high-impact or critical system commands audit attention. The larger the impact, the more important it is to audit.

2. How complex is the system? (This includes size considerations.)

The more complex the system, the more difficult communication and control become, and consequently, the higher the risk of failure. The greater the chance for failure, the more important it is to audit the system.

3. How stable is the system internally (structure) and externally (environment)?

The less stable the system, the more difficult it is to develop procedures for communication and control, the greater the chance for failure, and the greater the need to audit.

4. How reliable is the system and the information it processes and generates (i.e., what is the chance of the system failing or the data being wrong)?

The answer to this question is obtained by looking at the controls in the system (integrity controls) and prior audit experience. The less reliable the system, the more chance for failure and the greater the need to audit.

5. How well is the technology integrated into the organization?

The more poorly the system technology is integrated with the skills of the staff and the standards and procedures of the organization, the more chance for failure and the greater the need to audit.

These questions serve as the basis for the five dimensions and their associated characteristics developed for the work prioritization scheme. Identified simply, the two levels and five dimensions are:

Level I
    1. Criticality/Mission Impact

Level II
    2. Size/Scale/Complexity
    3. Environment/Stability
    4. Reliability/Integrity
    5. Technology Integration

The five workshop discussion groups believed strongly that the overriding dimension of the five should be Criticality/Mission Impact. Systems that significantly impact the mission of an organization, or key organizational components, would easily take precedence over all other dimensions in allocating EDP audit resources. Because Criticality/Mission Impact was such an overriding dimension, the work priority scheme was developed as a two level scheme. Level I is composed of the dimension Criticality/Mission Impact, while Level II is composed of the remaining four dimensions: Size/Scale/Complexity, Environment/Stability, Reliability/Integrity, and Technology Integration.

The two level work priority scheme permits a high degree of flexibility depending on organizational need, since it can be applied in any degree of detail required. For example, the results of Level I ranking may be adequate to prioritize all audit work, based on available time and resources. If additional ranking characteristics are necessary, the more detailed Level II can be used to further prioritize audit work. A two level review additionally enables the auditor to purge from consideration those systems which will definitely not be reviewed, for any number of reasons. Environment and resource issues enter in here.

The two level work priority scheme follows in outline form, identifying the five dimensions and their related characteristics. [Note that the same characteristic may be used in more than one dimension because the question asked will be different.]

4.5 Two Level Work Priority Dimensions/Characteristics

4.5.1 Level I:

A. Mission Impact/Strategic Value/Organization (Business) Criticality and Sensitivity Factors

o criticality of system to organization mission

o criticality/sensitivity of system to well being, safety or interest of general public/clients/consumers

o criticality/sensitivity of data and information
    - competitive advantage
    - confidence of public in program/department
    - privacy/confidentiality/security issues

o materiality of resources controlled by system

o fraud potential

o life cycle costs of system (people and dollars)
    - development cost budget
        . people
        . dollars
            hardware
            software
            facilities
    - operating cost budget
        . people
            data processing/systems (including training)
            users (including training)
        . dollars
            hardware (CPU, peripherals, terminals, telecommunications, etc.)
                acquisition
                operation
            software
                acquisition
                maintenance
            supplies
            facilities
            configuration change control

o degree of dependence on AIS

o criticality of interfaces with other systems and external organizations

A Level I review, outlined above, provides a "first cut" at the total audit universe. This initial review will identify critical systems that require audit coverage. The additional dimensions to be reviewed in Level II should be used to rank these critical systems to find those most deserving of discretionary audit coverage.

4.5.2 Level II:

B. System[3] Size/Scale/Complexity

3) The term "system" is used in place of "project" to signify the entire AIS life cycle and the possibility of auditing at any point in the development process or operations.


o size of user area impacted

o number/complexity of interfaces/relationships with other projects or systems

o complexity of AIS technology (e.g., network size, communication needs, system configuration, degree of centralization, nature of transaction coupling mechanisms, nature of security)

o size/complexity of system
    - size of system budget
        . development costs
        . maintenance/operation costs
    - number/complexity of different inputs
    - number/complexity of unique files
    - number/complexity of unique outputs
    - number/complexity of logical files (views) system will access
    - number/complexity of major types of on-line inquiry
    - number of source documents maintained/retained
    - number/complexity of computer programs
    - complexity of programming language
    - complexity of system configuration
    - number of human elements interfacing
    - number of decision levels
    - number of functions by devices
    - number, types and complexity of transactions
    - number of external organizations impacted

o nature of interactions with external organizations

C. System Environment/Stability

o organizational breadth (interfaces, dependencies, system configuration)

o management involvement/commitment

o project management approach and structure
    - configuration management program
    - management efficiency and effectiveness

o specificity of, agreement on, and support for user requirements

o confidence in estimates — both cost and time — premising make-or-buy decisions, vendor selection, system testing/validation, etc.

o number of vendors/contractors involved


o newness of function/process to user

o problems associated with current system performance and/or system development effort

o existence/scope of data processing standards, policies and procedures, especially systems development life cycle methodology and documentation requirements

o availability of evidence - document and report preparation and maintenance for entire systems life cycle (e.g., test/validation/certification results, operations manual, system specifications, audit trails, exception reporting)

o quality and completeness of documentation

o general controls
    - physical access controls
    - environmental controls
    - communication controls
    - management controls environment
    - document controls
    - system change and test/validation/certification controls

o on-going concern issues/organization effect (will mission objectives be met in a timely manner?)
    - interruption tolerance
    - ability to maintain performance
    - unsatisfactory system performance (adverse consequences from degradation or failure)
    - unsatisfactory system development completion
    - unsatisfactory conversion

o labor relations (e.g., salary parity, hours, fringe benefits, etc.)

o project team (management and staff effectiveness and training)

o organizational and personnel changes (frequency, magnitude and number)

o functional requirements changes (frequency, number, and magnitude)

o technical changes (e.g., frequency, magnitude and number)

o factors affecting cost/economic/budget climate

o availability and adequacy of back-up and recovery procedures


D. Reliability/Integrity

o hazards/risks to information system (data, hardware, communications, facilities)

o general controls
    - environmental (e.g., physical access controls, natural hazards controls)
    - management

o applications controls

o availability and adequacy of audit trails

o quality and quantity of automated error detection/correction procedures

o availability and adequacy of back-up and recovery procedures

o completeness, currency and accuracy of documentation for audit

o prior reviews (e.g., A-123, A-127, and A-130 reviews; internal, CPA, and QA audits; IRM triennial reviews)

o auditor judgement (intuitively obvious)

E. Technology Integration

o make-up of project team in relation to technology used (number, training, and experience)

o applicability of the data processing design methodologies and standards to the technology in use

o pioneering aspects (newness of technology and/or technological approaches used in this information system for the application and organization)

o technical complexity of information system (interrelationships of tasks)

o user knowledge of DP technology

o margin for error (i.e., is there reasonable time to make adjustments, corrections or perform analyses before the transaction is completed?)

o utilization of equipment (tolerance for expansion)

o availability of automated error detection/correction procedures


o completeness, currency and accuracy of documentation for implementation/maintenance/operation (e.g., operations/maintenance manuals)

o amount of hardcopy evidence

4.6 Risk Scoring — Application of the Work Priority Scheme

4.6.1 Implementation of the Scheme

For the scheme to be of use to the EDP auditor, an analysis approach for risk scoring must be employed using the dimensions and characteristics. Two possible approaches for arriving at a system risk score are suggested here and described in Appendix D. The first scoring method is a simple intuitive approach based on a minimal collection of information on the AIS, while the second one is more elaborate and based on more detailed information on the AIS. User experience will undoubtedly lead to modifications and improvements in the application of the scheme and the risk scoring methods. If the EDP reviewer for some reason does not wish to use a scoring methodology, he/she could still keep the dimensions and their characteristics in mind when performing a less formal review.

4.6.2 A Simple Scoring Approach

The simple approach assigns a weight and a risk level to each dimension, based on a qualitative judgement with respect to the characteristics associated with each dimension. Criticality/Mission Impact is always assigned the highest weight. The product of the weight and risk level of a dimension is the risk score for that dimension. The Criticality/Mission Impact risk score is then the Level I system risk score. To obtain the Level II system risk score, the sum of the dimension risk scores over the four Level II dimensions is added to the Level I system risk score. (See Appendix D for details.)
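The arithmetic of this simple method can be sketched in Python as follows. The weights, risk levels, and numeric scales shown here are hypothetical placeholders chosen only for illustration; the authoritative scales and worked examples are those given in Appendix D.

    # Minimal sketch of the simple scoring approach (hypothetical weights
    # and risk levels; see Appendix D for the report's actual method).
    dimensions = {
        "Criticality/Mission Impact": {"weight": 10, "risk_level": 3},  # Level I
        "Size/Scale/Complexity":      {"weight": 5,  "risk_level": 2},  # Level II
        "Environment/Stability":      {"weight": 5,  "risk_level": 1},
        "Reliability/Integrity":      {"weight": 5,  "risk_level": 3},
        "Technology Integration":     {"weight": 5,  "risk_level": 2},
    }

    # Dimension risk score = weight x risk level.
    scores = {name: d["weight"] * d["risk_level"] for name, d in dimensions.items()}

    # Level I system risk score is the Criticality/Mission Impact score alone.
    level_i_score = scores["Criticality/Mission Impact"]

    # Level II system risk score adds the four remaining dimension scores.
    level_ii_score = level_i_score + sum(
        score for name, score in scores.items()
        if name != "Criticality/Mission Impact"
    )

    print(level_i_score, level_ii_score)  # 30 and 70 with the numbers above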

4.6.3 A Detailed Scoring Approach

The more detailed approach looks in depth at the characteristics associated with each dimension. Each dimension is defined by a set of characteristics which are used to calculate the amount of risk posed by that dimension to the failure of the system. Each characteristic is given a weight and a risk level. The product of these two numbers is the risk score of the characteristic, and the sum of the risk scores of the characteristics of a dimension yields the dimension risk score. Again, the Criticality/Mission Impact risk score is the Level I system risk score. And again, to obtain the Level II system risk score, the sum of the dimension risk scores over the four Level II dimensions is added to the Level I system risk score. (See Appendix D for details.)
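A minimal sketch of this per-characteristic arithmetic follows; the characteristic names, weights, and risk levels are hypothetical examples rather than the values prescribed in Appendix D.

    # Sketch of the detailed scoring approach: each characteristic carries
    # its own (weight, risk level) pair, and the dimension risk score is
    # the sum of the characteristic risk scores (weight x risk level).
    def dimension_risk_score(characteristics):
        return sum(weight * risk_level for weight, risk_level in characteristics)

    # Hypothetical Size/Scale/Complexity characteristics for one AIS.
    size_scale_complexity = [
        (3, 2),  # size of user area impacted
        (2, 3),  # number/complexity of interfaces with other systems
        (4, 1),  # size of system budget
    ]

    print(dimension_risk_score(size_scale_complexity))  # 3*2 + 2*3 + 4*1 = 16

    # As in the simple approach, the Criticality/Mission Impact dimension
    # score serves as the Level I system risk score, and the four Level II
    # dimension scores are added to it for the Level II system risk score.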


5. Discretionary Audits

After the systems have been identified and ranked using the risk based evaluation, several back end qualifiers must be considered by the auditor in determining how many discretionary audits can be added to the audit plan (see Figure 1). These back end qualifiers can be categorized in two areas:

- Audit Types and Objectives, and

- Audit Resource Constraints

Figure 2 identifies the different audit methodologies that can be used and the different audit objectives that can be accomplished in performing ADP audits. The auditor must consider the audit methodology to be performed and the audit objective to be accomplished in deciding on the number of additional (discretionary) audits that can be performed. Furthermore, these issues must be considered in light of the audit resource constraints (e.g., people, time, dollars, expertise) that exist. For example, to perform a system under development audit which looks at security, confidentiality, and privacy issues requires substantially more resources than an operational system audit which looks at only data reliability issues. Thus, the mix of audit methodologies to be performed and the existing audit resource constraints must be considered when deciding on the number of discretionary audits that can be added to the audit plan. After these back end qualifiers have been considered, the audit plan can then be finalized, and audits implemented.

6. USES OF THE WORK PRIORITY SCHEME

The risk scores developed during the risk based evaluation can be used for both developmental and operational systems. The major difference between risk based evaluations of these two classes of systems is that (1) the ranking of characteristics may change, and (2) some characteristics may not even be applicable to both. The following is a brief enumeration of some possible uses of the Work Priority Scheme (from the "strawman" scheme in Appendix B).

1) To determine relative risk between applications - A risk score of one application is compared to scores developed for other applications in the same department. Thus, risk scoring is used to determine relative risk among applications. The score is not used to determine an absolute measure of risk.

2) To create an audit risk profile - An audit risk profile is a pictorial representation of the various risk characteristics measured. While the audit risk score shows audit risk for the entire automated information system, the risk profile shows the relational risk among the various risk characteristics. The objective of the risk profile is to graphically illustrate what characteristics contribute to the total risk, and in what proportion.


Figure 2. AUDIT AREAS OF CONCERN* (a matrix relating audit methodologies to audit objectives)

* Decisions on audit methodology and objectives desired will influence:
- Weights given when ranking risk factors
- Audit scope, i.e., level of involvement (e.g., tasks, dollars, hours)

3) To modify the characteristics contributing to audit risk - Both the auditor and data processing management can use the audit risk scheme to identify those characteristics which may cause the information system to be less successful than proposed. For example, if the application project personnel do not understand the computer technology being used, the probability of success of the information system being developed diminishes. Once the characteristics that may cause the system to be less successful than desired are known, those characteristics can be altered such that the probability of the system being successful increases.

4) To help allocate audit resources - The information gathered during the audit risk analysis can be used as a basis for allocating audit resources to review application systems and/or review specific aspects of those systems. For example, high-risk information systems may receive extensive reviews, medium risk systems cursory reviews, and low risk systems no reviews. (A brief illustrative sketch of this kind of score-based allocation appears after this list.) For those systems reviewed, the area of review can be selected based on the high-risk characteristics. For example, if computer technology is a high-risk characteristic, the auditors may want to expend time reviewing how effectively the project team is using that technology.

5) To develop a data base of risk characteristics - The information gathered during this process should be saved and used for two purposes. The first use is to improve the audit risk prioritization scheme to make it more predictive of audit risk; the second use is to assist data processing management in structuring and planning projects such that those projects will have the highest probability of success.
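The resource-allocation use noted in item 4 can be illustrated with a short sketch. The score thresholds and system names below are hypothetical; an organization would set its own cut-offs after gaining experience with the scoring methods of Section 4.6 and Appendix D.

    # Hypothetical mapping from system risk scores to review depth.
    def review_depth(system_risk_score, high=60, medium=30):
        if system_risk_score >= high:
            return "extensive review"
        if system_risk_score >= medium:
            return "cursory review"
        return "no review"

    # Hypothetical universe of AISs with system risk scores from Section 4.6.
    universe = {"AIS 1": 70, "AIS 2": 45, "AIS 3": 20}

    # Rank from highest to lowest risk and report the suggested coverage.
    for ais, score in sorted(universe.items(), key=lambda item: item[1], reverse=True):
        print(ais, score, review_depth(score))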

7. PROBLEMS WITH AND SOLUTIONS TO USE OF SCHEME

Potential difficulties in using the work priority scheme, and methods for overcoming these difficulties, were discussed by the PCIE Work Group participants in order to facilitate the use of the scheme. These follow in outline form.

7.1 Potential Difficulties in Utilization

o Time and resources needed for sufficient data collection

o Inadequate organization data processing planning

o Need to establish an understanding of and agreement on related issues on a consistent basis by all affected parties (auditors/systems developers/users/etc.)

o Need to convince affected management (audit and operations) as to the credibility of the scheme and its impact on audit coverage, given a finite level of audit resources

o Initial time and resources needed to adapt the work priority scheme to the organization

o Represents a snapshot at a given point in time, which requires maintenance and updating to ensure its continued validity

o Need for audit planning to be separate from and sensitive to data processing and business cycle planning processes

o Requires integrated skill knowledge that includes relevant expertise in pertinent specialty areas

o Work priority scheme just another tool for audit management to consider in its decision-making process

o EDP audit resources still likely to be insufficient to provide coverage suggested by scheme

o Requires up-to-date and complete inventory of AISs—all those which are operational, developmental, and undergoing change

7.2 Methods for Overcoming Difficulties

o Make underlying questionnaire and data gathering methods as simple as possible to administer.

o Refine data collection methods through experience and learning curve.

o Educate users (including DP community) regarding needs for standards, planning, etc.

o Audit recommendations should emphasize necessary improvements to DP and business executives.

o Encourage early participation and collective editing to reach consensus on data collection instrument.

o Apply retroactively to existent systems to demonstrate the risks that audit coverage would have addressed.

o Emphasize that initial commitment would have long-term benefits, and that once established, maintenance would be considerably less costly.

o Analyze dynamics of the organization and the audit component within it to determine the frequency of "snapshot". Workload mix and control attributes may be affected accordingly.

o Use means for staying attuned to planning cycles.

o Consider supplementing EDP audit resources with financial and generalist auditors for areas not requiring specific technical expertise. They may even be more relevant for business and institutional knowledge.

o EDP audit resources may be supplemented with consultants for areas requiring highly skilled data processing specialists.

8. Recommendations

The workshop attendees came up with a number of recommendations for further activity in this crucial EDP audit area. A brief enumeration of these follows.

1) The work priority scheme described here should be tested within organizations by applying it to the EDP planning considerations of a prior year's workload universe. This might help ascertain how EDP audit resources may have been allocated differently and whether that allocation may have better assisted management in identifying and overcoming resultant control deficiencies in the systems.

2) Feedback should be captured on institutional knowledge of why and how systems have failed so that one could determine whether the draft scheme would have targeted EDP audit resources on the most vulnerable systems.

3) A prototype needs to be developed which would include a survey questionnaire, a weighting and scoring system, a testing process, a methodology for evaluating results and modifying the prototype, a method for the selection of testing sites, and a method of quantifying qualitative issues that would facilitate a comprehensive cost-benefit evaluation of the work priority scheme.


References

[1.] "ADP Internal Control Guide," U.S. Department of Energy,DoE/MA-0165 , August 1984.

[2.] Executive Office of the President, Office of Management andBudget, Management of the United States Government . Fiscal Year1986 . Washington, D.C.: U.S. Government Printing Office, 1985.

[3.] Federal Managers' Financial Integrity Act of 1982, PublicLaw 97-255, September 8, 1982.

[4.] "Standards for Audit of Governmental Organizations,Programs, Activities, and Functions," U.S. General AccountingOffice, 1981 Revision.

[5.] "Portfolio Approach to Information Systems," F. WarrenMcFarlan, Harvard Business Review, September - October 1981.


APPENDIX A

PCIE WORK GROUP ON EDP SYSTEMS REVIEW AND SECURITY

SUMMARY OF BACKGROUND AND CHARGE

President Reagan established the President's Council on Integrity and Efficiency (PCIE) in March 1981 to coordinate government-wide efforts to attack fraud and waste and help ensure system integrity in government programs and operations. Chaired by the Deputy Director of the Office of Management and Budget, the Council is composed of the Inspectors General (IGs), as well as representatives from the Federal Bureau of Investigation, the Department of Justice, and the Office of Personnel Management. Among its other functions, the PCIE is charged with developing interagency programs and projects to deal efficiently and effectively with those problems concerning fraud and waste which exceed the capability or jurisdiction of an individual agency.

In October 1983, the Council established a working group on Electronic Data Processing (EDP) Systems Review and Security under the leadership of the Inspector General of the Department of Health and Human Services, to be included under his ongoing Computer Security Project. Composed of IG and management representatives from 14 Federal Departments and Agencies, the group is charged with facilitating and improving Office of Inspector General/Audit organization reviews of automated information systems (AISs), particularly those systems under development. The objective of the PCIE Work Group is to improve the likelihood that auditable and properly controlled systems are being developed.

To achieve this objective, the PCIE Work Group participants drew from the Department of Defense life-cycle approach to the management of automated systems, and the National Bureau of Standards' Institute for Computer Sciences and Technology's (NBS/ICST's) Special Publications and Federal Information Processing Standards, to develop a system life cycle functional matrix for AISs. That matrix, structured around critical AIS documentation requirements, is intended to clarify the potential functions of the internal auditor vis-a-vis other key participants in the EDP planning, design, implementation and review processes. With the life cycle matrix as a conceptual framework, an audit guide is being developed to facilitate the successful fulfillment of that role, focusing on systems under development and major modifications to existing systems.


PCIE WORK GROUP MEMBERS

Bonnie Fisher (Project Leader) - Dep't of Health & Human Services, Office of Inspector General

John Bjork - Small Business Administration, Office of Inspector General

Jim Cox - Dep't of Health & Human Services, Office of Inspector General

David Decker - Dep't of Housing and Urban Development, Office of Inspector General

Bob Gignilliat - Dep't of Health & Human Services, Office of the Assistant Secretary for Management & Budget

Mark Gillen - Department of Treasury, IRS Internal Audit

Jim Hollohan - Smithsonian Institution, Audit Agency

Mike Houston - Department of Defense, Office of Inspector General

Doug Hunt - National Aeronautics & Space Administration, Office of Inspector General

Wally Keene - Dep't of Health & Human Services, Office of the Assistant Secretary for Management & Budget

John Lainhart - Department of Transportation, Office of Inspector General

Jack Landers - General Services Administration, Office of Information Resources Management

Bill Lee - Department of Commerce, Office of Inspector General

Mac MacDonald - Veterans' Administration, Office of Inspector General

Larry Martin - Department of Energy, Office of ADP Management


APPENDIX B

STRAW MAN PRIORITIZING SCHEME

FOR USE BY AUDITORS IN EVALUATING

AUDIT RISK IN AUTOMATED INFORMATION SYSTEMS

OBJECTIVE OF "STRAW MAN" PRIORITIZING SCHEME

The prioritizing scheme outlined in this paper is proposed as a "straw man" for use by auditors in evaluating the audit risk in automated information systems. An audit risk (sometimes referred to as an exposure) is the probable unfavorable effect associated with the occurrence(s) of an undesirable event. Audit risk needs to be evaluated for two purposes. The first is to determine the need for, and amount of, audit resources that should be assigned to an automated information system; and the second is to point the auditor toward those system characteristics most susceptible to vulnerabilities. The following straw man has been developed primarily for use as a starting point for discussion by the attendees of the NBS/PCIE Work Group Invitational Workshop on "Work Priority Scheme for the EDP Auditor."

BACKGROUND INFORMATION AND ANALYSIS OF EXISTING RISK/PRIORITIZING METHODOLOGIES

Auditors traditionally use audit risk assessment methodologies to allocate audit resources and identify areas for investigation. While various organizations approach audit risk assessment from different perspectives, their chronological approach to audit risk assessment has usually gone through the following four phases or approaches (note that audit groups currently perform risk/exposure assessment using all four approaches):

o Approach 1 - Audit judgment and instinct

This has been, and is still, the most prominently used method of audit risk assessment. Using this approach, the auditor calls upon his/her personal experiences, coupled with other learning experiences and knowledge of organization mission and external mandates, in order to project those experiences, learning and knowledge to the automated information system (AIS) under review. The auditor intellectually tries to associate the AIS under review with past experience and knowledge to determine comparable characteristics in order to estimate the magnitude of the audit risk/exposure and to select specific system characteristics for investigation. While this method can be effective, it is not a transferable skill, but, rather, one which must be learned over time and is unique to each practitioner.


o Approach 2 - Dollar risk estimation using the risk formula

Risk is defined as the probability for loss. That probability is expressed through the formula "frequency of occurrence times loss per occurrence equals annual loss expectancy." The "frequency of occurrence" refers to the frequency with which a particular vulnerability (flaw in the system) may combine with a possible threat (a man-made or natural exploitation of the vulnerability). The "loss per occurrence" is then the negative impact of a threat/vulnerability pair. Audit risk based on this formula can be quantified in dollars. This can, under certain circumstances, provide the advantage of projecting, with high precision, risk exposure in terms readily understandable by non-technicians. FIPS PUB 65 is based on this risk assessment method. The disadvantages of projecting risks in dollars are that the base numbers (i.e., frequency of occurrence and loss per occurrence) are difficult to obtain, and the approach may therefore imply a higher degree of precision than is realistic.
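A small worked example of the annual loss expectancy formula, using hypothetical numbers, may make the arithmetic concrete.

    # Hypothetical worked example of the dollar-risk formula:
    # frequency of occurrence x loss per occurrence = annual loss expectancy.
    frequency_per_year = 0.5         # one occurrence expected every two years
    loss_per_occurrence = 40000.0    # dollars lost when the threat/vulnerability pair is realized

    annual_loss_expectancy = frequency_per_year * loss_per_occurrence
    print(annual_loss_expectancy)    # 20000.0 dollars per year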

o Approach 3 - Identifying and weighting risk attributes

The attributes that cause risk/exposure to be realized have been at least partially identified. The relationship among these attributes can be specified through weighting. Using these attributes, an auditor can determine whether or not they are present in the automated information system under review, and through the accumulation of weighted scores rank automated application systems according to their relative audit risks. For example, this method can show that application A is of higher risk than application B. This method is most effective when the attributes are determined through statistical regression analysis.

o Approach 4 - Use of risk assessment software packages

Vendors have automated approaches two and three and made them commercially available. The first software package on determining dollar risk was marketed by Pansophic as PANRISK, and the first commercially available software package which used the attributes method to project risk was offered by Management and Computer Services as a package called ESTIMACS. The major advantages of the automated versions are the ease of use and the opportunity, with minimal effort, to play "what if" strategies by varying one or more of the risk characteristics.

The ideal audit risk/exposure assessment method has not yet been developed. No current approach can guarantee the completely correct prediction of audit risk. However, approaches 2, 3, and 4 represent transferable skills, and because they have been formalized, they can be evaluated and proved. One characteristic of a risk assessment method that appears to be extremely important is its ease of use. The more difficult the method is to use, the less likely it is that an auditor will use it. Lacking a convenient structured method, the auditor will revert to approach 1 and rely on instinct and judgment to make audit decisions.


Many internal audit and data processing functions have developed a prioritizing scheme to evaluate the audit risk of automated information systems within their own organization. There appears to be much similarity among the various approaches. F. Warren McFarlan has attempted to categorize the dimensions of risk that are common to many of these in-house developed prioritizing schemes.

The Three Dimensions of Risk

F. Warren McFarlan, in a September-October 1981 Harvard Business Review article entitled "Portfolio Approach to Information Systems," identified three important dimensions which contribute to the risk exposure inherent in a project:

1) Project size - The larger the project is in dollar expense, staffing levels, elapsed time, and number of departments affected, the greater the risk/exposure. Multimillion-dollar projects obviously carry more risk than $50,000 projects and also, in general, affect the company more if the risk is realized. A related concern is the size of the project relative to the normal size of a systems development group's projects. The implicit risk is usually lower on a $1 million project of a department whose average undertaking costs $2-$3 million than on a $250,000 project of a department that has never ventured a project costing more than $50,000.

2) Experience with technology - Because of the greater likelihood of unexpected technical problems, project risk increases as the familiarity of the project team and the IS organization with the hardware, operating systems, data base handler, and project application language decreases. A project that poses a slight risk for a leading-edge, large systems development group may have a very high risk for a smaller, less technically advanced group. Yet the latter groups can reduce risk through purchase of outside skills for an undertaking involving technology that is in general commercial use.

3) Project structure - In some projects, the very nature of the task defines the outputs completely, from the moment of conceptualization. The outputs of these projects are fixed and not subject to change during the life of the project. Such projects are classified as highly structured. They carry much less risk than those whose outputs are more subject to the manager's judgment and hence are vulnerable to change.

An analysis of the attributes method of risk assessment appears to emphasize these three dimensions. Thus, while it is possible to divide audit risk/exposure into different dimensions, practice appears to support that there is consensus among those working with audit risk/exposure that these are important dimensions. Therefore, the straw man audit prioritizing scheme proposed for this invitational workshop will be constructed around these three dimensions.

NEED FOR AND USE OF AUDIT RISK PRIORITIZING SCHEME

Warren McFarlan, in his "Portfolio Approach to InformationSystems" article, states that:

"The typical project feasibility study covers exhaustivelysuch topics as financial benefits, qualitative benefits,implementation costs, target milestones and completiondates, and necessary staffing levels. In precise, crispterms, the developers of these estimates provide voluminoussupporting documentation. Only rarely, however, do theydeal frankly with the risk of slippage in time, costoverrun, technical shortfall, or outright failure. Rather,they deny the existence of such possibilities by ignoringthem. They assume the appropriate human skills, controls,and so on, that will ensure success."

McFarlan and others have proposed that through proper analysis the auditor should be able to predict the probability of unfavorable consequences such as:

o Failure to obtain all, or even any, of the anticipated benefits

o Cost and/or schedule overruns

o Inadequate system of internal control

o Technical performance of the resulting system that turns out to be significantly below estimate

o Incompatibility of the system with the selected hardware and software

The internal auditor has only limited resources to perform his mission. Good audit practices dictate that those resources be assigned to activities that offer the greatest payback to the organization. In 1977, The Institute of Internal Auditors issued the report from the research project entitled Systems Auditability and Control. A major conclusion from this project was that the most fruitful use of internal audit time would be participating in the automated information system development process. In addition, the U.S. General Accounting Office issued a standard, which in the 1981 revision was changed to a guideline, regarding auditor participation in system development. The general feeling of the PCIE Work Group, however, is that auditor participation during the System Development Life Cycle (SDLC) is vital to assuring the development of secure and auditable systems. The challenge has been first, what systems should the auditor participate in, and second, if they participate, where should they spend their review time?

The audit-risk-based prioritization scheme is developed to answer these challenges. It provides a basis for determining what systems should be reviewed, and for those systems reviewed, indicates the characteristics on which audit effort should be expended.

AUDIT RISK PRIORITIZATION SCHEME

An effective audit risk prioritizing scheme has the following four parts:

1) Identification of risk dimensions - Previously defined as project size, experience with technology, and project structure.

2) Identification of risk characteristics - The attributes of an automated information system which permit the auditor to project the performance of an operational information system.

3) Analysis of the audit risk characteristics - Determines the applicability and importance of the characteristic in predicting the operational performance of the automated information system.

4) Use of the audit risk assessment - The objective of the risk prioritization scheme is to assist the internal auditor in using limited resources more effectively. Usage involves the interpretation and application of the risk assessment resulting from the utilization of the first three parts of the audit risk prioritizing scheme.

Part 1 - Identification of the Risk Dimensions

The importance of having risk dimensions is to categorize audit risk by the determinant for that risk. This is important because the audit risk characteristics within a dimension or determinant are more closely related than the characteristics between dimensions. This concept can be helpful in both managing audit risk from the data processing perspective, and selecting specific characteristics to investigate from an audit perspective.

Part 2 - Identification of Audit Risk Characteristics

Risk characteristics are attributes of automated information systems which correlate to the operational behavior of the automated information system. The presence or absence of these system attributes can be used to predict behavior. An analogy would be predicting the probability of a heart attack by using an individual's heart attack risk characteristics such as blood pressure, weight, family health history, and number of cigarettes smoked. The presence or absence of these characteristics can be used to predict whether a specific person might have a heart attack (i.e., a specific human behavior). While the word risk is used, it is not meant to imply that the undesirable event will occur but, rather, that the probability of some type of behavior (e.g., a heart attack) can be predicted. A prioritizing scheme will tell how probable the heart attack is.

The proposed "straw man" characteristics recommended for evaluating each of the three audit risk dimensions are presented in Figure B.1. This straw man risk model is designed to identify and explain the characteristics associated with the three audit risk dimensions. These are the characteristics that most commonly appear in audit risk models currently used by auditors, and are believed to be those which can help auditors predict the operational performance of information systems for audit purposes. Figure B.1 has been placed at the end of this paper so that the attendees may detach it for use in their discussions.

Part 3 - Analysis of the Audit Risk Characteristics

This part of the audit risk prioritizing scheme is designed to measure the degree of audit risk associated with each individual characteristic. The objective of this measurement is twofold: first, to determine the degree of importance of each characteristic in representing the magnitude of audit risk/exposure (i.e., weighting of characteristics among the population of characteristics); and second, to determine the applicability of that characteristic to the specific automated information system being assessed (to determine whether the characteristic is present or absent in a manner that could cause an unfavorable event to occur; for example, if an individual were overweight it would be indicative of a possible undesirable event such as a heart attack).

There are five approaches used to measure the applicability of a characteristic to predict a favorable or unfavorable result. These are:

1) Relational considerations - This asks the auditor to divide the application systems into three risk categories, e.g., high, medium, and low, and then determine into which category the system being assessed falls. For example, an NBS study[1] has shown that the larger the size of a computer program, the more difficult it is to implement. From a relational perspective, the auditor decides whether the size of the program for the system being assessed will fall within the largest third sizewise for the department, the middle third sizewise, or the lowest third sizewise. The largest third would be considered to have the highest risk.

     [1] NBS Special Publication 500-99, "Structured Testing: A Software Testing Methodology Using the Cyclomatic Complexity Metric," December 1982.

2) Factors relating to risk - This approach attempts to relate specific factors to the expected outcome. The auditor need only determine which factors are applicable to the system under assessment to determine the degree of audit risk. For example, in assessing a data validation characteristic, a factor relating to low risk would be extensive data validation including range, alphabetic, check digit, and relationship tests; while high-risk systems might be those that only use alphanumeric tests.

3) Dollar risk - Using the annual loss expectancy formula (i.e., frequency of occurrence times loss per occurrence), the dollar value associated with each characteristic can be used to determine the magnitude of risk for that characteristic.

4) Audit analysis - This method requires the auditor to conduct sufficient study to determine the potential vulnerability. The most common approach to doing this is an assessment of the system of internal controls in order to identify vulnerabilities associated with the characteristic in question. Again, if the characteristic were data validation, the audit review could determine the effectiveness of the data validation controls in reducing the specific audit risks reducible by data validation controls.

5) Statistical regression analysis - Over a period of time the audit group can record system characteristics and actual operational behavior. Feeding that information into a statistical regression analysis, the auditor can determine the specific correlation between the various attributes of the characteristics and the actual operational behavior of the information system. While this is the statistically proper approach, it can also be very time-consuming and costly.

Experiences of audit risk model users indicate that the two most popular approaches are relational risk (i.e., high, medium, and low) and risk factors to determine the applicability of the characteristics to the system under review.
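To make two of these approaches concrete, here is a minimal sketch in Python. The dollar figures, frequencies, and program sizes are invented for illustration and are not drawn from the workshop material; the functions simply show the annual loss expectancy arithmetic of approach 3 and the high/medium/low thirds of approach 1.

    # Illustrative sketch only: all numbers below are hypothetical.

    def annual_loss_expectancy(frequency_per_year, loss_per_occurrence):
        # Approach 3 (dollar risk): frequency of occurrence times loss per occurrence.
        return frequency_per_year * loss_per_occurrence

    def relational_category(value, population):
        # Approach 1 (relational): place the system in the top, middle, or bottom
        # third of the population for some measure, e.g., program size.
        ranked = sorted(population)
        n = len(ranked)
        low_cut, high_cut = ranked[n // 3], ranked[(2 * n) // 3]
        if value >= high_cut:
            return "high"      # largest third carries the highest risk
        if value >= low_cut:
            return "medium"
        return "low"

    # A hypothetical data-entry error occurring 12 times a year at $2,500 per occurrence:
    print(annual_loss_expectancy(12, 2500.00))              # 30000.0
    # A 45,000-line program compared with the department's other programs:
    sizes = [5000, 12000, 20000, 35000, 60000, 90000]
    print(relational_category(45000, sizes))                 # medium

Either result would then feed the scoring and resource-allocation usages described in Part 4 below.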

Part 4 - Use of the Audit Risk Assessment

The audit risk prioritizing scheme can be used by both data processing and audit personnel. Data processing personnel can use the risk scheme to identify the attributes that may cause the system to be unsuccessful and manage those risks by changing developmental approaches.

The performance of the first three parts of the audit risk scheme will result in the identification of the audit risk characteristics applicable to the automated information system under review, and some indication of the magnitude or degree of applicability. This audit risk information can then be used by internal auditors in any or all of the following manners:

1) Audit risk score - This usage allocates points in accordance with the magnitude of risk associated with each characteristic. The most common scoring method is to divide the risk characteristic into specific subcategories, as was illustrated earlier in the data validation example. If the application being assessed falls into the high-risk category, it would be assigned three points; medium risk, two points; and low risk, one point. If a more sophisticated scoring method is wanted, the individual characteristics can be weighted. For example, one characteristic can be considered to be twice as important as another, and thus is multiplied by the weight 2 to give an individual characteristic risk score. The resulting risk score is normally compared to other scores developed for the same department. Thus, risk scoring is normally used to determine relative risk between applications; the score is not an absolute measure of risk, such as the temperature of the human body, which has an absolute meaning. (A small sketch illustrating this scoring, and the risk profile described next, follows this list.)

2) Create an audit risk profile - An audit risk profile is a pictorial representation of the various risk characteristics measured. While the audit risk score shows audit risk for the entire automated information system, the risk profile shows the relational risk among the various risk characteristics. The objective of the risk profile is to graphically illustrate what characteristics contribute to the total audit risk, and in what proportion.

3) Modification of the characteristics contributing to audit risk - Both the auditor and the systems analyst can use the audit risk scheme to identify those characteristics which may cause the information system to be less successful than proposed. For example, if the application project personnel do not understand the computer technology being used, the probability of success of the information system being developed diminishes. Once the characteristics that may cause the system to be less successful than desired are known, those characteristics can be altered such that the probability of the system being successful increases. In our example where the project personnel do not understand the proposed technology, a technology which the project group does know can be substituted and the probability of success will increase.

4) Allocation of audit resources - The information gathered during the audit risk analysis can be used as a basis for allocating audit resources to review application systems and/or review specific aspects of those systems. For example, high-risk information systems may receive extensive reviews; medium risk, cursory reviews; and low risk, no reviews. For those systems reviewed, the area of review can be selected based on the high-risk characteristics. For example, if computer technology is a high-risk characteristic, the auditors may want to expend time reviewing how effectively the project team is using that technology.

5) Data base of risk characteristics - The information gathered during this process should be saved and used for two purposes. The first use is to improve the audit risk prioritization scheme to make it more predictive of audit risk; the second use is to assist data processing management in structuring and planning projects such that those projects will have the highest probability of success.
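As a concrete (and purely hypothetical) illustration of the first two usages, the sketch below assigns three, two, or one points per characteristic, applies illustrative weights, and then expresses each characteristic's share of the total score. The characteristic names, weights, and categories are invented, not prescribed by the scheme.

    # Hypothetical example: characteristic names, weights, and categories are invented.
    CATEGORY_POINTS = {"high": 3, "medium": 2, "low": 1}

    # (characteristic, weight, risk category) for one application under assessment.
    application = [
        ("data validation",     2, "high"),    # weighted twice as heavily as the others
        ("project team skills", 1, "medium"),
        ("transaction volume",  1, "low"),
    ]

    def risk_score(characteristics):
        # Usage 1: weighted sum of category points for one application.
        return sum(weight * CATEGORY_POINTS[category] for _, weight, category in characteristics)

    def risk_profile(characteristics):
        # Usage 2: each characteristic's share of the total score; a risk profile
        # would present these proportions graphically.
        total = risk_score(characteristics)
        return {name: weight * CATEGORY_POINTS[category] / total
                for name, weight, category in characteristics}

    print(risk_score(application))    # 9 (= 6 + 2 + 1)
    print(risk_profile(application))  # data validation contributes 6/9 of the total risk

As noted above, the score of 9 has no absolute meaning; it becomes useful only when compared with scores computed the same way for other applications in the same department.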

REFERENCES USED TO BUILD THE "STRAW MAN" AUDIT RISK PRIORITIZING SCHEME

The major references used in creating this "straw man" prioritizing scheme were:

1) OMB Circular A-123, "Internal Control Systems," August 16, 1983.

2) U.S. General Accounting Office document on internal control, "Evaluating Internal Controls in Computer-Based Systems - Audit Guide," June 1981.

3) Computer Control and Audit by William Mair, Keagle Davis, and Donald Wood, published by The Institute of Internal Auditors, 1977.

4) ESTIMACS software package, marketed by Management and Computer Services, Valley Forge, PA.

5) PANRISK audit software package and manual, originally marketed by Pansophic Systems, Oak Brook, IL, and now called IST/RAMP and marketed by International Security Technology, Inc. of New York City, its originator.

6) "Portfolio Approach to Information Systems" by F. Warren McFarlan, Harvard Business Review, September-October 1981.

7) FIPS PUB 65, "Guideline for Automatic Data Processing Risk Analysis," August 1, 1979.

8) U.S. General Accounting Office, "Standards for Internal Controls in the Federal Government," 1983.


FIGURE B.1

RISK DIMENSION CHARACTERISTICS

CHARACTERISTICS FOR THE RISK DIMENSION — PROJECT SIZE

1. Size of user area - Number of employees, size of user budget, number of user functions.

2. Data processing breadth - Size of project as expressed in number of project staff, size of project budget, or number of man-months to produce.

3. Size of information system - Expressed in number of programs, size of programs, number of transactions.

4. Expected frequency of change - The number and/or size of changes that will be made to the initial needs statement.

5. Number of unique logical business inputs that the system will process - Expressed in number of business transactions processed in the course of a day.

6. Number of unique logical business outputs generated by the system - Number of business transactions or reports or messages produced per day by the system.

7. Number of logical files (views) that the system will access - The number of individual files or data base subschemas that will be accessed by the system during the totality of system processing.

8. Number of major types of on-line inquiry expected - The number of requests that will be made by users other than the normal business outputs generated by the information system.

9. Telecommunications - The use of communication facilities in conjunction with automated information systems. Risk associated with the number of terminals, amount of hard-copy documents produced, and the sophistication of processing.

CHARACTERISTICS FOR THE RISK DIMENSION — EXPERIENCE WITHTECHNOLOGY

1. Makeup of project team in relationship to technology used - The inclusion on the project team of the necessary skills to effectively utilize the information system technology, e.g., the inclusion of data base personnel for data base-related projects.

2. Applicability of the data processing design methodologies and standards to the technology in use - The adaptability of the existing data processing methodologies and standards to the technology being used. For example, if the information system is being developed under prototyping, are the design methodologies and standards applicable to prototyping?

3. Margin of error - The amount of time between the entry of a business transaction and the response to that transaction. For example, is there reasonable time to make adjustments, corrections, or perform analyses before the transaction is completed?

4. Technical complexity of the information system - The number of tasks and the interrelationships between those tasks that must be accomplished to satisfy the user needs.

5. Adaptability to change - The ease with which it is expected that changes to the information system requirements can be incorporated into the information system. This will be dependent upon the architecture of the system and its adaptability to the user information needs.

6. Utilization of equipment - How much the information system will push the equipment to its capacity to meet user needs. For example, if a two-second response is needed, and given the complexity of the tasks and the volume of work, what is the amount of tolerance within the system capacity to meet those processing needs?

7. Personnel - Skill level, number, and knowledge of user processing of the project team members, including any supporting technical staff(s).

8. Documentation - Amount, currentness, type, and usability of the documents supporting the automated information system.

9. Pioneering aspects - The newness of the technology and/or technological approaches used in this application system. The newness can be either within the organization (i.e., the first time any project has used this technology, such as data base technology) or in the technology as offered by the vendors.

10. How knowledgeable is the user in data processing technology - Determines whether the user personnel can understand the implications of the use of data processing technology, and their ability to define and discuss requirements in relationship to the technology's impact.

11. Data processing knowledge of user tasks - The ability of data processing personnel to challenge the accuracy and need of user requirements in relationship to the mission and tasks performed by the user.

12. Degree of complexity of processing logic - Measures whether the logic needed to perform the user requirements will be simple, average, or complex.

13. Need for automated error detection and correction procedures - Measures the complexity of the procedures that need to be incorporated into the information system to detect inaccurate or incomplete input transactions and make automatic corrections to those errors.

CHARACTERISTICS FOR THE RISK DIMENSION — PROJECT STRUCTURE

1. Organizational breadth - The number of diverse organizational units involved in the application system and/or the number of user organizations that must sign off on the requirements definition.

2. Political implications of implementing the information system - The level of agreement among all units in the organization as to the need for the system and the approach being used to accomplish the system objectives.

3. Specificity of user requirements - The level of detail in which the requirements are specified. Measures the amount of additional detail and/or decisions that need to be made before programs can be coded.

4. Problems associated with current system performance - Measures the number of problems that are occurring in the current system as implemented. The thesis is that performance problems in current systems may not be correctable by a new system.

5. Availability of backup hard-copy documents - The number of original source documents and hard-copy formats that will be produced and retained during system processing.

6. Level of user management agreement on system objectives - The agreement within the user(s) department on the stated objectives for the system.

7. Percentage of the proposed information system that is already performed by the user - Measures the newness of the information system tasks to the user area. Differentiates between existing tasks being automated, and new tasks (new meaning a new method for processing).

8. Importance/criticality of the business system to the user - Measures the importance of this specific information system to the user as it relates to the user completing the mission of the user function.

9. Project management approach and structure - The organization of the project in relationship to the size of the project and the technology being utilized. Includes such considerations as division of duties within the project, relationship between the user and data processing personnel, as well as the management and status reporting structures.

APPENDIX C

PCIE/NBS INVITATIONAL WORKSHOP

CO-CHAIRPERSONS: Bonnie T. Fisher & Zella G. Ruthberg

DISCUSSION GROUPS MEMBERSHIP

GROUP A

John Lainhart (Group Leader)
Department of Transportation, Office of Inspector General
Director, Office of ADP Audits and Technical Support

Robert L. Gignilliat (Recorder)
Department of Health and Human Services
Senior Systems Security Officer

Nander Brown
Federal Home Loan Mortgage Corporation
Assistant General Auditor

Peter S. Browne
Profile Analysis Corporation
President

James E. Haines
Boeing Computer Services Co.
Director, Quality Assurance

Kenneth Jannsen
Blue Cross/Blue Shield of Illinois
Director, Internal Audits

Jarlath O'Neill-Dunne
Coopers and Lybrand (New York, NY)
Partner

Tyrone Taylor
National Aeronautics and Space Administration, Space Station
Management Analyst

John Van Borssum
Security Pacific National Bank
Vice President, EDP Auditor

J. Armand Villemaire
Department of Defense, Air Force Audit Agency
Staff Auditor

Patricia D. Williams
Department of Treasury, Internal Revenue Service
Head of Security

GROUP B

Barry R. Snyder (Group Leader)
General Accounting Office, IMTEC
Group Director, Technical Services

Mark J. Gillen (Recorder)
Department of Treasury, Internal Revenue Service
Internal Audit Manager

Robert P. Abbott
EDP Audit Controls, Inc.
President

Lorretta Ansbro
Federal Reserve Bank of New York
Audit Official

Stephen F. Barnett
Department of Defense, Computer Security Center
Chief, Office of Application System Evaluation

Larry Bergman
Boeing Computer Services Co.
EDP Audit Manager

Robert Berndt
Bank of America (San Francisco)
Vice President, EDP Audit Manager

Keagle Davis
Touche Ross & Co. (Jacksonville)
Partner

Michael Goldfine
General Motors Corporation
Assistant Director, Audit Staff

Ralph E. Gooch
Department of Treasury, Financial Management Services
Chief of Security Branch

Michael G. Houston
Department of Defense, Office of Inspector General
Program Director, Audit Policy and Oversight

Jack Wheeler
General Accounting Office, IMTEC
Special Assistant, Technical Services

GROUP C

Wallace O. Keene (Group Leader)
Department of Health & Human Services
Acting Deputy Assistant Secretary for Management Analysis and Systems

Allen Winokur (Recorder)
Navy Audit Service
EDP Auditor

David L. Decker
Department of Housing and Urban Development, Office of Inspector General
Director, EDP Audit

Frederick Gallegos
General Accounting Office (Los Angeles)
Manager, Management Services Group

Carole A. Langelier
DeLoitte, Haskins and Sells (Washington, D.C.)
Partner

Joseph T. McDermott
Department of Defense, Office of Inspector General/AUDIT
Program Manager

Gerald Meyers
EDP Audit Consultants
Managing Partner

Carl A. Pabst
Touche Ross & Company (Los Angeles)
Partner, Director of EDP Audit

Frederick G. Tompkins
ORI, Incorporated
Senior Principal Scientist

Hart J. Will, Ph.D.
University of Victoria, B.C.
Professor of Public Administration

GROUP D

Larry Martin (Group Leader)
Department of Energy
Manager, Computer Security Program

Gail L. Shelton (Recorder)
Department of Health & Human Services, Office of Inspector General
Program Analyst

James Cox
Department of Health & Human Services, Office of Inspector General
EDP Auditor

Tim Grance, 2nd Lt.
U.S. Air Force, Computer Security Program Office
Computer Security Staff Officer

Michael J. Henitz
Peat Marwick Mitchell & Co., Computer Audit Office
Partner

William M. Hufford
Sun Banks, Inc.
Vice President, EDP Audit Manager; EDP Auditors Association Regional President

Stanley Jarocki
Bankers Trust of New York
Vice President, Group Manager

William C. Mair
Touche Ross & Co. (Detroit)
Partner

Thomas Nugent
Department of Navy, NARDAC
Computer Specialist

Kenneth A. Pollock
EDP Auditors Foundation
Director of Research

F. A. Schlegel
Management and Computer Services, Inc.
President

D. L. Von Kleeck
Management and Computer Services, Inc.
General Manager

H. C. Warner
Florida Power
Director, Internal Audits

GROUP E

Douglas B. Hunt (Group Leader)
National Aeronautics and Space Administration, Office of Inspector General
Director, Technical Services

William C. Lee (Recorder)
Department of Commerce, Office of Inspector General, Office of Automated Information Systems
Computer Specialist

Philip Carollo
Sears, Roebuck and Company
Director, EDP Audits

Don Coiner
Basic Data Systems, Inc.
President

Robert V. Jacobson
International Security Technology, Inc.
President

Thomas Lux
Touche Ross & Company (Chicago)
Audit Supervisor

Jim Manara
Security Pacific National Bank, Quality Assurance Division
Vice President

Brian McAndrew
U.S. Navy, Navy Audit Service
Assistant Director, Audit Policy

Brian Morse
Coopers & Lybrand (Washington, D.C.)
Partner

Benson J. Simon
Environmental Protection Agency
Program Analyst

Jane Tebbutt
Department of Health and Human Services, Office of Inspector General
Director, Interagency Projects Division

APPENDIX D

TWO RISK SCORING METHODS

D.1 A Simple Scoring Approach

D.1.1 The Scoring Method

This method risk scores each system by using Figure D.1 to calculate the scores as described below.

Step 1 - Assign Importance Weights. A weight, reflecting the importance of the dimension to the system under review, is assigned to each of the five dimensions shown in Figure D.1. This weight will in turn reflect the importance of the dimension's characteristics to the system under review. One of the two suggested weighting schemes[1] shown in Figure D.1 can be used, although specific situations may require modification of these. The weights in set 1 add up to an arbitrary number while those in set 2 add up to 100. Set 2 allows for easy conversion of the weights to percentages.

     [1] The suggested weights were derived from data collected from representatives attending the PCIE Workshop.

Step 2 - Assign Risk Level. For each dimension assign a risk level from 1-5 which reflects the degree of risk for that dimension. Suggested risk level values are:

     5 = High Risk
     3 = Medium Risk
     1 = Low Risk

For example, a system with demonstrated reliability would pose a low risk and warrant a low risk level value (= 1).

Step 3 - Calculate Dimension Risk Score. The dimension risk score is its weight times its risk level.

Step 4 - Calculate System Risk Score. For a Level I type system risk score, use the risk score for the Criticality/Mission Impact dimension. The Level II system risk score is the sum of the five dimensions' risk scores.

Step 5 - Rank System Scores. Perform Steps 2, 3, and 4 for each system under consideration and rank the systems numerically from high to low. The highest scoring systems pose the highest risk and therefore deserve more audit/review attention.
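The five steps can be summarized in a few lines of code. The sketch below is only one possible reading of the method; the weights and risk levels shown are placeholders, to be replaced by one of the weight sets from Figure D.1 and by risk levels based on audit knowledge.

    # Sketch of the simplified scoring method (Steps 1-4).  The weight and risk
    # level values below are placeholders, not values prescribed by this report.

    DIMENSIONS = ["Criticality/Mission Impact", "Size/Scale/Complexity",
                  "Environment/Stability", "Reliability/Integrity",
                  "Technology Integration"]

    def system_risk_scores(weights, risk_levels):
        # Step 3: dimension risk score = importance weight x risk level (5/3/1).
        dimension_scores = {d: weights[d] * risk_levels[d] for d in DIMENSIONS}
        # Step 4: the Level I score uses only Criticality/Mission Impact;
        #         the Level II score is the sum over all five dimensions.
        level_i = dimension_scores["Criticality/Mission Impact"]
        level_ii = sum(dimension_scores.values())
        return dimension_scores, level_i, level_ii

    # Placeholder inputs for a single system (Steps 1 and 2):
    weights = {d: 10 for d in DIMENSIONS}
    weights["Criticality/Mission Impact"] = 25     # assumed weighting, for illustration
    risk_levels = {d: 3 for d in DIMENSIONS}
    risk_levels["Technology Integration"] = 5      # assumed high risk

    scores, level_i, level_ii = system_risk_scores(weights, risk_levels)
    print(level_i, level_ii)                       # 75 215 for these placeholder inputs

Step 5 would simply repeat this calculation for every system under consideration and sort the resulting scores from high to low.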

D.1.2 Example of a Scored System

Table D.1 is an example of a calculated risk score for one system.


Figure D.1  SYSTEM RISK SCORING - SIMPLIFIED METHOD

[Scoring worksheet: the five dimensions (Criticality/Mission Impact, Size/Scale/Complexity, Environment/Stability, Reliability/Integrity, Technology Integration) are listed against the two suggested weight sets (set 1 totaling an arbitrary number, set 2 totaling 100), with columns for the assigned risk level and the resulting risk score, and a system score at the bottom.]

Table D.1  SYSTEM RISK SCORING - SIMPLIFIED METHOD EXAMPLE

[Worked example of the simplified method for a single system: each of the five dimensions is assigned a weight and a risk level, the weight times the risk level gives the dimension risk score, and the dimension scores sum to a system risk score of 350, as discussed in the text below.]

The suggested weights of set 1 in Figure D.1 were used, except for Technology Integration, which was given a higher weight of 15 because, in the organization, almost all new systems have failed whenever any new technology is introduced. The five dimensions were then given risk level values based on audit knowledge and surveys. A total score of 350 was then calculated for ranking purposes.

D.2 A Detailed Scoring Approach

D.2.1 Risk Scoring a Dimension

Although the "strawman" paper describes five approaches toanalyzing risk (See Appendix B) , a method of ranking and ratingis suggested here as an approach commensurate with the softnessof the data available. Each dimension of the scheme is rated andranked separately, with scores then combined. Since Criticality/Mission Impact is the Level I dimension of the proposed scheme,one would analyze this dimension first. The procedure is as

First, the n characteristics with in, fl dimension are rankedaccording to their respective importance to that dimension. Theimportance rank number of characteristic i is I ( i ) and rangesfrom 1 to n with n correlated with the most important character-istic. For operational systems one can use discriminant analysisapplied to equal sets of known system failures and successes toobtain this ranking. For developmental systems a consensus viewof audit management can be used, ideally obtaining sponsor/user

Second, the importance ranking number, I(i), is converted toan importance weighting factor, W(i), that is normalized to 20.(The reason for selecting 20 will be explained in Section D.2. 3.)This means that the sum of the weighting factors for the charac-teristics within a dimension is set to 20 (or normalized to 20)

.

Since each of the five dimensions has a different number ofcharacteristics and we wish to treat the dimensions as equals,normalization will guarantee that the risk score range for eachdimension will be the same.

The normalization factor, F, is the number which convertsthe importance ranking number I ( i

)

to the importance weightingfactor W(i)

.

The relationships are:

follows

:

input

(1) w (i) = F x I (i)

t W (i)

i=l to nF x I (i) =20

i=l to n

Solving equation (2) for F, we find

D - 4

(3) F za

Hi (i)

i=l to n

and substituting for F in equation (1) yields the importanceweighting factor W(i) for characteristic i, i.e.,

(4) W (i) = 20 x I HI

i=l to n

Third, each characteristic is rated with respect to the risk of occurrence. One of the following risk ratings, R(i), is assigned to characteristic i:

     R(i) = 3  (for High Risk)
     R(i) = 2  (for Medium Risk)
     R(i) = 1  (for Low Risk)

These ratings can be assigned by the auditor, again with user assistance if appropriate.

Finally, a Risk Score for that dimension is obtained by multiplying the importance weighting by the risk rating of each characteristic and summing over the characteristics for that dimension. The equation for this is the following:

     DRS(j) = SUM [W(i) x R(i)]
              i=1 to n

     where  i      = characteristics 1 to n
            W(i)   = importance weighting for characteristic i
            R(i)   = risk rating for characteristic i
            DRS(j) = dimension j's risk score, j = 1 to 5

The Risk Score for each of the five dimensions will range from 20 to 60 using these importance weighting and risk rating number assignments.


D.2.2 Level I System Risk Score

After completing a Level I review for an organization's universe of AISs, using the analysis scheme in Section D.2.1, one can use the Criticality/Mission Impact dimension risk score as a first order approximation to a system risk score. Since these risk scores have all been normalized to the same number (20), it is possible to compare these risk scores across AISs and eliminate from further consideration AISs having a low risk with respect to Criticality/Mission Impact.

D.2.3 Level II Review Considerations

If it is decided that the more detailed Level II review is appropriate and/or affordable, one must decide upon a sequence for reviewing the remaining dimensions of the high risk critical AISs. While there is no "correct" way to do this, it might be appropriate to consider the following.

Since the Environment/Stability dimension includes the organization's general controls, including the strength and involvement of quality assurance, project management, and security functions throughout the SDLC (of both systems and major enhancements to existing systems), it may be most useful to review this dimension first in a Level II review. These general controls would heavily impact the need for audit coverage as well as the scope and expertise necessary in that coverage. The EDP auditors could confidently reduce their scope and related testing of applications if they could rely on the organization's general controls and the safeguards these various review functions provide in the SDLC process. Any ranking or prioritizing of the elements in the work priority scheme, beyond the overriding factors described above (i.e., external influence and mission criticality), could not be reasonably accomplished without a survey of the organization's general and applications controls and/or without an institutional knowledge of the organization, its SDLC process, and any facts and circumstances affecting system development activities. The characteristics in all four Level II dimensions should be weighted and rated in the light of such background information, and the dimension risk score, DRS, obtained for each of the four Level II dimensions.

D.2.4 Level II System Risk Score

As a second order approximation one can treat the dimensions as equal contributors to the risk score for the AIS as a whole. Under this assumption the system risk score, SRS, is then a simple sum of the five dimension risk scores, DRS:

     (5)  SRS = SUM DRS(j)
                j=1 to 5

     where  SRS    = system risk score
            j      = dimensions 1 to 5
            DRS(j) = dimension j's risk score

Since DRS(j) can range from 20 to 60, SRS will range from 100 to 300. The choice of 20 for the sum of the weights of the characteristics within a dimension is arbitrary and was made in order to place SRS in a reasonable range for comparing one system's risk score to another's.

D.2.5 An Example

It may be a useful exercise to go through an example of the arithmetic involved. Assume we wish to calculate dimension risk scores and system risk scores for two AISs. To simplify matters we shall assume small numbers of characteristics for each dimension. Dimension 1 has four characteristics, dimension 2 has three characteristics, dimension 3 has five characteristics, dimension 4 has three characteristics, and dimension 5 has two characteristics. The importance rankings I(i) and the risk ratings R(i) are obtained from audit management and the auditor respectively. The rest of the numbers in Tables D.2 and D.3 are calculated using equations (1) - (5). (A practice template of the table has been included in Figure D.2 to assist the reader in learning the methodology.)

Using dimension 1 as a first order system risk score, we find AIS 1, with DRS = 42, is more at risk than AIS 2, with DRS = 38. We obtain the second order risk score by adding the five dimension risk scores for each AIS. Using these numbers, AIS 1, with SRS = 191.4, is again more at risk than AIS 2, with its SRS = 180.0. Only experience with the method will enable the reviewer to obtain more refined interpretations of the calculations.
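For readers who want to verify the arithmetic, the short sketch below implements equations (1) through (5) using only the importance rankings I(i) and risk ratings R(i) shown in Tables D.2 and D.3. (Table D.2 shows 191.4 because it sums the dimension risk scores after rounding each to one decimal place; the unrounded total is 191.3.)

    # Sketch of the detailed scoring method, equations (1)-(5).  The I(i) and R(i)
    # values are those shown in Tables D.2 and D.3 for AIS 1 and AIS 2.

    def dimension_risk_score(importance, ratings):
        # Equation (3): F = 20 / sum of I(i); equations (1)/(4): W(i) = F x I(i).
        f = 20.0 / sum(importance)
        weights = [f * i for i in importance]
        # DRS(j) = sum of W(i) x R(i) over the dimension's characteristics.
        return sum(w * r for w, r in zip(weights, ratings))

    def system_risk_score(dimensions):
        # Equation (5): SRS = sum of the five dimension risk scores.
        return sum(dimension_risk_score(i, r) for i, r in dimensions)

    # (importance rankings, risk ratings) for each of the five dimensions.
    ais_1 = [([2, 1, 4, 3],    [1, 2, 2, 3]),
             ([3, 2, 1],       [1, 2, 3]),
             ([4, 2, 5, 1, 3], [3, 2, 1, 2, 3]),
             ([1, 3, 2],       [3, 3, 1]),
             ([1, 2],          [2, 1])]

    ais_2 = [([4, 2, 1, 3],    [3, 1, 2, 1]),
             ([2, 1, 3],       [3, 1, 2]),
             ([5, 3, 1, 2, 4], [3, 1, 2, 1, 3]),
             ([2, 3, 1],       [2, 1, 3]),
             ([2, 1],          [1, 1])]

    print(round(system_risk_score(ais_1), 1))   # 191.3 (shown as 191.4 in Table D.2)
    print(round(system_risk_score(ais_2), 1))   # 180.0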


Figure D.2  PRACTICE TEMPLATE FOR RISK SCORING OF AN AIS

AIS ________

 DIMENSION | I(i) |  F   | W(i) | R(i) | W x R | DRS(j)
 ----------+------+------+------+------+-------+-------
 DIM 1     |      |      |      |      |       |
   C(1)    |      |      |      |      |       |
   C(2)    |      |      |      |      |       |
   C(3)    |      |      |      |      |       |
   C(4)    |      |      |      |      |       |
 DIM 2     |      |      |      |      |       |
   C(1)    |      |      |      |      |       |
   C(2)    |      |      |      |      |       |
   C(3)    |      |      |      |      |       |
 DIM 3     |      |      |      |      |       |
   C(1)    |      |      |      |      |       |
   C(2)    |      |      |      |      |       |
   C(3)    |      |      |      |      |       |
   C(4)    |      |      |      |      |       |
   C(5)    |      |      |      |      |       |
 DIM 4     |      |      |      |      |       |
   C(1)    |      |      |      |      |       |
   C(2)    |      |      |      |      |       |
   C(3)    |      |      |      |      |       |
 DIM 5     |      |      |      |      |       |
   C(1)    |      |      |      |      |       |
   C(2)    |      |      |      |      |       |
 ----------+------+------+------+------+-------+-------
                                           SRS |

Table D.2  DIMENSION RISK SCORES AND SYSTEM RISK SCORES FOR AIS 1

AIS 1

 DIMENSION | I(i) |  F   | W(i) | R(i) | W x R | DRS(j)
 ----------+------+------+------+------+-------+-------
 DIM 1     |      |      |      |      |       |
   C(1)    |  2   |  2   |  4   |  1   |   4   |
   C(2)    |  1   |  2   |  2   |  2   |   4   |
   C(3)    |  4   |  2   |  8   |  2   |  16   |
   C(4)    |  3   |  2   |  6   |  3   |  18   |
   (sum)   | 10   |      | 20   |      |  42   |  42.0
 DIM 2     |      |      |      |      |       |
   C(1)    |  3   | 10/3 | 10   |  1   |  10   |
   C(2)    |  2   | 10/3 | 20/3 |  2   | 40/3  |
   C(3)    |  1   | 10/3 | 10/3 |  3   |  10   |
   (sum)   |  6   |      | 20   |      | 33.3  |  33.3
 DIM 3     |      |      |      |      |       |
   C(1)    |  4   |  4/3 | 16/3 |  3   |  16   |
   C(2)    |  2   |  4/3 |  8/3 |  2   | 16/3  |
   C(3)    |  5   |  4/3 | 20/3 |  1   | 20/3  |
   C(4)    |  1   |  4/3 |  4/3 |  2   |  8/3  |
   C(5)    |  3   |  4/3 |  4   |  3   |  12   |
   (sum)   | 15   |      | 20   |      | 42.7  |  42.7
 DIM 4     |      |      |      |      |       |
   C(1)    |  1   | 10/3 | 10/3 |  3   |  10   |
   C(2)    |  3   | 10/3 | 10   |  3   |  30   |
   C(3)    |  2   | 10/3 | 20/3 |  1   | 20/3  |
   (sum)   |  6   |      | 20   |      | 46.7  |  46.7
 DIM 5     |      |      |      |      |       |
   C(1)    |  1   | 20/3 | 20/3 |  2   | 40/3  |
   C(2)    |  2   | 20/3 | 40/3 |  1   | 40/3  |
   (sum)   |  3   |      | 20   |      | 26.7  |  26.7
 ----------+------+------+------+------+-------+-------
                                           SRS |  191.4

1st Order SRS (Range = 20 to 60)   = DRS(1) = 42.0
2nd Order SRS (Range = 100 to 300) = SRS    = 191.4

Table D.3  DIMENSION RISK SCORES AND SYSTEM RISK SCORES FOR AIS 2

AIS 2

 DIMENSION | I(i) |  F   | W(i) | R(i) | W x R | DRS(j)
 ----------+------+------+------+------+-------+-------
 DIM 1     |      |      |      |      |       |
   C(1)    |  4   |  2   |  8   |  3   |  24   |
   C(2)    |  2   |  2   |  4   |  1   |   4   |
   C(3)    |  1   |  2   |  2   |  2   |   4   |
   C(4)    |  3   |  2   |  6   |  1   |   6   |
   (sum)   | 10   |      | 20   |      |  38   |  38.0
 DIM 2     |      |      |      |      |       |
   C(1)    |  2   | 10/3 | 20/3 |  3   |  20   |
   C(2)    |  1   | 10/3 | 10/3 |  1   | 10/3  |
   C(3)    |  3   | 10/3 | 10   |  2   |  20   |
   (sum)   |  6   |      | 20   |      | 43.3  |  43.3
 DIM 3     |      |      |      |      |       |
   C(1)    |  5   |  4/3 | 20/3 |  3   |  20   |
   C(2)    |  3   |  4/3 |  4   |  1   |   4   |
   C(3)    |  1   |  4/3 |  4/3 |  2   |  8/3  |
   C(4)    |  2   |  4/3 |  8/3 |  1   |  8/3  |
   C(5)    |  4   |  4/3 | 16/3 |  3   |  16   |
   (sum)   | 15   |      | 20   |      | 45.4  |  45.4
 DIM 4     |      |      |      |      |       |
   C(1)    |  2   | 10/3 | 20/3 |  2   | 40/3  |
   C(2)    |  3   | 10/3 | 10   |  1   |  10   |
   C(3)    |  1   | 10/3 | 10/3 |  3   |  10   |
   (sum)   |  6   |      | 20   |      | 33.3  |  33.3
 DIM 5     |      |      |      |      |       |
   C(1)    |  2   | 20/3 | 40/3 |  1   | 40/3  |
   C(2)    |  1   | 20/3 | 20/3 |  1   | 20/3  |
   (sum)   |  3   |      | 20   |      | 20.0  |  20.0
 ----------+------+------+------+------+-------+-------
                                           SRS |  180.0

1st Order SRS (Range = 20 to 60)   = DRS(1) = 38.0
2nd Order SRS (Range = 100 to 300) = SRS    = 180.0

NBS-114A (REV. 2-80)

U.S. DEPT. OF COMM.  BIBLIOGRAPHIC DATA SHEET (See instructions)

1.  PUBLICATION OR REPORT NO.:  NBSIR 86-3386

2.  Performing Organ. Report No.:

3.  Publication Date:  July 1986

4.  TITLE AND SUBTITLE:  Work Priority Scheme for EDP Audit and Computer Security Review

5.  AUTHOR(S):  Zella G. Ruthberg and Bonnie T. Fisher

6.  PERFORMING ORGANIZATION:  National Bureau of Standards, Department of Commerce, Gaithersburg, MD 20899

7.  Contract/Grant No.:

8.  Type of Report & Period Covered:

9.  SPONSORING ORGANIZATION NAME AND COMPLETE ADDRESS:  National Bureau of Standards, Department of Commerce, Gaithersburg, MD 20899, and Department of Health & Human Services, Office of Inspector General, 330 Independence Avenue, SW, Washington, DC 20201

10. SUPPLEMENTARY NOTES:  This report documents a synthesis of ideas developed at an invitational workshop sponsored by the National Bureau of Standards and the Department of Health & Human Services.

11. ABSTRACT:  This report describes a high level risk analysis for Automated Information Systems (AISs) that can be used by computer security reviewers and EDP auditors to prioritize their non-discretionary and discretionary review activities for these AISs. It divides the risk analysis problem into five areas of risk concern (called dimensions) with each area defined by a set of characteristics. The five dimensions are: Criticality/Mission Impact, Size/Scale/Complexity, Environment/Stability, Reliability/Integrity, and Technology Integration. The report presents a possible two-level scoring scheme which calculates the level of risk for each dimension, uses the Criticality score as a first order system risk score, and then combines all five dimension risk scores for a second order system risk score. An approach for deriving an EDP audit or computer security review plan using these scores is outlined.

12. KEY WORDS:  audit/review plan; automated information system risk analysis; computer security review; criticality/mission impact; discretionary audit/review; EDP audit; environment/stability; non-discretionary audit/review; reliability/integrity; risk score; size/scale/complexity; technology/integration

13. AVAILABILITY:  [X] Unlimited.  [X] Order From National Technical Information Service (NTIS), Springfield, VA 22161

14. NO. OF PRINTED PAGES:  60

15. Price:  $11.95
