
Qualitative & Quantitative Methods

Combining Qualitative and Quantitative Methods in Information Systems Research: A Case Study¹

By: Bonnie Kaplan
Department of Quantitative Analysis and Information Systems
University of Cincinnati
Cincinnati, OH 45221-0130

Dennis Duchon
Division of Management and Marketing
College of Business
University of Texas at San Antonio
San Antonio, TX 78285-0634

Abstract

This article reports how quantitative and qualitative methods were combined in a longitudinal multidisciplinary study of interrelationships between perceptions of work and a computer information system. The article describes the problems and contributions stemming from different research perspectives and methodological approaches. It illustrates four methodological points: (1) the value of combining qualitative and quantitative methods; (2) the need for context-specific measures of job characteristics rather than exclusive reliance on standard context-independent instruments; (3) the importance of process measures when evaluating information systems; and (4) the need to explore the necessary relationships between a computer system and the perceptions of its users, rather than unidirectional assessment of computer system impacts on users or of user characteristics on computer system implementation.

Despite the normative nature of these points, the most important conclusion is the desirability of a variety of approaches to studying information systems. No one approach to information systems research can provide the richness that information systems, as a discipline, needs for further advancement.

Keywords: Methodology, research methods, research perspectives, qualitative methods, interpretivist perspective, computer system impacts, computer system evaluation, organizational impacts, work, medical and health care applications

ACM Categories: K.4.3, H.0, J.3

¹ This paper was presented at the Ninth Annual International Conference on Information Systems, Minneapolis, MN, November 30-December 3, 1988.

Introduction

Information systems had its origins in a variety of reference disciplines with distinct theoretical research perspectives on the important issues to study and the methods to study them (Bariff and Ginzberg, 1982; Dickson, et al., 1982; Mendelson, et al., 1987). This article describes a study that combined some of these distinct perspectives and methods. The article discusses how limitations of one research perspective can be addressed by also using an alternative. This emphasis reflects the lesser familiarity information systems researchers might have with the perspective receiving more in-depth discussion and promotion. A key point of this article is the importance that both perspectives had for this study; no implication of preference in perspectives is intended.

The discussion of perspectives provides background for understanding the process and methods of this research. Following the discussion, the article describes how this study evolved. The emphasis is on methods rather than on research findings.

MIS Quarterly/December 1988 571

The positivist perspective and quantitative methods

Despite the differences in reference disciplines and the debate over a paradigm for information systems, American information systems research generally is characterized by a methodology of formulating hypotheses that are tested through controlled experiment or statistical analysis. The assumption underlying this methodological approach is that research designs should be based on the positivist model of controlling (or at least measuring) variables and testing pre-specified hypotheses (Kauber, 1986), although alternative methods might be acceptable until research has reached this more advanced and "scientific" stage. Despite some recent recognition given different research perspectives and methods (e.g., Ives and Olson, 1984; Klein, 1986; Kling, 1980; Lyytinen, 1987; Markus and Robey, 1988; Weick, 1984), even those who argue for introducing doctoral students to such alternative approaches as field study and simulation methods nevertheless advocate research based primarily on the positivist tradition (e.g., Bariff and Ginzberg, 1982; Dickson, et al., 1982).

Exclusive reliance on statistical or experimental testing of hypotheses has been soundly criticized in the social sciences, where some of its major proponents have called its effects "disastrous" (Cook and Campbell, 1979, p. 92). There are two grounds on which the approach can be faulted. First, the assumption that only through statistical or experimental hypothesis testing will science progress has come under attack, particularly by psychologists, who perhaps have the dubious distinction of having practiced it the longest. Meehl (1978), for example, argues that science does not, and cannot, proceed by incremental gains achieved through statistical significance testing of hypotheses. Sociologists, too, have contributed to this debate, notably with Glaser and Strauss' (1967) influential argument for theory building through inductive qualitative research rather than through continual hypothesis testing.

A second fault with the approach is the reliance on experimental or statistical control as the defining feature of scientific research. This reliance stems from the admirable goal of controlling experimenter bias by striving for objective measures of phenomena. Achieving this goal has been assumed to require the use of quantifiable data and statistical analysis (Downey and Ireland, 1983; Kauber, 1986) and also removing the effects of context in order to produce generalizable, reproducible results.

However, because the study of social systems involves so many uncontrolled — and unidentified — variables, methods for studying closed systems do not apply as well in natural settings as in controlled ones (Cook and Campbell, 1979; Manicas and Secord, 1983; Maxwell, et al., 1986). Moreover, the simplification and abstraction needed for good experimental design can remove enough features from the subject of study that only obvious results are possible. As illustrated in the next section, the stripping of context buys "objectivity" and testability at the cost of a deeper understanding of what actually is occurring.²

The interpretive perspective and qualitative methods

The need for context-dependent research has been remarked upon by researchers in a variety of disciplines that, like information systems, necessarily incorporate field research methods. Even such strong advocates of quantitative and experimental approaches in behavioral research as Cook and Campbell (1979) state, "Field experimentation should always include qualitative research to describe and illuminate the context and conditions under which research is conducted" (p. 93).

Immersion in context is a hallmark of qualitative research methods and the interpretive perspective on the conduct of research. Interpretive researchers attempt to understand the way others construe, conceptualize, and understand events, concepts, and categories, in part because these are assumed to influence individuals' behavior. The researchers examine the social reality and intersubjective meanings held by subjects (Bredo and Feinberg, 1982a) by eliciting and observing what is significant and important to the subjects in situations where the behavior occurs ordinarily. Consequently, qualitative methods are characterized by (1) the detailed observation of, and involvement of the researcher in, the natural setting in which the study occurs, and (2) the attempt to avoid prior commitment to theoretical constructs or to hypotheses formulated before gathering any data (Yin, 1984).

² We would like to credit an anonymous reviewer with this point.

Qualitative strategies emphasize an interpretive approach that uses data both to pose and to resolve research questions. Researchers develop categories and meanings from the data through an iterative process that starts by developing an initial understanding of the perspectives of those being studied. That understanding is then tested and modified through cycles of additional data collection and analysis until a coherent interpretation is reached (Bredo and Feinberg, 1982a; Van Maanen, 1983b). Thus, although qualitative methods provide less explanation of variance in statistical terms than quantitative methods, they can yield data from which process theories and richer explanations of how and why processes and outcomes occur can be developed (Markus and Robey, 1988).

Research traditions in information systems

The growing recognition of the value of qualitative methods in social, behavioral, organizational, and evaluation research is manifest in studies and research methodology texts (Argyris, 1985; Bredo and Feinberg, 1982b; Lincoln and Guba, 1985; Miles and Huberman, 1984; Mintzberg, 1973; Patton, 1978; Van Maanen, 1983c). Van Maanen (1983a), for example, has long advanced and practiced these approaches in organizational research. However, despite the strong ties of information systems with organizational and behavioral research, the use of qualitative research, though practiced and advocated in information systems, has not been as visible in this field as in others. Instead, recently there has been greater reliance on laboratory studies and surveys (Goldstein, et al., 1986).

The dominant approach to information technology studies has been based on a positivistic experimental ideal of research. Using this approach, researchers examine the effects of one or more variables on another. These analyses tend to treat the research objects in one of two ways. Either they portray information technology as the determining factor and users as passive, or they view users or organizations as acting in rational consort to achieve particular outcomes through the use of information technology. In either case, the nature of the information technology and users is considered static in that they are assumed to have an essential character that is treated as unchanging over the course of the study (Bakos, 1987; Lyytinen, 1987).

Markus and Robey (1988) characterize such approaches as "variance theory formulations of logical structure and an imperative conception of causal agency." In variance theories, some elements are identified as antecedents, and these are conceived as necessary and sufficient conditions for the elements identified as outcomes to occur. According to Markus and Robey, much of the thinking about the consequences of information technology in organizations assumes that either technology ("the technological imperative") or human beings ("the organizational imperative") are the antecedents, or agents, of change, rather than that change emerges from complex, indeterminate interactions between them ("the emergent perspective").

Most studies of computer systems are based on methods that measure quantitative outcomes. These outcomes can be grouped into technical, economic, and effectiveness and performance measures. Such studies treat organizational features, user features, technological features, and information needs as static, independent, and objective rather than as dynamic, interacting constructs, i.e., as concepts with attributes and meanings that may change over time and that may be defined differently according to how individual participants view and experience the relationships between them.

Because such studies are restricted to readily measured static constructs, they neglect aspects of cultural environment and social interaction and negotiation that could affect not only the outcomes (Lyytinen, 1987), but also the constructs under study. Indeed, most evaluations of computer information systems exhibit these characteristics. Published accounts of computing traditionally focus on selected technical and economic characteristics of the computer system that are assessed according to what Kling and Scacchi (1982) call the "discrete-entity model." Economic, physical, or information processing features are explicitly chosen for study under the assumption that the computer system can be broken down into relatively independent elements of equipment, people, organizational processes, and the like. These elements can then be evaluated independently and additively. Social or political issues often are ignored.



That these assumptions underlie much of the research on information technology is evident in the implementation and impacts literature. The effects of some intervention are studied with respect to implementation success or impacts on, for example, organizational structure, user attitudes, or job satisfaction (Bakos, 1987; Danziger, 1985; Ives and Olson, 1984; Kling, 1980; Markus and Robey, 1988). In these studies, information systems or computers are treated as having "impacts" (Danziger, 1985) rather than as socially constructed concepts with meanings that are affected by the "impacts" and that change from person to person or over time. Few such impact studies are longitudinal — a design promoted in information systems research to track changes over time by collecting data as events occur rather than retrospectively (Franz and Robey, 1984; Vitalari, 1985).

Often such studies do not proceed from an interactionist framework — one that focuses on the interaction between characteristics related to people or subunits affected by the computer system and characteristics related to the system itself (Markus and Robey, 1988). For example, they tend not to explore interrelationships between job-related issues and effectiveness of information technology. Jobs are often considered fixed, even though there are empirical and theoretical reasons to expect that computers change the amount and nature of work performed by system users (Brooks, et al., 1977; Fok, et al., 1987; Kemp and Clegg, 1987; Kling, 1980; Millman and Hartwick, 1987; Zuboff, 1982; 1988). Moreover, job-related issues are interpreted differently by different individuals within an organization, and these differences can affect what constitutes a technology's "effectiveness" (Kling, 1980; Lyytinen, 1987). Goodhue (1986), for example, advises assessing the "fit" of an information system with a task. However, one person performing the task may have a rather different view of it than another person performing ostensibly the same task, and thus the "fit" will differ for different users. Consequently, different individuals may have different responses to the same system. Interactionist theories would account for this response. Theories that assume that the individual, the job or task, the organization, or the technology is fixed and that one of these determines outcomes would not (Markus and Robey, 1988).

Some argue that each type of research method has its appropriate uses (Markus and Robey, 1988; Rockart, 1984; Weick, 1984); different research perspectives focus on different research questions and analytical assumptions (Kling, 1980; Kling and Scacchi, 1982; Lyytinen, 1987; Markus, 1983; Markus and Robey, 1988). However, there is disagreement concerning the value and use of suggested alternative theoretical perspectives and practical approaches, such as critical social theory (Klein, 1986), structuration theory (Barley, 1986), case study (Benbasat, et al., 1987), and socio-technical design (Fok, et al., 1987; Mumford and Henshall, 1979). On the one hand, Mumford (1985) advocates a qualitative approach. She calls for studies of a "total" situation through action-centered, interdisciplinary, participatory research in which research questions and hypotheses evolve as new developments are introduced. Lyytinen (1987) makes a similar appeal for case studies and action research on the grounds that "this research strategy seems to be the only means of obtaining sufficiently rich data" and because the validity of such methods "is better than that of empirical studies." On the other hand, even some proponents of case study base their position on a research design that fits the quantitative or quasi-experimental approach rather than the qualitative one (Benbasat, et al., 1987; Campbell, 1984; Yin, 1984). Moreover, there has been strong sentiment that information systems researchers need to move beyond case study to more experimental laboratory or field tests (Benbasat, 1984).

Combining Methods

Although not the dominant paradigm, qualitative methods and interpretive perspectives have been used in a variety of ways in information systems research (Barley, 1986; Fok, et al., 1987; Franz and Robey, 1984; Goldstein, et al., 1986; Hirschheim, et al., 1987; Markus, 1983; Mumford and Henshall, 1979; Mumford, et al., 1985). Interpreting information technology in terms of social action and meanings is becoming more popular as evidence grows that information systems development and use is a social as well as technical process that includes problems related to social, organizational, and conceptual aspects of the system (Bland, 1978; Hirschheim, et al., 1987; Kling and Scacchi, 1982; Lyytinen, 1987; Markus, 1983). However, many information systems researchers who recognize the value of qualitative methods often portray these methods either as stand-alone or as a means of exploratory research preliminary to the "real" research of generating hypotheses to be tested using experimental or statistical techniques (Benbasat, 1984). Even papers in which qualitative and quantitative methods are combined rarely report the study's methodological rationale or details (Benbasat, et al., 1987). One result is the failure to discuss how qualitative methods can be combined productively with quantitative ones.

There has been a move in other fields toward combining qualitative and quantitative methods to provide a richer, contextual basis for interpreting and validating results (Cook and Reichardt, 1979; Light and Pillemer, 1982; Maxwell, 1986; Meyers, 1981; Van Maanen, et al., 1982; 1983a). These methods need not be viewed as polar opposites (Van Maanen, 1983b). It is possible to integrate quantitative and qualitative methods (Maxwell, et al., 1986). Combining these methods introduces both testability and context into the research. Collecting different kinds of data by different methods from different sources provides a wider range of coverage that may result in a fuller picture of the unit under study than would have been achieved otherwise (Bonoma, 1985). Moreover, using multiple methods increases the robustness of results because findings can be strengthened through triangulation — the cross-validation achieved when different kinds and sources of data converge and are found congruent (Benbasat, et al., 1987; Bonoma, 1985; Jick, 1983; Yin, 1984), or when an explanation is developed to account for all the data when they diverge (Trend, 1979).
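The triangulation logic described above can be expressed as a small sketch. Everything in this example is invented for illustration — the unit names, scores, codings, and the 4.0 cutoff are hypothetical, and real triangulation is an interpretive judgment, not a mechanical comparison. The sketch simply shows the structure of the idea: for each unit of analysis, a quantitative result (e.g., a mean survey score) is set beside a qualitatively coded rating from field notes, and the two are flagged as convergent or divergent.

```python
# Hypothetical sketch of triangulation: cross-checking quantitative
# survey results against qualitatively coded field-note ratings.
# All data, names, and the cutoff are invented for illustration.

# Mean survey score per laboratory (e.g., on a 1-7 scale).
survey_scores = {"hematology": 5.8, "chemistry": 3.1, "microbiology": 5.5}

# Overall rating coded from interview and field-note themes
# for the same units.
fieldnote_ratings = {"hematology": "positive", "chemistry": "positive",
                     "microbiology": "positive"}

def triangulate(scores, ratings, cutoff=4.0):
    """Flag each unit as convergent (sources agree) or divergent."""
    results = {}
    for unit, score in scores.items():
        quant = "positive" if score >= cutoff else "negative"
        results[unit] = "convergent" if quant == ratings[unit] else "divergent"
    return results

print(triangulate(survey_scores, fieldnote_ratings))
# → {'hematology': 'convergent', 'chemistry': 'divergent',
#    'microbiology': 'convergent'}
```

A divergent case (here, "chemistry") is not discarded; in the spirit of Trend (1979), it prompts the development of an explanation that accounts for both kinds of data.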

This article describes how qualitative and quantitative methods were combined during the first phase of an ongoing multi-method longitudinal study. Detailed research results are reported elsewhere (Kaplan, 1986; 1987; Kaplan and Duchon, 1987a; 1987b; 1987c) and summarized here. This article has a methodological focus. It describes the development and process of the research and omits all but sketches of the data and analysis necessary to understand how the research evolved. Particular attention is given to how both quantitative and qualitative methods were used productively in a collaboration among investigators from different research perspectives.

Research Setting

Organizational and information context

Computers have been used more in clinical laboratories than in many other areas of medical practice (Paplanus, 1985). The clinical laboratory represents a microcosm of automation and information needs from throughout a medical center (Lincoln and Korpman, 1980). The laboratory is responsible for performing tests ordered by physicians to diagnose illness or track the course of therapy and disease. Laboratories meet this responsibility by receiving and processing physicians' test orders, i.e., collecting specimens (such as blood) from patients, performing the designated tests (such as blood sugar measurements or assessments of bacterial sensitivity to antibiotics), and reporting the test results for use by the physician and for inclusion in the patient's medical record. Laboratory technologists, who are specially trained, perform the laboratory tests. They may also report the results and discuss them with those treating the patient.

The principal function of computers in clinical laboratories involves data management. Computers can relieve the clerical burden of data acquisition and transcription while adding new data entry and computer-related tasks. In addition, they improve legibility, organization, and accuracy of laboratory results reports; increase productivity and efficiency; reduce transcription error; and change laboratory organization and turnaround time for test results (Brooks, et al., 1977; Flagle, 1974; Lewis, 1979; Nicol and Smith, 1986). Thus, such computer information systems affect both the process of work and the service product of a clinical laboratory.

The research site

Research was conducted within the Department of Pathology and Laboratory Medicine at a 650-bed midwestern urban metropolitan university medical center. A widely used and well-respected commercial laboratory computer information system was installed in April 1985 for use by all nine laboratories within the Division of Laboratory Medicine. These nine laboratories were responsible for laboratory work to support the care of patients admitted to the hospital and those treated in clinics and in the emergency unit. The laboratories also performed specialty tests for other institutions.

This research site was selected because a new computer information system was replacing a manual operation in a context representative of the entire institution's information systems needs. Another advantage was that multiple organizational units would use the same computer system. Consequently, a comparison between units would be possible under conditions where system variables would be reasonably constant. Thus, the study could include both macro and micro units of analysis: individuals, workers and managers, individual laboratories as a whole, the department in which the laboratories were situated, and the organization as a whole. Mixing levels of analysis made it possible to explore the interplay among units at each level and across levels (Markus and Robey, 1988).

The site became available because of the first author's prior contact with the director of laboratory medicine. The director arranged for entry into the laboratories and meetings and gave his support throughout the study's duration. He was available to the research team as needed.

Research team

The project was undertaken by four faculty members in the College of Business Administration at the University of Cincinnati. The researcher from the information systems (IS) area, Bonnie Kaplan, conceived the project, conducted the fieldwork, and provided knowledge of the research setting. She envisioned the purpose of the study as researching what happens when a computer information system is installed into a new setting.

The other three researchers were from the organizational behavior area. Two of them left the study at an early stage; Dennis Duchon remained. Their primary interest was in testing pre-existing theory in a new setting using questionnaires as the means of gathering quantitative data for statistical analysis. They viewed qualitative methods as a means for deriving quantitative measures, rather than as rich sources of research data useful for grounding theory and interpretation. Consequently, they approached the study differently from Kaplan in three ways: (1) they decided to research the impact of the computer information system on work in the laboratories; (2) they began the study intending to test theory through statistical analysis of quantitative survey data; and (3) they did not consider interviewing and observation as a means of data collection.

Methods

Research question

Each member of the original research team formulated different research questions. The three organizational behavior members viewed the new computer system as an opportunity to test existing theory concerning job characteristics and job satisfaction in a new setting. Although each of their research questions differed, these three wished to investigate how job characteristics varied in the clinical laboratories. Consequently, the study focused on laboratory technologists. The research questions for these three researchers were to investigate (1) the effects of the computer information system on job characteristics, (2) how departmental technology affected computer acceptance, (3) how leader-subordinate relationships affected computer acceptance, and (4) how job characteristics varied among laboratories and changed over time.

Working in the interpretive tradition, Kaplan expected to shuttle among questions, hypotheses, and data throughout the study. Because of the other team members' primary interest in the laboratory work, she also adopted this focus. Her research question was to identify and account for both similarities and differences among laboratory technologists and among laboratories in their responses to the computer information system.

Research design

Each research question reflected differences between a quantitative hypothesis-testing approach (where the effects of an intervention on dependent variables are statistically assessed) and a qualitative approach (where categories and theories are developed inductively from the data, generalizations are built from the ground up, and various interpretive schemes are tried and hypotheses are created and reformulated during the course of the study) (Glaser and Strauss, 1967; Van Maanen, 1983b). These differences resulted in a longitudinal multi-method case study that incorporated each team member's interests and skills. Because this initial case design included both qualitative and quantitative methods, both positivist and interpretive perspectives were incorporated in order to best link method to research question.³

³ Case study, an investigation using multiple sources of evidence to study a contemporary phenomenon within its real-life context (Bonoma, 1985; Yin, 1984), has been advanced for information systems research in order to understand the nature and complexity of the processes taking place (Benbasat, et al., 1987). Although they are often distinguished from quantitative or quasi-experimental research (Bonoma, 1985; George and McKeown, 1985), case studies may or may not exhibit the defining conditions of either quantitative or qualitative research (Campbell, 1984; Yin, 1984). There are a variety of recognized approaches to case study (Benbasat, et al., 1987; Bonoma, 1985).

The initial design was developed after considerable discussion and, as is common in qualitative research, was left open for modification and extension as necessary during the course of the study (Glaser and Strauss, 1967). Because research access to the site was not secured until shortly before the new system was installed, no pre-installation survey measures were taken. Consequently, as is often true in case studies, there could be no comparison of measures before and after installation (Cook and Campbell, 1979, p. 96).

The first step in the design was to interview laboratory directors and selected hospital administrators prior to system installation. The second step was to observe in each laboratory after installation. The remaining steps were to administer questionnaires at several periods after the computer system was installed.

The first wave of questionnaire data gathering occurred when a new routine had been established after the computer system was installed. The second wave was planned for approximately one year later, when the initial changes caused by the computer system became part of normal procedure. Future waves would be at intervals depending on initial results. Initially, regular participant observation at laboratory management meetings and staff meetings was not included in the design. This component was added when the meetings were instituted.

Data collection

Qualitative methods included open-ended interviewing, observation, participant observation, and analysis of responses to open-ended items on a survey questionnaire. Quantitative methods were employed to collect and analyze data from survey questionnaires. All participants were assured of confidentiality.

Although both qualitative and quantitative approaches were used, it quickly became apparent that they were viewed differently by research team members. Each team member conducted interviews and observations, but only the qualitatively-oriented member kept systematic field notes to be used for data analysis. Other team members viewed interviews and observations as providing "background" rather than "data." Consequently, Kaplan's field notes from each of these activities were used for analysis.

Interviews and Observations

The director of the laboratories, the chairman of the department, and the administrator of the hospital were interviewed early in the study. Teams of two or three researchers also interviewed the individual laboratory directors and some chief supervisory personnel during the week prior to computer system installation. Kaplan was present at all but two of these interviews. The purpose of the interviews was threefold: (1) to determine what interviewees expected the potential effects of the computer system to be on patient care, laboratory operations, and hospital operations; (2) to inquire about possible measures and focus of the study; and (3) to generate questionnaire items for a survey of laboratory technologists.

During the month prior to administering the survey, individual researchers were present in the laboratories to observe and talk with laboratory staff while they worked. These observations were intended to influence questionnaire item development.

Starting three months after the computer information system was installed, Kaplan was urged by one of the laboratory directors to attend weekly meetings where directors and head supervisors discussed laboratory management problems. These meetings were instituted as a result of system installation and they became a regular feature of laboratory management even after system problems ceased to be discussed. Kaplan attended these meetings regularly throughout the study as an observer and occasional participant, and was a participant observer at other departmental meetings.

Survey Questionnaire

A survey instrument was developed for laboratory technologists, the primary direct users of the computer information system. It was composed of three parts. The first part consisted of measures adapted from standard instruments that addressed job characteristics (Hackman and Oldham, 1976), role conflict and ambiguity (House and Rizzo, 1972), departmental technology (Withey, et al., 1983), and leader-member relationships (Dansereau, et al., 1975).

The second part of the questionnaire used Likert-scale measures of expectations, concerns, and perceived changes that may be related to the use of the computer system. These measures were developed by analyzing the interviews and observations to derive categories for questions that focused on the primary expectations expressed by interviewees, attendees at meetings, and the laboratory technologists who were observed. Additional questions concerning expectations were adapted from Kjerulff, et al. (1982).

The survey instrument concluded with four open-ended questions that assessed changes caused by the computer system and elicited suggestions for improved system use. These questions were also derived from the observations and interviews. They were intended to serve two purposes. The first was to ensure that important issues were addressed even if they had not been included in scaled-response questions. The second was to elicit information about impacts for which measures were difficult to develop.

All questionnaire items were pretested on a sample of laboratory personnel selected by head supervisors. After revision, the questionnaire was administered to all 248 members of the laboratory staff seven months after computer system installation. The staff knew about the study because of the prior laboratory observations and announcements at meetings. Each survey was accompanied by an explanatory letter. A research team member also explained the study during weekly staff meetings when the questionnaires were distributed. In some laboratories, this meeting was devoted to answering the questionnaire. In others, staff members were allowed to complete the survey during work hours provided that it not interfere with their job functions.

A modified version of the questionnaire was distributed to all laboratory technologists for the second wave of data collection beginning nearly a year after the first. No further surveys were conducted.

Sample

Just before the system was installed, 11 interviews were conducted with 20 interviewees representing all the laboratories. These interviewees included laboratory directors at all levels, head supervisors, and an administrator of one unit of the hospital.

All 248 members of the laboratory staff were surveyed starting seven months after system implementation. Data from all 119 completed (48%) questionnaires were analyzed. Only seven of the respondents had been interviewed. Most respondents were technologists who had college degrees, worked first shift, and had not worked previously in a laboratory with a computer information system. As is typical of laboratory technologists, almost all were women.

Analysis and Results

Data are presented from interviews immediately prior to installation and from the first wave of survey questionnaires seven months later. The quantitative data were analyzed using a standard statistical software package. Interview notes and responses to open-ended questions on the questionnaire were analyzed by the constant comparative method (Glaser and Strauss, 1967). Using this method, categories reflecting computer system issues important to laboratory directors and technologists were derived systematically.

Open-ended questions

Kaplan first analyzed open-ended questions on the questionnaire. Three themes predominated in the answers: (1) changes in technologists' work load, (2) improvements in results reporting, and (3) the need for physicians and nurses to use computer system terminals rather than telephones for results inquiry. Technologists expressed a general sense that their clerical duties and paperwork had increased and productivity had suffered. However, they credited the computer system with making laboratory test results available more quickly. They said that results reports were also more complete, more accurate, easier to read, and provided a picture of "the whole patient." Even though phone calls interrupted laboratory work, they felt that doctors and nurses expected to get test results by calling the laboratories rather than by using the computer system. In addition, respondents sensed they were being blamed by others in the medical center for problems caused by the computer system.

When responses were grouped by laboratory, marked differences were evident among some of the laboratories. Laboratories differed in their assessment of changes in their workload, number of telephone calls, improvement in reporting, and attitudes expressed toward the computer system.

Scaled-response questions

Responses to the questionnaire items pertaining to job characteristics and to the scaled-response items assessing computer expectations and concerns were analyzed next by two researchers who intentionally remained unaware of the findings on open-ended questions. Having already analyzed the responses to open-ended questions, Kaplan assisted them in an initial Q-sort of computer system questionnaire items and later interpretation of a factor analysis of these items. A fourth researcher had left the study team. Another of the original team members left the study during data analysis, leaving Duchon and Kaplan to interpret the results of Duchon's statistical analysis.

A factor analysis of job items provided evidence of construct validity for the measures. Four factors were extracted: skill variety, task identity, autonomy, and feedback. These factors comprise a four-factor model of the core job dimensions (Birnbaum, et al., 1986; Ferris and Gilmore, 1985), and are also used to assess job complexity (Stone, 1974). Overall, no job characteristic differences due to environmental or individual factors were found. Respondents reported the same levels of job characteristics regardless of age, gender, job experience, etc.

Five computer system variables were extracted: external communications, service outcomes, personal intentions, personal hassles, and increased blame. Reliability for these five factors ranged between .53 and .87. Data on these variables indicated that respondents generally were positive about the computer system. They reported that the system had improved relations and aided communication between staff in the laboratories and the rest of the medical center (external communications), and that results reporting and service was better (service outcomes).
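Scale reliabilities of this kind are conventionally reported as Cronbach's alpha; the article does not say which coefficient was used, and the study's raw responses are not reproduced here, so the following is only an illustrative sketch of the computation using invented Likert-scale data for a hypothetical three-item scale.

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a multi-item scale.

    `items` is a list of columns, one per questionnaire item; each
    column holds one score per respondent (sample variances are used).
    """
    k = len(items)
    item_vars = sum(variance(col) for col in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent scale totals
    total_var = variance(totals)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Invented 5-point Likert responses, seven respondents per item.
item_scores = [
    [4, 5, 3, 4, 2, 5, 4],  # item 1
    [4, 4, 3, 5, 2, 5, 3],  # item 2
    [5, 5, 2, 4, 1, 5, 4],  # item 3
]
alpha = cronbach_alpha(item_scores)
print(round(alpha, 2))
```

A scale whose alpha fell near the low end of the reported .53 to .87 range would be interpreted more cautiously than one near the high end.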

Respondents were very positive about their own continued use of the computer system (personal intentions). They did not think that irritants of their jobs, such as the number of phone calls and the amount of work (personal hassles), had increased, or that laboratory staff was blamed more and cooperated with less by physicians and nurses (increased blame). There were no statistically significant correlations between job characteristic measures and computer system measures.

At this point, nothing reportable had been found in the quantitative data. This was surprising because computer system variables were similar to themes identified in responses to the open-ended questions, where some laboratories differed markedly in their responses. Consequently, Kaplan continued to seek explanations for the differences among laboratories that were evident in the qualitative data.

Interviews

Next, Kaplan again analyzed the interviews with laboratory directors, this time to determine expectations of the directors prior to system implementation. She thought that perhaps different expectations among directors could have contributed to different responses within the laboratories. Different expectations were not found. The analysis indicated that prior to implementation, directors generally agreed that there would be more work for laboratory technologists, but that technologists' jobs would not be changed by the computer system.

This assessment was made with full knowledge that technologists would have to enter test results and some test orders, and that their paperwork and billing duties were expected to decrease. Interviewees also expected that test results would be available more quickly. They thought that reports would have fewer transcription errors, would be more legible, and would provide useful cumulative information and patient profiles. They expected these improvements to reduce the number of unnecessary laboratory tests and stat (i.e., emergency) and duplicate test orders, and anticipated that the number of telephone inquiries concerning results would decrease.

Developing an Interpretative Theoretical Model

The five computer system variables that were developed from the questionnaire items reflected themes very similar to those found in the qualitative data from interviews and responses to open-ended questions: changes in workload and changes in results reporting. This congruence of themes from independent sources of data strengthened confidence in their validity.

The analysis of qualitative data from the questionnaires suggested important differences among laboratories, even in the absence of statistical verification, because the differences were so striking. Knowledge obtained from observations in the laboratories and at meetings, as well as from comments made to her by individual laboratory technologists and directors, strengthened Kaplan's conviction that these differences were important. It remained to determine the nature of the differences and to explain the lack of reportable statistical results.

An impression gained from the qualitative data suggested an interpretation. Kaplan had noticed that laboratory technologists seemed to emphasize either the increased work or the improved results reporting in their answers to the open-ended questions. The repetition of these themes in all the data sources reinforced their importance. This repetition suggested that there were two groups of respondents corresponding to these two themes. The finding that directors did not expect technologists' jobs to change raised the question of what constituted a technologist's job. It seemed that directors and technologists might have different conceptions of the technologist's job and that these differences were reflected in their assessment of the computer system. Individual laboratory technologists might also differ in their views of their jobs, just as they differed in what they emphasized about the computer system in their comments on the questionnaires.

These insights led to a model depicting two groups of laboratory technologists. According to this model, one group saw their jobs in terms of producing results reports, the other in terms of the laboratory bench work necessary to produce those results reports. As shown in Figure 1, the group who saw its jobs in terms of bench work was oriented toward the process of producing laboratory results, whereas the group who viewed its work in terms of reporting results was oriented toward the outcomes of laboratory work; the members of this group saw themselves as providing a service.

The ways in which the computer affected the production work in the laboratories were assessed in those questionnaire items comprising personal hassles and increased blame, e.g., effects on paper work and telephone calls. Thus, the group who saw its jobs in terms of the bench work (i.e., the process of producing laboratory test results) would have responded to the computer system according to how it had affected what was assessed in these items. The product-oriented group of respondents who saw its jobs in terms of the service provided (i.e., the product, rather than the process, of laboratory work) would have assessed the computer system in terms reflected in its responses to external communications and service outcomes items, e.g., improved results reporting.

Orientation:                   Process                Product

View of Job                    Bench Work             Results, Service
Responses to Computer System   Increased Work Load    Improved Results Reporting
Computer System Variables      Hassles, Blame         Communications, Service

Orientation = (Communications + Service) - (Hassles + Blame)

Figure 1. A Model Depicting Different Orientations to Work and the Computer in Clinical Laboratories

This interpretation indicated a reason why job characteristics measures did not depict differences among laboratories and why there was no correlation between job characteristics and computer system variables. The kinds of differences in job orientation depicted in the model would not have been measured by job characteristic measures.

After this model was proposed, two new variables were created to measure whether technologists' responses differed according to the computer system's impact on process versus product aspects of their jobs. As shown in Figure 1, one variable combined scores on external communication and service outcomes, and the other combined scores on personal hassles and increased blame. The variable personal intentions was omitted because there was no theoretical basis for including it; personal intentions did not assess the interaction between specific aspects of the computer system and the job. Moreover, this variable was not a good discriminator between laboratories or individuals because individual respondents' scores were all high.

There was a significant negative correlation between these two new variables, thus indicating that respondents tended to have high scores on one variable and low scores on the other, i.e., they were either product- or process-oriented. An orientation score for each respondent was then computed by subtracting the sum of that person's scores on personal hassles and increased blame from his or her scores on external communication and service outcomes.

When the orientation score was regressed on laboratories, statistically significant differences in orientation were found across laboratories. Thus, some laboratories, like some technologists, were process-oriented while others were product-oriented. Moreover, respondents from the laboratories rating the strongest process orientation expressed the most hostility on the open-ended questions from the survey, whereas respondents from laboratories with the strongest product orientation expressed strong satisfaction with the computer system.
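Regressing a continuous score on dummy-coded group membership is equivalent to a one-way analysis of variance. As an illustration only, with invented laboratory names and orientation scores (not the study's data), a sketch of the F statistic such a test produces:

```python
from statistics import mean

# Invented laboratories and orientation scores (illustrative only).
labs = {
    "lab_a": [8.0, 7.0, 9.0, 6.5],      # strongly product-oriented
    "lab_b": [-5.0, -6.5, -4.0, -7.0],  # strongly process-oriented
    "lab_c": [1.0, -1.0, 0.5, 2.0],     # mixed
}

def one_way_f(groups):
    """F statistic for a one-way ANOVA across groups, equivalent to
    regressing the score on dummy-coded group membership."""
    all_scores = [x for g in groups for x in g]
    grand = mean(all_scores)
    k, n = len(groups), len(all_scores)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

f_stat = one_way_f(list(labs.values()))
print(round(f_stat, 1))  # a large F indicates the laboratory means differ
```

A large F relative to the F distribution with (k - 1, n - k) degrees of freedom corresponds to the statistically significant between-laboratory differences reported above.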

Thus, our results suggested that the interpretation was correct. Discussion with the laboratory director about the intermediate and final results and about the research papers produced from the study supported the interpretation.

Discussion

This study illustrates the difficulties as well as the strengths that occurred when research perspectives from different disciplines were combined. Although both authors agree that this collaboration enriched the study as well as their own understanding of research, it was rife with frustrations. Neither author initially realized the extent to which differing values, assumptions, and vocabularies would interfere with the project. Continued interaction was necessary to recognize that there were differences even in understanding the same words. Persistent effort was needed to identify and resolve these differences. A key incident in the study stemmed from these differences. Because the incident is illustrative of the difficulties as well as the enrichment caused by the different perspectives of research team members, we recount it here.

After the initial statistical analysis of data from the scaled-response questions, Duchon thought that there were no statistical results worth reporting. Kaplan thought he meant that there were no statistically significant differences among laboratories in their reaction to the computer system. However, she did not agree with these results because they did not fit with her observations and analysis. Duchon remained convinced that there were no results worth pursuing. Consequently, Kaplan began to analyze the remaining data from the interviews. This analysis strengthened her convictions that the qualitative data did indicate patterns worth investigating and that Duchon had to be convinced of this. That determination led to the development of the interpretive theoretical model.

The turning point in the study, and in the authors' collaboration, happened when Duchon re-analyzed the quantitative data as suggested by Kaplan. This new analysis supported her interpretation. When the initial analysis was repeated, there were statistically significant differences among laboratories for the computer system variables. We were not able to determine the reason for the apparent lack of reportable differences during the first statistical analysis. Because of continued difficulties in communication, misunderstandings, in retrospect, were not surprising.

Some of the difficulties we experienced have been described so that others who participate on similar research teams may shorten the learning period and reduce the communication problems. It should be clear from this account that, as is common in research that includes qualitative approaches, the process is messy. Impressions, interpretations, propositions, and hypotheses were developed over the course of the study, a process that hardly fits the positivist ideal of objective collection of neutral or purely descriptive "facts" (Van Maanen, 1983b). This messiness should not dismay those who experience it.

Despite the methodological and communications problems, the collaboration was productive and friendly. Our tenacity in holding to our initial independent analyses of the different data, though a problem at the time, and the increased respect each of us developed for the other's approach, in fact, were positive aspects of our research. As has happened in other projects (Trend, 1979), a strong determination to accommodate each approach and to reconcile apparently conflicting data resulted in an interpretation that synthesized the evidence. We cannot state too strongly that the advantages of our collaboration outweighed the difficulties.

Conclusions

This article describes how qualitative and quantitative approaches were combined in a case study of a new information system. It illustrates four methodological points: (1) the value of combining qualitative and quantitative methods; (2) the need for context-specific measures of job characteristics; (3) the importance of process measures when evaluating information systems; and (4) the need to explore the necessary relationships between a computer system and the perceptions of its users.

Combining qualitative and quantitative methods proved especially valuable. First, the apparent inconsistency of results between the initial quantitative analysis and the qualitative data required exploration. The initial statistical analysis revealed only that certain well-recognized job characteristic measures were not capturing the differences in the sample and that the correlations between job characteristics and computer system variables were not significant. However, systematically different responses to the computer system were present in the qualitative data. Further analysis to resolve the discrepancy led to the use of new measures developed from the quantitative questionnaire data that captured job-related responses to the computer system and that supported conclusions drawn from the qualitative data.

Thus, triangulation of data from different sources can alert researchers to potential analytical errors and omissions. Mixing methods can also lead to new insights and modes of analysis that are unlikely to occur if one method is used alone. In the absence of qualitative data, the study would have concluded that there were no reportable statistically significant findings. However, on the strength of the qualitative data, a theoretical interpretative model was developed to drive further statistical analysis. The inclusion of qualitative data resulted in the generation of grounded theory.

The results also suggest the value of investigating specifically how a computer system affects users' jobs. In this study, as in others, no correlation was found between standard job characteristic measures and variables measuring responses to the computer system. Nevertheless, these responses were related to differences in job orientation. Studies that independently assess characteristics and attitudes ignore the specific impacts of a computer system on work. Standard job characteristic measures do not necessarily assess job-specific factors or capture the different job orientations held by workers with "the same job." Instead of sole reliance on these measures, instruments need to include context-specific measures of how a computer system supports work, as conceptualized by the worker. In this study, measures assessing these aspects were derived from knowledge of the setting gained by field experience, again suggesting the value of combining qualitative and quantitative methods.

The study further suggests the need to move beyond outcome evaluations to investigating how a computer system affects processes. By extending research to examine specific effects of individual computer information systems in particular contexts, our general understanding of what affects the acceptance and use of computer information systems can be improved.

Lastly, there is a growing amount of literature that assesses the impacts of computer information systems. This study was intended, by part of the research team, as an assessment of the impacts of a computer information system on work. However, impact studies consider only one side of the important interaction between computer information system users and the computer information system: the effects of the system on users, the organization, or society. Research designs could just as well consider the impacts of users, the organization, or society on the computer information system. Sorting out the direction of causality is a difficult, if not meaningless, task that is made all the more difficult by the multitudes of confounding factors common to any field setting. It also may be impossible to do because the design does not account for changes due to the interrelationships between the "dependent" and "independent" variables.

For example, in this study, the job orientation of the users could have mediated their response to the computer information system. Alternatively, the system itself could have caused the differences in job orientation. Rather than formulate the analysis in either of these ways, it seemed more fruitful to take an interactionist approach and consider the interrelationships between users' job orientations and their responses to the system. "Impacts" implies unidirectional causality, whereas, as has been found in other studies, investigations of interactions can be more productive and meaningful.

Despite the normative nature of the methodological points, the most important conclusion is the need for a variety of approaches to the study of information systems. For the same reasons that combining methods can be valuable in a particular study, a variety of approaches and perspectives can be valuable in the discipline as a whole. No one method can provide the richness that information systems, as a discipline, needs for further advance.

Acknowledgements

Joseph A. Maxwell of the Harvard Graduate School of Education provided invaluable theoretical assistance and editorial advice. We are also grateful to the anonymous reviewers whose comments resulted in further clarifying and strengthening the paper.

ReferencesArgyris, C. Action Science: Concepts, Methods,

and Skiiis for Research and Intervention,Jossey-Bass, San Francisco, CA, 1985,

Bakos, J,Y, "Dependent Variables for the Studyof Firm and Industry-Level impacts of Infor-

mation Technology," Proceedings of theEighth International Conference on Informa-tion Systems, Pittsburgh, PA, December 6-9,1987, pp, 10-23.

Bariff, M,L, and Ginzberg, M.J, "MIS and the Be-havioral Sciences: Research Patterns and Pre-scriptions," Dafa Base (14:1), Fall 1982, pp.19-26,

Barley, S,R. "Technology as an Occasion forStructuring: Evidence from Observations of CTScanners and the Social Order of RadiologyDepartments," Administrative Science Quar-teriy (31), March 1986, pp, 78-108,

Benbasat, i. "An Analysis of Research Method-ologies," in The Information Systems Re-search Challenge, F.W, McFarlan (ed,). Har-vard Business School Press, Boston, MA,1984, pp. 47-85,

Benbasat, I,, Goldstein, D,K, and i\/lead, M, "TheCase Research Strategy in Studies of infor-mation Systems," MIS Quarterly (11:3), Sep-tember 1987, pp, 369-386.

Birnbaum, P.H,, Farh, J.L, and Wong, G,Y,Y,"The Job Characteristics Model in HongKong," Journal of Appiied Psychoiogy (71:4),November 1986, pp, 598-605,

Boland, R.J, "The Process and Product ofSystem Design," Management Science(24:9), May 1978, pp, 887-898.

Bonoma, T.V, "Case Research in Marketing: Op-portunities, Problems, and a Process," Jour-nai of Marketing Research (22:2), May 1985,pp, 199-208,

Bredo E, and Feinberg W. "Part Two: The Inter-pretive Approach to Social and EducationalResearch," in Knowledge and Values inSocial and Educational Research, E. Bredoand W, Feinberg (eds.), temple UniversityPress, Philadelphia, PA, 1982a, pp, 115-128,

Bredo, E, and Feinberg, W, (eds), Knowiedgeand Vaiues in Social and Educationai Re-search, Temple University Press, Philadelphia,PA, 1982b.

Brooks, R.C,, Casey, I,J, and Blackmon, P,W,Jr, Evaiuation of the Air Force Clinical Labo-ratory Automation Systems (AFCLAS) atWright-Patterson USAF Medical Center, Vol,I: Summary, HDSN-77-4 (NTIS no, AD-A043664); Vol. Ii: Analysis, HDSN-77-5 (NTIS no.AD-A043 665), Analytic Services, Arlington,VA, 1977.

Campbell, D,T. "Foreword" in Case Study Re-search: Design and Methods, R.K. Yin (ed,).Sage Publications, Beverly Hills, CA, 1984, pp.7-9,

MiS Quarteriy/December 1988 583

Qualitative & Quantitative Methods

Cook, T,D. and Campbell, D.T, Quasi-Experimen-tation: Design and Analysis Issues for FieidSettings, Houghton Mifflin, Boston, MA, 1979,

Cook, T.D. and Reichardt, C,S. (eds,). Qualita-tive and Quantitative Methods in EvaluationResearch, Sage Publications, Beverly Hills,CA, 1979,

Dansereau, F, Jr,, Graen, G, and Haga, W,J,"A Vertical-Dyad Linkage Approach to Lead-ership within Formal Organizations: A Longi-tudinal investigation of the Role Making Proc-ess," Qrganizational Behavior and HumanPerformance (13:1), February 1975, pp, 46-78.

Danziger, J.N, "Social Science and the Socialimpacts of Computer Technology," Social Sci-ence Quarterly (66:1), March 1985, pp, 3-21.

Dickson, G.W., Benbasat, I. and King, W.R. "TheMIS Area: Problems, Challenges, and Oppor-tunities," Data Base (14:1), Fall 1982, pp. 7-12,

Downey, H,K, and Ireland, R,D, "Quantitativeversus Qualitative: The Case of EnvironmentalAssessment in Organizational Studies," inQualitative Methodology, J. Van Maanen(ed.). Sage Publications, Beverly Hills, CA,1983, pp. 179-190.

Ferris, G.R. and Gilmore, D.C, "AMethodological Note on Job Complexity In-dexes," Journal of Applied Psychoiogy (70:1),February 1985, pp. 225-227,

Flagle, CD, "Qperations Research with Hospi-tal Computer Systems," in Hospital ComputerSystems, M,F, Gollen (ed,), John Wiley andSons, New York, NY, 1974, pp. 418-430.

Fok, L.Y., Kumar, K. and, Wood-Harper, T. "Meth-odologies for Socio-Technical-Systems (STS)Development: A Comparison," Proceedings ofthe Eighth International Conference on Infor-mation Systems, Pittsburgh, PA, December 6-9, 1987, pp, 319-334.

Franz, C.R. and Robey, D. "An Investigation ofUser-Led System Design: Rational and Politi-cal Perspectives," Communications of theACM (27:12), December 1984, pp. 1202-1209.

George, A.L, and McKeown, T,J, "Case Studiesand Theories of Qrganizational DecisionMaking," in Advances in Information Process-ing in Qrganizations (2), JAI Press,Greenwich, CT, 1985, pp, 21-58,

Glaser, B,G. and Strauss, A.L, The Discoveryof Grounded Theory: Strategies for Quaiita-tive Research, Aldine, New York, NY, 1967.

Goldstein, D., Markus, M.L., Rosen, M. and Swan-son, E.B. "Use of Qualitative Methods in MIS

Research," Proceedings of the Seventh In-ternationai Conference on Information Sys-tems, San Diego, CA, December 15-17, 1986,pp. 338-339,

Goodhue, D, "IS Attitudes: Towards TheoreticalDefinition and Measurement Clarity," Proceed-ings of the Seventh International Conferenceon Information Systems, San Diego, CA, De-cember 15-17, 1986, pp, 181-194,

Hackman, J,R, and Qldham, G,R, "MotivationThrough the Design of Work: Test of aTheory," Qrganizationai Behavior and HumanResources (16:2), August 1976, pp, 250-279.

Hirschheim, R,, Klein, H, and Newman, M, "ASocial Action Perspective of Information Sys-tems Development," Proceedings of theEighth Internationai Conference on Informa-tion Systems, Pittsburgh, PA, December 6-9,1987, pp, 45-57.

House, R.J. and Rizzo, J.R. "Toward the Meas-urement of Qrganizational Practice: Scale De-velopment and Validation," Journal of AppliedPsychology (56:6), Qctober 1972, pp, 388-396.

Ives, B. and Qlson, M,H. "User Involvement andMIS Success: A Review of Research," Man-agement Science (30:5), May 1984, pp, 586-603,

Jarvenpaa, S,L,, Dickson, G,W. and DeSanc-tis, G. "Methodological Issues in Experimen-tal IS Research: Experiences and Recommen-dations," MIS Quarterly (9:2), June 1985, pp.141-156.

Jick, T.D. "Mixing Qualitative and QuantitativeMethods: Triangulation in Action," in Quaiita-tive Methodoiogy, J, Van Maanen (ed,). SagePublications, Beverly Hills, CA, 1983, pp. 135-148.

Kaplan, B. "Impact of a Clinical Laboratory Computer System: Users' Perceptions," in Medinfo 86: Fifth World Congress on Medical Informatics, R. Salamon, B. Blum and M.J. Jorgensen (eds.), North-Holland, Amsterdam, 1986, pp. 1057-1061.

Kaplan, B. "Initial Impact of a Clinical Laboratory Computer System: Themes Common to Expectations and Actualities," Journal of Medical Systems (11:2/3), June 1987, pp. 137-147.

Kaplan, B. and Duchon, D. "Job-Related Responses to a Clinical Laboratory Computer Information System Seven Months Post Implementation," in Social, Ergonomic and Stress Aspects of Work with Computers, G. Salvendy, S.L. Sauter and J.J. Hurrell, Jr. (eds.), Elsevier, Amsterdam, 1987a, pp. 17-24.

Kaplan, B. and Duchon, D. "Employee Acceptance of a Computer Information System: The Role of Work Orientation," Working Paper IS-1988-004A, Department of Quantitative Analysis and Information Systems, University of Cincinnati, Cincinnati, OH, April 1987b.

Kaplan, B. and Duchon, D. "A Qualitative and Quantitative Investigation of a Computer System's Impact on Work in Clinical Laboratories," Working Paper IS-1987-001, Department of Quantitative Analysis and Information Systems, University of Cincinnati, Cincinnati, OH, December 1987c.

Kauber, P. "What's Wrong With a Science of MIS?" Proceedings of the 1986 Decision Science Institute, Honolulu, HI, November 23-25, 1986, pp. 572-574.

Kemp, N.J. and Clegg, C.W. "Information Technology and Job Design: A Case Study on Computerized Numerically Controlled Machine Tool Workers," Behavior and Information Technology (6:2), 1987, pp. 109-124.

Kjerulff, K.H., Counte, M.A., Salloway, J.C. and Campbell, B.C. "Predicting Employee Adaptation to the Implementation of a Medical Information System," Proceedings of the Sixth Annual Symposium on Computer Applications in Medical Care, IEEE Computer Society Press, Silver Spring, MD, 1982, pp. 392-397.

Klein, H. "The Critical Social Theory Perspective on Information Systems Development," Proceedings of the 1986 Decision Science Institute, Honolulu, HI, November 23-25, 1986, pp. 575-577.

Kling, R. "Social Analyses of Computing: Theoretical Perspectives in Recent Empirical Research," Computing Surveys (12:1), March 1980, pp. 61-110.

Kling, R. and Scacchi, W. "The Web of Computing: Computer Technology as Social Organization," in Advances in Computers (21), M.C. Yovits (ed.), Academic Press, New York, NY, 1982, pp. 2-90.

Lewis, J.W. "Clinical Laboratory Information Systems," Proceedings of the IEEE (67:9), September 1979, pp. 1229-1300.

Light, R.J. and Pillemer, D.B. "Numbers and Narrative: Combining Their Strengths in Research Reviews," Harvard Educational Review (52:1), February 1982, pp. 1-26.

Lincoln, T.L. and Korpman, R.A. "Computers, Health Care, and Medical Information Science," Science (210:4467), October 17, 1980, pp. 257-263.

Lincoln, Y.S. and Guba, E.G. Naturalistic Inquiry, Sage Publications, Beverly Hills, CA, 1985.

Lyytinen, K. "Different Perspectives on Information Systems: Problems and Solutions," ACM Computing Surveys (19:1), March 1987, pp. 5-46.

Manicas, P.T. and Secord, P.F. "Implications for Psychology of the New Philosophy of Science," American Psychologist (38:4), April 1983, pp. 399-413.

Markus, M.L. "Power, Politics, and MIS Implementation," Communications of the ACM (26:6), June 1983, pp. 430-444.

Markus, M.L. and Robey, D. "Information Technology and Organizational Change: Causal Structure in Theory and Research," Management Science (34:5), May 1988, pp. 583-598.

Maxwell, J.A., Bashook, P.G. and Sandlow, L.J. "Combining Ethnographic and Experimental Methods in Educational Research: A Case Study," in Educational Evaluation: Ethnography in Theory, Practice, and Politics, D.M. Fetterman and M.A. Pitman (eds.), Sage Publications, Beverly Hills, CA, 1986, pp. 121-143.

Meehl, P.E. "Theoretical Risks and Tabular Asterisks: Sir Karl, Sir Ronald, and the Slow Progress of Soft Psychology," Journal of Consulting and Clinical Psychology (46:4), August 1978, pp. 806-834.

Mendelson, H., Ariav, G., Moore, J. and DeSanctis, G. "Competing Reference Disciplines for MIS Research," Proceedings of the Eighth International Conference on Information Systems, Pittsburgh, PA, December 6-9, 1987, pp. 455-458.

Meyers, W.R. The Evaluation Enterprise, Jossey-Bass, San Francisco, CA, 1981.

Miles, M.B. and Huberman, A.M. Qualitative Data Analysis: A Sourcebook of New Methods, Sage Publications, Beverly Hills, CA, 1984.

Millman, Z. and Hartwick, J. "The Impact of Automated Office Systems on Middle Managers and Their Work," MIS Quarterly (11:4), December 1987, pp. 479-490.

Mintzberg, H. The Nature of Managerial Work, Harper and Row, New York, NY, 1973.

Mumford, E. "From Bank Teller to Office Worker: The Pursuit of Systems Designed for People in Practice and Research," Proceedings of the Sixth International Conference on Information Systems, Indianapolis, IN, December 16-18, 1985, pp. 249-258.

Mumford, E. and Henshall, D. A Participative Approach to Computer Systems Design: A Case Study of the Introduction of a New Computer System, John Wiley and Sons, New York, NY, 1979.

Mumford, E., Hirschheim, R., Fitzgerald, G. and Wood-Harper, T. Research Methods in Information Systems, North-Holland, Amsterdam, 1985.

Nicol, R. and Smith, P. "A Survey of the State of the Art of Clinical Biochemistry Laboratory Computerization," International Journal of Bio-Medical Computing (18:2), March 1986, pp. 135-144.

Paplanus, S.H. "Clinical Pathology," in Computer Applications in Clinical Practice: An Overview, D. Levinson (ed.), Macmillan, New York, NY, 1985, pp. 118-122.

Patton, M.Q. Utilization-Focused Evaluation, Sage Publications, Beverly Hills, CA, 1978.

Rockart, J.F. "Conclusion to Part I," in The Information Systems Research Challenge, F.W. McFarlan (ed.), Harvard Business School Press, Boston, MA, 1984, pp. 97-104.

Stone, E.F. "The Moderating Effect of Work-Related Values on the Core Job-Scope Satisfaction Relationship," unpublished doctoral dissertation, University of California, Irvine, CA, 1974.

Trend, M.G. "On the Reconciliation of Qualitative and Quantitative Analyses: A Case Study," in Qualitative and Quantitative Methods in Evaluation Research, T.D. Cook and C.S. Reichardt (eds.), Sage Publications, Beverly Hills, CA, 1979, pp. 68-86.

Van Maanen, J. "Reclaiming Qualitative Methods for Organizational Research," in Qualitative Methodology, J. Van Maanen (ed.), Sage Publications, Beverly Hills, CA, 1983a, pp. 9-18.

Van Maanen, J. "Epilog: Qualitative Methods Reclaimed," in Qualitative Methodology, J. Van Maanen (ed.), Sage Publications, Beverly Hills, CA, 1983b, pp. 247-268.

Van Maanen, J. (ed.). Qualitative Methodology, Sage Publications, Beverly Hills, CA, 1983c.

Van Maanen, J., Dabbs, J.M., Jr. and Faulkner, R.R. Varieties of Qualitative Research, Sage Publications, Beverly Hills, CA, 1982.

Vitalari, N.P. "The Need for Longitudinal Designs in the Study of Computing Environments," in Research Methods in Information Systems, E. Mumford, R. Hirschheim, G. Fitzgerald and T. Wood-Harper (eds.), North-Holland, Amsterdam, 1985, pp. 243-265.

Weick, K.E. "Theoretical Assumptions and Research Methodology Selection," and ensuing discussion, in The Information Systems Research Challenge, F.W. McFarlan (ed.), Harvard Business School Press, Boston, MA, 1984, pp. 111-133.

Withey, M., Daft, R.L. and Cooper, W.H. "Measure of Perrow's Work Unit Technology: An Empirical Assessment and a New Scale," Academy of Management Journal (26:1), March 1983, pp. 45-63.

Yin, R.K. Case Study Research: Design and Methods, Sage Publications, Beverly Hills, CA, 1984.

Zuboff, S. "New Worlds of Computer-Mediated Work," Harvard Business Review (60:5), September-October 1982, pp. 142-152.

Zuboff, S. In the Age of the Smart Machine: The Future of Work and Power, Basic Books, New York, NY, 1988.

About the Authors

Bonnie Kaplan is an assistant professor of information systems at the University of Cincinnati's College of Business Administration. She is also an adjunct assistant professor of clinical pathology and laboratory medicine. Dr. Kaplan received her Ph.D. in history from the University of Chicago. She has had extensive practical experience in information systems development at major academic medical centers. Dr. Kaplan's research interests include behavioral and policy issues in information systems, implementation of technological innovations, information systems in medicine and health care, and the history and sociology of computing. Her publications have appeared in Journal of Medical Systems, International Journal of Technology Assessment in Health Care, Journal of Health and Human Resources Administration, Journal of Clinical Engineering, and volumes on human-computer interaction and on medical computing.

Dennis Duchon is an assistant professor of organizational behavior in the College of Business at the University of Texas at San Antonio. Before joining the faculty there, he was an assistant professor of organizational behavior at the University of Cincinnati. He received a Ph.D. in organizational behavior from the University of Houston. He has worked as a manager both in the United States and abroad. Dr. Duchon's research interests include behavioral decision making and the implementation of new technologies. He has published in the Journal of Applied Psychology, IEEE Transactions on Engineering Management, and Decision Sciences.