The Journal of Applied Instructional Design - EdTech Books


The Journal of Applied Instructional Design

July 2021

Version: 0.47

Built on: 12/17/2021 06:00pm

This book is provided freely to you by EdTech Books.

CC BY: This work is released under a CC BY license, which means that you are free to do with it as you please as long as you properly attribute it.

The Journal of Applied Instructional Design, 10(2) 3

Table of Contents

About the Journal — 4
A/B Testing on Open Textbooks — 7
Applying the Design of Narrative Distance in Instruction — 15
Enabling Interactivity through Design: Outcomes from a Gamified Health Insurance Onboarding Course — 23
Chefs in Training! Engaging Pharmacy Students through Course Gamification — 34
A Study of Instructional Design Master's Programs and Their Responsiveness to Evolving Professional Expectations and Employer Demand — 43
Ladders and Escalators: Examining Advancement Obstacles for Women in Instructional Design — 54
"I Can Do Things Because I Feel Valuable": Authentic Project Experiences and How They Matter to Instructional Design Students — 68
Exploring Faculty Perceptions of Professional Development Support for Transitioning to Emergency Remote Teaching — 82
Back Matter — 95
Citation Information — 96


About the Journal

During the past 50 years, journals in the field of instructional design have been responsive to the changing needs of both scholars and, to a lesser degree, the practitioner. We have seen an evolution of AVCR to ECTJ, the emergence of JID, and finally the merging of ECTJ and JID to form ETR&D. ETR&D is a widely recognized, scholarly journal in our field that maintains rigorous standards for publications.

During the past 50 years, we have also witnessed a change in the field due in part to the success of instructional design in business and other nonschool environments. The number of instructional designers working outside the university has dramatically increased. Of particular importance is the rise in the number of instructional designers with doctorates who consider themselves practitioners, but not necessarily scholars. This growing group of designers might be best described as reflective practitioners who can make a significant contribution to the knowledge of our field.

This growth and success in the application of instructional design has also changed the field. From the early days of the field until the mid-1980s, the theory and practice of instructional design was almost exclusively influenced by the academic community. With the growth of instructional designers, the theory and practice of the field is now defined by both academics and practitioners. There is a need for greater communication between the scholars and the practitioners in a scholarly journal that will support innovation and growth of our knowledge base.

ISSN: 2160-5289

Goals

The purpose of this journal is to bridge the gap between theory and practice by providing reflective practitioners a means for publishing articles related to the field. The journal establishes and maintains a scholarly standard with the appropriate rigor for articles based on design and development projects. Articles include evaluation reports (summative and formative), lessons learned, design and development approaches, as well as applied research. The articles are based on design and development projects as opposed to pure research projects and focus on lessons learned and how to improve the instructional design process. Rigor is established through articles grounded in research and theory.

A secondary goal of this journal is to encourage and nurture the development of the reflective practitioner in the field of instructional design. This journal encourages the practitioner as well as collaborations between academics and practitioners as a means of disseminating and developing new ideas in instructional design. The resulting articles inform both the study and practice of instructional design.

Philosophy

This journal will provide a peer-reviewed format for the publication of scholarly articles in the field of applied instructional design. The journal recognizes the role of the practitioner in the work environment and realizes that outside constraints may limit the data collection and analysis process in applied settings. Even with these real-world limitations, the practitioner's work can still provide valuable knowledge for the field.

Sponsoring Organization

JAID is a publication of the Association for Educational Communications and Technology (AECT).

JAID is an online open-access journal and is offered without cost to users.

Journal Staff

Role                  | Name                    | Affiliation
Co-Editor             | Jill E. Stefaniak       | University of Georgia
Co-Editor             | Julie A. Bridges        | Dupont Sustainable Solutions
Assistant Editor      | Max C. Anderson         | University of Illinois College of Medicine
Copy Editor           | Rebecca M. Reese        | Colorado School of Mines
Production Editor     | Royce Kimmons           | Brigham Young University
Copyediting Assistant | Joanna Kohlhepp         | Old Dominion University
Copyediting Assistant | Rebecca Clark-Stallkamp | Virginia Tech
Copyediting Assistant | Lauren Bagdy            | Florida State University


Editorial Board

Name                      | Affiliation
Andy Gibbons              | Brigham Young University
David Richard Moore       | Ohio University
Wilhelmina Savenye        | Arizona State University
James Ellsworth           | U.S. Naval War College
David Wiley               | Lumen Learning
Ellen Wagner              | Sage Road Solutions, LLC
Barbara Lockee            | Virginia Tech
Theodore J. Kopcha        | University of Georgia
Tutaleni Asino            | Oklahoma State University
Shahron Williams VanRooij | George Mason University
Beth Sockman              | East Stroudsburg University
M.J. Bishop               | University System of Maryland

About AECT

The Association for Educational Communications and Technology (AECT) is a professional association of instructional designers, educators, and professionals who provide leadership and advise policy makers in order to sustain a continuous effort to enrich teaching and learning. Seizing opportunities to raise awareness and leverage technology, our members may be found around the world in colleges and universities, in the Armed Forces and industry, in museums, libraries, and hospitals, and in the many places where educational change is underway. Our research and scholarly activity contribute to the knowledge base in the field of learning. We are on the cutting edge of new developments and innovations in research and application.

AECT is the premier organization for those actively involved in the design of instruction and a systematic approach to learning. We provide an international forum for the exchange and dissemination of ideas for our members and for target audiences. We are the national and international voice for improvement of instruction and the most recognized association of information concerning a wide range of instructional and educational technology. We have 24 state and six international affiliates, all passionate about finding better ways to help people learn.

Since 1923, AECT has been the professional home for this field of interest and has continuously maintained a central position in the field, promoting high standards in both scholarship and practice with nine Divisions and a Graduate Student Assembly that represent the breadth and depth of the field. Other journals sponsored by AECT include Educational Technology Research and Development and TechTrends.

The Journal of Applied Instructional Design (JAID) is a refereed online journal designed for the publication of scholarly articles in the field of applied instructional design. The purpose of JAID is to provide reflective ID scholar-practitioners and researchers a means for publishing articles on the nature and practice of ID that will support the innovation and growth of our knowledge base. The journal is for practitioners, instructors, students, and researchers of instructional design.

Call for Submissions

JAID is for reflective scholar-practitioners who, through documentation of their practice in ID, make significant contributions to the knowledge of our field. Authors are invited to submit articles documenting new or revised approaches to ID; the processes of ID, including in-depth documentation of analysis, design, development, implementation, and evaluation; design-based research; as well as applied research. Articles must be based on instructional design projects as opposed to pure research projects and focus on documented processes, lessons learned, and how to improve the overall process of ID. Articles must be grounded in research and theory connecting the intellectual foundations of the ID field and how these foundations shape its practice.

The journal will establish and maintain a scholarly standard with the appropriate rigor for articles based on design and development projects. A secondary goal of this journal is to encourage and nurture the development of the reflective practitioner in the field of ID. This journal encourages the practitioner as well as collaborations between academics and practitioners as a means of disseminating and developing new ideas in ID. The resulting articles should inform both the study and practice of ID.

Submit an Article

Article Types

JAID currently accepts submissions of three article types.

Instructional Design Practice

This is an applied journal serving a practicing community. Our focus is on what practitioners are doing in authentic contexts and their observed results. These articles cover topics of broad concern to instructional design


practitioners. The articles should represent issues of practical importance to working designers.

Research Studies on Applied Instructional Design

JAID is interested in publishing empirical studies exploring the application of instructional design principles in applied settings. Quantitative and qualitative studies are welcome.

Instructional Design/Performance Design Position Papers

JAID also accepts position papers that attempt to bridge theory and practice. Examples may include conceptual frameworks and new ideas facing the instructional design community. The paper must also provide enough information to allow the replication of the innovation or continuation of the research in other settings. Position papers must be based in the context of a theoretical framework. Efficacy data are strongly preferred, but not always required, contingent upon the potential generalizability or value of the innovation.

Submission Guidelines

The journal will focus on in-depth applications of the ID process and publish a variety of articles, including case studies of the ID process; application articles that go beyond a mere how-to approach to provide implementation insights, guidance, and evaluation of a process; evaluation articles that focus on the viability of a product or process; applied research resulting from evaluation of materials or studies of project implementation; articles on ways to improve the ID process from the perspective of the practitioner; short essays that provide a scholarly debate of relevant issues related to the application of ID; and relevant book reviews. When applicable, articles should include supplementary materials, including examples of ID products, evaluation instruments, media files, and design artifacts.

The articles in the journal will be from the perspective of the scholar-practitioner rather than from the researcher. However, the manuscripts must demonstrate scholarly rigor appropriate to applied manuscripts.

Articles, including tables or figures, must follow APA 7th edition formatting and be submitted in Word (doc or docx) format using at least 12-point Times New Roman font. Each article must have an abstract (75-100 words) and a list of keywords. While there is some flexibility in the length of an article, 4,000 to 5,000 words is a best-guess estimate. If in doubt, contact the editor prior to submitting the article. Identifying information, including contact information for the first author, must only be located on the cover page.

You may contact the editors via email if you have furtherquestions.

Contact the Editor


A/B Testing on Open Textbooks

A Feasibility Study for Continuously Improving Open Educational Resources

Royce Kimmons

This study examined the feasibility of employing A/B tests for continuous improvement by focusing on user perceptions of quality of six chapters of a popular open textbook over the course of a year. Results indicated non-significant differences in all cases but also suggest that future work in this area should (a) employ A/B testing at a broader, less-granular (e.g., platform-level) scale to increase sample sizes, (b) explore autonomous approaches to experimentation and improvement, such as bandit algorithms, and (c) rely upon more universally collected dependent variables to reduce sample size limitations emerging from self-reports.

Introduction

Open educational resources provide great promise to instructional designers as low-cost, high-impact educational materials that can be used, shared, remixed, and adapted with ease. Especially when viewed through the lens of the “5Rs” of openness (Wiley, n.d.)—Retain, Revise, Remix, Reuse, Redistribute—or the lens of “expansive openness” (Kimmons, 2016), such resources give instructional designers the ability to create and share learning materials at a massive scale, to adapt existing resources for better meeting the needs of target learners, and to remix resources from various authors into multi-faceted and rich learning experiences.

Because of the ubiquity of textbooks in higher education, the open textbook as a medium promises to be a valuable means for providing learning opportunities to many students while also driving down costs. Students at four-year universities in the U.S. currently spend an average of $1,240 on textbooks per year (College Board, 2019), and textbook cost hikes have far outpaced inflation, consumer costs, and recreational book costs, making higher education opportunities more cost-prohibitive and requiring students to skip meals, enroll in fewer courses, and work longer hours (Whitford, 2018). While open

textbooks provide an opportunity for universities to drive down student costs and to improve learning experiences, open textbooks are not widely used (Seaman & Seaman, 2018). This is presumably due to perceptions of time limitations emerging from tenure and promotion practices and perceptions that open textbooks are of relatively poor quality when compared to their copyright-restricted alternatives (Kimmons, 2015; Martin & Kimmons, 2020).

Though systemic challenges to open textbook adoption may be outside the realm of instructional designers to address, one clear way that we can make a difference is to help improve the quality of these resources. Some initial work has sought to establish quality metrics for open textbooks and other open resources (Bodily et al., 2017; Woodward et al., 2017), and Dinevski (2008) proposes that the quality control of these resources is relatively unique by placing accountability in the hands of learners, teachers, and local designers to address localized or demographic-specific needs, rather than upon market-driven publisher considerations. Furthermore, though traditionally published textbook editions are viewed as static entities that are either high- or low-quality, because of their live and open nature, open textbooks can also undergo continuous improvement efforts that iteratively improve their quality over time, correcting mistakes, refining formatting, and providing supplements as needed to improve learning (Wiley et al., 2021).

For these reasons, applying continuous improvement cycles to open educational resources is of increasing interest to designers, but we are only just beginning to figure out how to do this well, especially when large-scale data are involved and resources are being used by a wide array of learners. Borrowing from the software development field (the same field where the notion of openness came from to begin with; Kimmons, 2016; Open Source Initiative, n.d.; Stallman, 2013), it seems reasonable to consider how modern approaches to software improvement might apply to educational resources as well. As a promising example, A/B or split testing is an approach to software development that places at least two different versions of a product in front of random sets of actual users and analyzes their behaviors over time to determine which is superior (Kohavi & Longbotham, 2017).

When it comes to education, A/B testing has been proposed not only as a process for improving design but also as a process for choosing between competing pedagogical methods or other decisions of educational importance (UpGrade, n.d.). In the case of open textbooks, A/B testing would require having at least two versions of content that users interact with. The “A” version (otherwise called the original version or control)


represents the default version of the resource as originally created by the author, while the “B” version (otherwise called the experimental flight or fork) represents a variation of the resource that the researcher hypothesizes might yield differing behaviors or results. To make comparisons, audience size for each version may not need to be equal, and relative sampling for different versions may involve an assessment of the urgency and relative importance of experimental variations. As readers are assigned to the competing versions of the textbook, a variety of analytics could be collected to test which version is superior, and successive tests could theoretically be employed on the same resource to gradually improve it in many different ways.

Bringing these ideas together, this study explores the feasibility of using A/B testing to inform continuous improvement and increase the perceived quality of open textbooks. Relying upon data collection and analysis mechanisms of a popular open textbook for undergraduate and teacher education, the guiding research question of this study was “How feasible is it to conduct A/B testing on highly-used open textbook chapters for the purpose of improving perceptions of quality?”

Methods

To conduct this study, experimental flights were created within the EdTech Books system by copying six chapters as new flights (or “B” versions), adjusting their contents, and setting each chapter’s “Flight Mode” to “Automatic.” The automatic mode meant that whenever any reader navigated to the chapter, they were randomly assigned to either view the original or the experimental flight. This assignment was done without the reader’s awareness and ensured true randomization. Flight assignment was enabled for a period of 12 months (February 2020 to February 2021), and results were then analyzed to compare reader behaviors and perceptions for the time period. As a methodological note, though this timeframe coincided with the COVID-19 pandemic in many countries and resulting shifts to online and remote learning might have influenced overall usage of open resources, such a shift would not be expected to influence the types of user behaviors measured here between groups. For instance, though more people might have started reading the textbooks because of the pandemic, we would not expect this to influence the relationship between text size within the textbooks and reading behaviors. For this reason, we did not conclude that the targeted timeframe for the study should be considered as an additional variable or meaningful frame of analysis.
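The “Automatic” flight mode amounts to simple random assignment at the reader level, held constant for the duration of a visit. A minimal sketch of that per-session logic follows (in Python rather than the platform's actual PHP; the function and session names are hypothetical, not EdTech Books internals):

```python
import random

def assign_flight(flights, session):
    """Randomly assign a reader to one chapter version (e.g., the
    original or an experimental flight) and keep that assignment
    for the rest of the session. Hypothetical sketch only."""
    key = "flight_assignment"
    if key not in session:  # first page view: pick a version at random
        session[key] = random.choice(flights)
    return session[key]    # later views reuse the same version

# A reader's first view picks a version; subsequent views are sticky.
session = {}
first = assign_flight(["original", "experimental"], session)
second = assign_flight(["original", "experimental"], session)
```

Keeping the assignment in per-reader session state is what makes the comparison between-subjects: a given reader only ever sees one version of the chapter.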

Context

EdTech Books is a free online publishing platform for open textbooks. Built with PHP, MySQL, and JavaScript, the platform operates on four guiding values of freedom, accessibility, usability, and quality, providing authors with tools to easily create, remix, and share textbooks (Kimmons, n.d.). Currently, the platform provides content to roughly 50,000 unique readers per month, representing students, teachers, and the general public. Content is provided in simple HTML via web pages and also as PDFs for download, representing millions of page views over the course of its two-year lifespan.

Central to the mission and design of EdTech Books is the goal of supporting continuous improvement and improved perceptions of open textbook quality. Toward this end, the system provides A/B testing features, quality assurance mechanisms, advanced analytics, and various other tools to support ongoing analysis, adjustment, and improvement of materials. However, since the notion of continuous improvement is not commonly connected to the development of published materials, like textbooks, it is unclear how to do this well and how to develop systems that both empower and encourage authors to engage in this process.

For this study, I analyzed results from six experiments conducted within EdTech Books upon separate chapters of a popular open textbook: The K-12 Educational Technology Handbook by Ottenbreit-Leftwich and Kimmons (2020). This textbook has been accessed over 120,000 times in its short lifespan, is widely used for teacher education courses and professional development efforts, and is also commonly accessed from search engine results on topics related to technology’s role in education.

Participants

As readers accessed the textbook on the platform for the first time, they were notified that the system collects anonymous analytics related to their behaviors, and they were given the option to opt out of being tracked in this way. For this study, I focused on opted-in reader data associated with this single textbook.

As with other textbooks in the platform, readers of the textbook accessed chapters in many ways but generally fell into two categories: (a) formal learners who accessed chapters from links or LMS embeds associated with official university courses and (b) non-formal or informal learners who accessed chapters from organic search engine results (e.g., those searching Google for “tech integration”). Backlink analysis of the textbook revealed that it was heavily used by students at a number of universities, including Brigham Young University, Marist University, Oklahoma State University, State University of New York, Montana State University, Purdue University, and others. The breakdown of formal vs. non/informal learners, however, varied from chapter to chapter, with some chapters like “Technology Integration”


experiencing a relatively even split between the two and others exhibiting high skew in one direction or the other. Even within these categories, we would expect to find great variation in reader goals, purposes, and activities, as higher education institutions use these resources for diverse courses. For the purpose of this study, reader type was not considered in data analysis, and the flight assignment procedure did not take reader category into consideration for random assignment, meaning that the demographics of both the original and experimental versions of each chapter would be expected to exhibit similar distributions of reader types to the overall chapter. This was an intentional design decision but assumes that optimal design decisions for improving perceived quality would not vary by reader category.

Dependent Variable

Because perceptions of poor quality are a major barrier to open textbook adoption and diffusion (Kimmons, 2016; Martin & Kimmons, 2020) and the improvement of perceived quality is a major goal stated on the platform, we constructed experiments with the goal of improving reader perceptions of quality, as measured by a simple survey. This single-question survey was provided as an unobtrusive “End-of-Chapter Survey” at the bottom of each chapter that asked the following: “Overall Quality: How would you rate the overall quality of this chapter?” Possible responses were coded to an ordinal scale as follows: (1) “Very Low Quality,” (2) “Low Quality,” (3) “Moderate Quality,” (4) “High Quality,” and (5) “Very High Quality.” The form was then automatically submitted as readers navigated away from the chapter or closed their browser tab, resulting in an average quality rating of 4.1/5.0 for the targeted textbook chapters (n = 963 ratings, SD = .67). Results also exhibited a strong negative skew, with only 4 ratings (0.4%) falling below “Moderate Quality” (see Figure 1). These ratings represented results from 810 different users, with the average user leaving 1.19 ratings across chapters in the book (SD = .75, Max = 10).
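The ordinal coding and summary statistics described above are straightforward to reproduce. The sketch below shows one way to code responses and compute the mean and sample standard deviation (the ratings here are illustrative, not the study's actual data):

```python
import statistics

# Survey responses coded to the 1-5 ordinal scale described above.
SCALE = {
    "Very Low Quality": 1,
    "Low Quality": 2,
    "Moderate Quality": 3,
    "High Quality": 4,
    "Very High Quality": 5,
}

def summarize_ratings(responses):
    """Return the mean and sample standard deviation of coded ratings."""
    codes = [SCALE[r] for r in responses]
    return statistics.mean(codes), statistics.stdev(codes)

# Illustrative sample of four responses (not the study's data).
mean, sd = summarize_ratings(
    ["High Quality", "Very High Quality", "High Quality", "Moderate Quality"]
)
# mean == 4.0; sd is about 0.82
```

Treating the labels as equally spaced integers is a simplification common to Likert-style analyses; the Welch's t-tests reported later in the article rest on the same coding.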

Figure 1

Distribution of Textbook Ratings

[Chart showing the distribution of textbook ratings]

The unobtrusive and optional nature of this survey helped to avoid Hawthorne effects in results and provided similar benefits to those found in the analysis of public internet data sources (Kimmons & Veletsianos, 2018), even though some interpretive power was lost with limited contextual information about readers. This approach also posed minimal risk, effort, and discomfort to users and prevented analyses from being classified as human subjects research according to NIH definitions, because the process (a) did not collect information about individuals and (b) did not include identifiable data, such as demographics, names, user type information (e.g., student vs. faculty), or IP addresses. This means that the sample size for each experiment was limited to those who anonymously answered the quality assurance measure at the end of the chapter, which accounted for around 1% of readers for each chapter.

Though such a low response rate would be troubling in some research settings, the fact that readers were randomly assigned to the two groups helps to alleviate concerns of self-selection bias, and low rates of response will always be a necessity when using unobtrusive measures of relatively free-roaming user activities like these. This point is of special importance when studying open resources, because most of the traffic (or user behavior) associated with these resources constitutes lurking (Bozkurt et al., 2020) or those who may briefly open the chapter without any intent to actually read it. To illustrate, Google Analytics reported that the bounce rate for the book in this time period (or the number of users who navigated away after viewing only one page) was 71.85%, with the average user session lasting less than 3 minutes. This is why, for instance, MOOCs have such notoriously low completion rates (Gütl et al., 2014; Rivard, 2013) and why, when studying open environments and resources, it makes sense to limit analyses to users whose behaviors suggest an intent to participate in the behaviors we are measuring (e.g., Veletsianos et al., 2021). Judging by user scrolling behaviors, time on page, textual length, and chapter text complexity for the target textbook, it is estimated that only about 22.7% of page views actually constituted a “read” of the contents, and among those who read the contents, there was no incentive or prodding to complete the end-of-chapter survey. Yet, such data should nonetheless be valuable for understanding user perceptions of resources in the same way that user ratings are valuable on sites like Amazon or Yelp to determine the quality of products or services, even if the relative representation of ratings is very small in comparison to the total number of customers on those sites.
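The article does not specify how the 22.7% “read” estimate was derived, but a heuristic combining the signals it names (scroll depth, time on page, and text length) could be sketched as follows. This is my own hypothetical rule, not the platform's actual classifier, and the thresholds are assumptions:

```python
def looks_like_read(time_on_page_s, scroll_depth, word_count,
                    min_wpm=100, max_wpm=600, min_scroll=0.75):
    """Classify a page view as a 'read' (hypothetical heuristic):
    the reader scrolled through most of the chapter AND spent an
    amount of time plausible for its length (implied reading speed
    within a human range of words per minute)."""
    if scroll_depth < min_scroll:          # barely scrolled: not a read
        return False
    implied_wpm = word_count / (time_on_page_s / 60)
    return min_wpm <= implied_wpm <= max_wpm

# A 2,000-word chapter viewed for 8 minutes with a full scroll
# implies 250 wpm, a plausible read; a 20-second, barely-scrolled
# visit is counted as a bounce.
read = looks_like_read(480, 1.0, 2000)
bounce = looks_like_read(20, 0.1, 2000)
```

Filtering page views through a rule like this before computing response rates is one way to separate intentional readers from the lurking traffic described above.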

Embedded automatically by the platform at the end of every chapter, quality assurance surveys provided results to authors in an “Analytics” dashboard at the flight, chapter, and book levels (see Figures 2 and 3). In the


“Analytics” dashboard at the flight level, an additional table was also provided to authors that provided statistical comparisons between the original and the experimental flight (see Figure 4). These tables allowed authors to compare reader behaviors between the original and the experimental flight on the “Overall Quality” measure as well as embedded learning checks and surveys in the chapter. In the provided example, for instance, each row (except for the final “Overall Quality” row) represents a different learning check within the chapter, and the table reveals to the author whether the experimental flight influenced performance on the learning measure. Because these learning measures are chapter-dependent, they cannot be compared between chapters and will not be included in this study. However, common learning measures could be compared in future studies, as readers are more likely to complete these than quality assurance surveys, thereby providing more robust sample sizes at a faster rate.

Figure 2

Screenshot of the Analytics Overview for a Chapter on EdTech Books

[Chart showing the analytics categories to evaluate a chapter on EdTech Books]

Figure 3

Screenshot of a Chapter Quality Display for a Chapter

[Screenshot showing the chapter quality ratings]

Figure 4

Screenshot of a Flight Comparison Table

[Chart showing a flight comparison table]

Independent Variables

To improve perceived quality of the targeted chapters, format- and content-based experiments were created for six different chapters in the textbook, with each experimental flight representing a different variable to be tested. When creating learning content, design decisions are highly contextual. For instance, there is no consensus in the design research literature on whether video is useful for learners simply because the answer depends so much upon contextual factors—such as (a) the type of video, (b) the quality of video, (c) its relationship to the text, (d) the age and characteristics of the learner, etc.—and even proposing decontextualized design decisions that are intended to be universally applied (like “what are the effects of video on instruction?”) has come to be viewed as a misguided or altogether confounded research strategy (Honebein & Reigeluth, 2021). The alternative to this is to employ research efforts in iterative, continuous improvement where a variety of strategies might be tested in deeply contextualized ways to improve learning products, such as adding or removing a specific video to a live textbook chapter. Toward this end, this study focused on six chapters in a single textbook and experimentally tested a different design change for each chapter (representing two versions of each chapter) to determine the feasibility of testing and revising these kinds of design decisions on-the-fly with live products. For instance, in the “Technology Integration” chapter, the experimental flight removed stock photos to determine whether the mere presence of photos influenced perceptions of quality. Similarly, in the “Lifelong Learning” chapter, the experimental flight removed an introductory video for the same purpose. Other changes made to remaining chapters included (a)


adding extra images (for “Information Literacy”), (b) removing direct illustrative quotations (for “Online Professionalism”), (c) increasing the font size (for “Online Safety”), and (d) changing the sans-serif font style to a serif font (for “Universal Design for Learning”). In every case, chapters were set to “Automatic” flight assignment for a one-year period, and a series of Welch’s t-tests were conducted to determine whether the change influenced overall quality ratings for the chapter in the target time period.
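Welch's t-test is a sensible choice here because it compares group means without assuming equal variances or equal group sizes, which matches randomly assigned flight groups of unequal n. A self-contained sketch of the statistic and the Welch-Satterthwaite degrees of freedom follows (standard library only, with made-up ratings; a real analysis would use a statistics package to obtain p-values from t and df):

```python
import math
import statistics

def welch_t(a, b):
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom
    for two independent samples with possibly unequal variances."""
    na, nb = len(a), len(b)
    va, vb = statistics.variance(a), statistics.variance(b)  # sample variances
    se2 = va / na + vb / nb                                  # squared std. error
    t = (statistics.mean(a) - statistics.mean(b)) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Illustrative quality ratings for an original chapter (a) and
# its experimental flight (b); not the study's actual data.
a = [4, 5, 4, 4, 3, 5, 4, 4]
b = [4, 4, 3, 5, 4, 4, 3, 4]
t, df = welch_t(a, b)
```

Applying a t-test to ordinal rating data treats the 1-5 codes as interval-scaled, a common simplification; with the small per-chapter survey samples reported above, non-significant results like the study's are unsurprising.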

In constructing these experiments, we did not expect to see drastic differences in results, but we did anticipate that if we could identify small formatting or content changes that resulted in small quality differences, then as these changes were aggregated together and applied to the entire textbook, overall quality could be improved in meaningful ways. For instance, even if adjusting stock photos, fonts, or videos each produced less than a 10% change in perceived quality, by applying these results to all of the chapters we hoped to be able to improve chapters in ways that would show significant aggregate benefit. Additionally, because all of these experiments reflected relatively low-cost adjustments to resources that are used by a large number of people, even small improvements would be expected to have considerable relative advantage. For instance, if a small change can improve the readability of a textbook with a readership of 50,000 by only 1%, that small change could mean that 500 more people might actually benefit from the resource. Thus, though small improvements may historically be treated as insignificant in educational settings that are constantly seeking silver-bullet or 2-sigma solutions (e.g., Bloom, 1984), when we move into the realm of high-impact open resources that we can adjust at low cost, even tiny improvements can yield drastic results in learning for the broad population.
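The arithmetic behind this aggregation argument can be sketched as follows (the per-change gains below are hypothetical illustrations, not values from the study):

```python
# Illustrative arithmetic for aggregating small per-change gains
# across a widely used textbook (all gain values are invented).
readership = 50_000

# A 1% readability improvement reaches 500 additional readers.
extra_beneficiaries = int(readership * 0.01)

# If six independent tweaks each improved quality by 2%, their
# combined (multiplicative) improvement is roughly 12.6%.
combined = 1.0
for gain in [0.02] * 6:
    combined *= 1 + gain
combined_pct = (combined - 1) * 100

print(extra_beneficiaries)      # 500
print(round(combined_pct, 1))   # 12.6
```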

Results and Discussion

The simple result of this study is that after one year of constant data collection on a popular open textbook, all experiments came back as having statistically non-significant effects on perceived open textbook chapter quality. It is no secret that educational research exhibits a strong bias against reporting null-effect studies, which leads many researchers to not publish valuable work and contributes to "publication bias, a positively skewed research base, and policy and practices based on incomplete data" (Cook & Therrien, 2017, p. 149), but even though results for this study were non-significant, the results may nonetheless be valuable for informing ongoing research and practice with continuous improvement efforts and open educational resources.

Table 1 provides a summary of the results for all six experiments, and there are at least two items of interest from the results that seem noteworthy. First, though non-significant, the Cohen's d values for several of the experiments approach levels that suggest mild to moderate strength (e.g., d = .58 in the case of removing the introductory video for "Lifelong Learning," and d = .45 in the case of switching to a serif font for "Universal Design for Learning"). Though we cannot say for sure, these values suggest that with a larger sample size we might see effects that could mildly influence overall chapter quality perceptions, to say nothing of aggregate effects.
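Cohen's d can likewise be computed from the per-flight summary statistics. The article does not state which variant of the formula was used, so the pooled-SD version below is an assumption and may not reproduce the values in Table 1 exactly:

```python
# Sketch of a pooled-standard-deviation Cohen's d for two chapter
# versions. One common variant; the study's exact formula is unstated.
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference using the pooled sample SD."""
    pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled

# Sanity check: a one-SD mean difference with equal variances gives d = 1.0.
print(cohens_d(1.0, 1.0, 50, 0.0, 1.0, 50))  # 1.0
```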

Table 1

Results Summary of A/B Test Experiments for Specific Chapters

                      Original Version (A)     Experimental Flight (B)
Experiment            Mean    n     SD         Mean    n     SD         Change   Welch's t-Test   p-value   Cohen's d
Remove Stock Photos   4.09    256   0.70       4.19    195   0.63        0.11     1.66            NS        0.23
Remove Intro Video    4.19    70    0.66       3.95    44    0.60       -0.23    -1.92            NS        0.58
Add Extra Images      4.16    56    0.73       3.98    49    0.65       -0.18    -1.34            NS        0.38
Remove Quotations     4.26    100   0.73       4.16    88    0.60       -0.10    -1.04            NS        0.23
Increase Font Size    4.21    78    0.72       4.20    45    0.62       -0.01    -0.04            NS        0.01
Serif Font Style      4.09    58    0.70       3.88    24    0.67       -0.21    -1.29            NS        0.45

Building off of this, the second noteworthy element is the seemingly small sample size for each experiment. Though I explained this phenomenon above and provided justification for why we might not expect larger sample sizes from free-roaming user behaviors, the difficulty that this places on using these data for continuous improvement is that we seem to need an enormous amount of reader activity in order to collect a sufficient amount of optional self-report data for reliable testing. These results suggest that doing such work is feasible but that it takes time and a great deal of data, especially when data are collected in unobtrusive ways and focus on user perceptions rather than discrete behaviors. Using the "Technology Integration" chapter as an example, only 1.2% of original version readers and 2.0% of experimental flight readers answered the quality survey, which means that even though tens of thousands of users read the chapters, we still were not able to rely upon these users' data to provide sufficient evidence for improvement. This is further exacerbated by what is likely the low effect that each of these factors (on its own) has on overall perceptions of chapter quality, because smaller effects require larger sample sizes to reach significance, and if we are only conducting experiments that we expect to have small effects, then even relatively large datasets may leave us wanting for significance. Furthermore, if these data were to be used in ongoing continuous improvement efforts, authors and researchers would find themselves in the predicament of having to throw out previous data every time they made an iterative improvement, because the original version would no longer be a valid control. The upshot of this reality is that even with a large reader base, using optional self-report data to improve open textbooks may not be a feasible approach to continuous improvement (at least not until the reader base reaches hundreds of thousands of users or more), making it difficult for most authors to make meaningful, data-driven improvements to their textbooks.
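The relationship between effect size and required sample size can be made concrete with a standard normal-approximation calculation (a sketch; the alpha = .05 and 80% power settings are conventional defaults, not values taken from the study):

```python
# Rough per-group sample size needed to detect a standardized effect
# size d with a two-sided, two-sample test (normal approximation).
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Approximate n per group for detecting effect size d."""
    z = NormalDist().inv_cdf
    return 2 * ((z(1 - alpha / 2) + z(power)) / d) ** 2

# Small effects demand far more respondents than these chapters drew:
print(round(n_per_group(0.2)))  # roughly 390 per group for a "small" effect
print(round(n_per_group(0.5)))  # roughly 63 per group for a "medium" effect
```

With only 1-2% of readers answering the optional survey, reaching ~390 respondents per flight for a small effect implies tens of thousands of readers per chapter, which matches the feasibility concern raised above.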

To address both of these issues, future research and development efforts would likely benefit from three key practices. First, rather than doing testing at the individual chapter or even book level, these sorts of tests might best be explored at the platform level, where flights are created on all content to test for small changes. For instance, instead of removing stock photos on only the "Technology Integration" chapter, running a platform-wide flight of all chapters and programmatically removing stock photos for randomly selected users would allow platform developers to determine the value of stock photos for EdTech Books users broadly, with comparative swiftness. Similarly, a site-wide analysis of the effect that textual complexity has on reading likelihood reveals that likelihood goes down as complexity goes up, suggesting that as authors write chapters they should generally aim to simplify language (see Figure 5). The trade-off with this platform-level approach is a loss of context: not all chapters might benefit equally from the presence or absence of stock photos, due to different content and audiences, and some content might require greater textual complexity. Even so, it would at least provide platform developers with data-based guidelines for suggesting to authors what effects their decisions might be having on readers (e.g., "including more than three stock photos is predicted to reduce user quality perceptions of your chapters by 11.5%").

Figure 5

Relationship Between the Reading Grade Level of Chapters and Reading Likelihood

[Chart depicting the relationship between reading grade level and reading likelihood for chapters]

Note. Linear R² = 0.199
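One common way to implement such platform-wide flights is deterministic hash-based assignment, sketched below. The function name and scheme are hypothetical illustrations, not EdTech Books' actual implementation:

```python
# Hypothetical sketch of platform-wide flight assignment: hashing a
# user ID together with the experiment name gives each user a stable,
# pseudo-random variant across every chapter in the flight.
import hashlib

def assign_flight(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically map a user to one variant of one experiment."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).digest()
    return variants[digest[0] % len(variants)]

# The same user always lands in the same flight for a given experiment:
print(assign_flight("user-42", "remove-stock-photos"))
print(assign_flight("user-42", "remove-stock-photos"))  # identical
```

Because assignment depends only on the user and experiment identifiers, no per-user state needs to be stored, and different experiments assign independently.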

Second, many of these types of tests could potentially be automated not just at the random assignment phase but also at the implementation and continuous improvement phases. For instance, if a font-size experiment were implemented across an entire platform with a font-size increment of 10%, the system could create an experiment that increases font size by 10% for some random users, reduces it by 10% for others, and leaves it the same for the rest. This site-level test could continue until enough data were collected to determine which of the choices was optimal. In probability theory, this type of approach is called a "bandit algorithm," as it attempts to address the "multi-armed bandit problem" by maximizing positive outcomes (e.g., chapter reads, positive ratings) while simultaneously employing an exploratory mechanism to discover whether other options or features might improve results (Berry & Fristedt, 1985). Employing bandit algorithms to improve any design feature could utilize an essentially unlimited number of variables (e.g., different font sizes, types, or colors) in experimental ways that both produce actionable results and minimize undesirable outcomes. For many design decisions, this could allow continuous improvement to occur in an automated fashion without the need for authors or even developers to manually adjust designs in response to experimental results. Rather, the design of the platform could become self-correcting in many regards to account for ongoing user behaviors.
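A minimal epsilon-greedy sketch of such a bandit, applied to the font-size example, might look like the following. The three completion rates are invented for illustration:

```python
# Minimal epsilon-greedy bandit sketch for the font-size example:
# three "arms" (smaller, unchanged, larger) and a binary reward
# (e.g., the reader finished the page). Reward rates are invented.
import random

def epsilon_greedy(true_rates, rounds=2000, epsilon=0.1, seed=0):
    """Return per-arm pull counts and estimated reward rates."""
    rng = random.Random(seed)
    counts = [0] * len(true_rates)
    values = [0.0] * len(true_rates)
    for _ in range(rounds):
        if rng.random() < epsilon:                       # explore
            arm = rng.randrange(len(true_rates))
        else:                                            # exploit best estimate
            arm = max(range(len(true_rates)), key=values.__getitem__)
        reward = 1.0 if rng.random() < true_rates[arm] else 0.0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # running mean
    return counts, values

# Hypothetical completion rates for -10%, unchanged, and +10% font size:
counts, values = epsilon_greedy([0.3, 0.5, 0.8])
print(counts)  # the best-performing arm accumulates most of the traffic
```

The key property for continuous improvement is that the platform keeps serving the best-known option to most users while still spending a small fraction of traffic checking the alternatives.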

And third, though relying on self-report data like quality ratings may still have a place (especially in larger-scale analyses), more granular and faster improvements would need to rely upon unobtrusive user behavior data that are more universally collected. For instance, based on the textual complexity of a chapter and the time-on-page behaviors of a reader, we can estimate whether each user actually read the page. Using this as the dependent variable would mean that we would have experimental data for all learners rather than just the small subset that self-report data provides and would allow us to predict how experimental changes affect behaviors for all learners (e.g., does changing the font style influence the likelihood that a user will read the page?). Though this may limit our experiments in some ways, it would allow for rapid and continuous improvement (especially when coupled with the other suggestions above) that would not be readily possible while waiting for self-report data.
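A simple heuristic of this kind might compare time on page against an expected reading time derived from word count. The 238 words-per-minute rate and the 60% threshold below are illustrative assumptions, not parameters from the study:

```python
# Hedged sketch of inferring "actually read" from unobtrusive signals:
# compare time on page against the expected reading time for the
# chapter's word count (the wpm rate and threshold are assumptions).
def probably_read(word_count: int, seconds_on_page: float,
                  wpm: float = 238.0, threshold: float = 0.6) -> bool:
    """True if the visitor stayed at least `threshold` of the time an
    average silent reader would need for the chapter."""
    expected_seconds = word_count / wpm * 60
    return seconds_on_page >= threshold * expected_seconds

# A 2,000-word chapter takes over eight minutes at 238 wpm, so a
# 30-second visit is almost certainly a skim rather than a read.
print(probably_read(2000, 30))   # False
print(probably_read(2000, 400))  # True
```

Because this signal is available for every visit, it can serve as a dependent variable for all readers rather than the 1-2% who answer a survey.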

Furthermore, many of these possible dependent variables would likely be correlated with one another. For instance, conducting a simple post hoc bivariate correlation of quality measures, predicted reads, and textual complexity on all chapters in the platform with at least 10 quality ratings (n = 63) revealed significant, moderate relationships between these variables (see Table 2). This suggests that even if the primary goal is to improve the perceived quality of textbooks, movement toward this goal might be accomplished in part by engaging in efforts that seek to influence more easily measurable variables (like reading likelihood).

Table 2

Bivariate Correlations of Chapter Factors

                     Textual Complexity   Reading Likelihood
Quality Rating       .526**               .288*
Textual Complexity                        .415**

* Denotes significance at the p < .05 level.
** Denotes significance at the p < .01 level.
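The bivariate correlations reported above are standard Pearson coefficients; computed from the definition, the calculation looks like this (toy data only, not the study's):

```python
# Sketch of the kind of bivariate (Pearson) correlation reported in
# Table 2, implemented from the definition for illustration.
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Perfectly linear toy data correlates at r = 1.0:
print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))  # 1.0
```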

Conclusion

In conclusion, though the experiments presented in this study yielded non-significant results, the findings remain valuable for researchers and authors interested in engaging in data-driven continuous improvement efforts, for several reasons. First, this study points out the relative difficulty of engaging in these efforts at a granular level (e.g., at the chapter or resource level), especially when the resources that we are seeking to improve do not enjoy viral popularity. Rather, such efforts are likely best addressed at the system level, where experimental flights may be created with, randomized for, and aggregated from many different resources at once. Second, due to the relative simplicity of many of these experimental conditions, platform developers should explore automating not just the randomization aspect of A/B tests but also the actual implementation and experimental creation of tests, allowing the system to iteratively experiment, improve, and experiment again in valuable directions by employing bandit algorithms. And third, because these efforts rely upon unobtrusive data collection, continuous improvement will most effectively be influenced by data that can be collected from as many users as possible without relying upon low-probability participation mechanisms such as prompting users to answer a survey or provide a rating. Incorporating these suggestions into any open textbook continuous improvement effort would offer great promise for making the most of the user experience data that are readily available in many open platforms today. By doing so, the theoretically achievable goal is to create continuous improvement systems that are not only comparable to traditional publishing mechanisms but that far exceed them in ensuring the usefulness, usability, and perceived quality of open resources.

References

Berry, D. A., & Fristedt, B. (1985). Bandit problems: Sequential allocation of experiments (Monographs on Statistics and Applied Probability). Chapman and Hall.

Bloom, B. S. (1984). The 2 sigma problem: The search for methods of group instruction as effective as one-to-one tutoring. Educational Researcher, 13(6), 4-16.

Bodily, R., Nyland, R., & Wiley, D. (2017). The RISE framework: Using learning analytics to automatically identify open educational resources for continuous improvement. International Review of Research in Open and Distributed Learning, 18(2), 103-122. https://doi.org/10.19173/irrodl.v18i2.2952

Bozkurt, A., Koutropoulos, A., Singh, L., & Honeychurch, S. (2020). On lurking: Multiple perspectives on lurking within an educational community. The Internet and Higher Education, 44, 100709. https://doi.org/10.1016/j.iheduc.2019.100709

College Board. (2019). Published prices. Trends in Higher Education. https://trends.collegeboard.org/college-pricing/figures-tables/published-prices-national

Cook, B. G., & Therrien, W. J. (2017). Null effects and publication bias in special education research. Behavioral Disorders, 42(4), 149-158. https://doi.org/10.1111/ldrp.12163

Dinevski, D. (2008). Open educational resources and lifelong learning. In ITI 2008 - 30th International Conference on Information Technology Interfaces (pp. 117-122). IEEE.


Gütl, C., Rizzardini, R. H., Chang, V., & Morales, M. (2014, September). Attrition in MOOC: Lessons learned from drop-out students. In Learning Technology for Education in Cloud, MOOC, and Big Data (pp. 37-48). Springer, Cham.

Honebein, P. C., & Reigeluth, C. M. (2021). To prove or improve, that is the question: The resurgence of comparative, confounded research between 2010 and 2019. Educational Technology Research and Development, 1-32. https://doi.org/10.1007/s11423-021-09988-1

Kimmons, R. (n.d.). About EdTech Books. EdTech Books.https://edtechbooks.org/about

Kimmons, R. (2015). OER quality and adaptation in K-12: Comparing teacher evaluations of copyright-restricted, open, and open/adapted textbooks. The International Review of Research in Open and Distributed Learning, 16(5).

Kimmons, R. (2016). Expansive openness in teacherpractice. Teachers College Record, 118(9).

Kimmons, R., & Veletsianos, G. (2018). Public internet data mining methods in instructional design, educational technology, and online learning research. TechTrends, 62(5), 492-500. https://doi.org/10.1007/s11528-018-0307-4

Kohavi, R., & Longbotham, R. (2017). Online controlled experiments and A/B testing. Encyclopedia of Machine Learning and Data Mining, 7(8), 922-929.

Martin, M. T., & Kimmons, R. (2020). Faculty members' lived experiences with open educational resources. Open Praxis, 12(1), 131-144. https://doi.org/10.5944/openpraxis.12.1.987

Open Source Initiative. (n.d.). The open source definition.http://opensource.org/osd

Ottenbreit-Leftwich, A., & Kimmons, R. (2020). The K-12 educational technology handbook (1st ed.). EdTech Books. https://edtechbooks.org/k12handbook

Rivard, R. (2013, March 8). Measuring the MOOC dropout rate. Inside Higher Ed. https://www.insidehighered.com/news/2013/03/08/researchers-explore-who-taking-moocs-and-why-so-many-drop-out

Seaman, J. E., & Seaman, J. (2018). Freeing the textbook: Educational resources in U.S. higher education, 2018. Babson Survey Research Group. https://www.onlinelearningsurvey.com/reports/freeingthetextbook2018.pdf

Stallman, R. (2013). FLOSS and FOSS. GNU Operating System. https://www.gnu.org/philosophy/floss-and-foss.html

UpGrade. (n.d.). UpGrade: An open source platform for A/B testing in education. Carnegie Learning. https://www.carnegielearning.com/blog/upgrade-ab-testing/

Veletsianos, G., Kimmons, R., Larsen, R., & Rogers, J. (2021). Temporal flexibility, online learning completion, and gender. Distance Education, 42(1). https://doi.org/10.1080/01587919.2020.1869523

Whitford, E. (2018, July 26). Textbook trade-offs. Inside Higher Ed. https://www.insidehighered.com/news/2018/07/26/students-sacrifice-meals-and-trips-home-pay-textbooks

Wiley, D. (n.d.). Defining the "open" in open content and open educational resources. Iterating Toward Openness. http://opencontent.org/definition/

Wiley, D., Strader, R., & Bodily, R. (2021). Continuous improvement of instructional materials. In J. K. McDonald & R. E. West (Eds.), Design for learning: Principles, processes, and praxis. EdTech Books. https://edtechbooks.org/id/continuous_improvement

Woodward, S., Lloyd, A., & Kimmons, R. (2017). Student voice in textbook evaluation: Comparing open and restricted textbooks. The International Review of Research in Open and Distributed Learning, 18(6). https://doi.org/10.19173/irrodl.v18i6.3170


Applying the Design of Narrative Distance in Instruction

Stephan Taeger

The field of instructional design has a history of exploring the possibilities of narrative in instruction. One aspect of narrative that has not received significant attention is the relationship between the indirect nature of narrative (narrative distance) and its power to create transformative experiences. This article builds upon Taeger and Yanchar's (2019) qualitative study of storytelling experts by offering practical applications of the indirect nature of story in instruction. Numerous examples and design patterns are offered in order to illustrate how instructional designers (IDs) can use the potentially transformative effect of narrative distance.

Introduction

Humans are storytelling creatures. Not only do we engage and understand the world through narrative (Bruner, 1990; Polkinghorne, 1988; Young & Saver, 2001), we can be invited to change in significant ways through stories (Green & Brock, 2000; Kaufman & Libby, 2012). The effects of incorporating elements of narrative into instructional design have been explored in a variety of ways. For instance, the use of narrative or storytelling techniques has been discussed in contexts such as online learning (Hirumi et al., 2012; Lindgren & McDaniel, 2012), storification (Akkerman et al., 2009), interactive storytelling (Baldwin & Ching, 2017), the creation of design stories (Parrish, 2006), audio instruction (Carter, 2012), narrative-centered learning environments (Rowe et al., 2011), and problem solving (Dickey, 2006; Jonassen & Hernandez-Serrano, 2002).

One way that narrative may inform instructional design is by creating what Wilson and Parrish (2011) call transformative learning experiences. According to Wilson and Parrish, a transformative learning experience (TLE) "results from an especially meaningful engagement with the world that leaves a lasting impact on a person's sense of themselves and their relationship to a subject matter" (p. 12). In this article, TLEs are understood to occur in varying degrees; they could be as small as helping learners become more patient with coworkers or as large as inspiring learners to be more environmentally conscious. Taeger (2019) argues that narrative distance, or "the cognitive or emotional space afforded by indirect communication" (p. 2), can facilitate transformative learning experiences (Brothers, 2003). This effect is similar to the experience of being challenged by a piece of literature, a movie, or a play to make significant changes in one's life. A central feature of narrative distance is its indirectness; the author (or director, etc.) does not directly ask readers to reconsider their beliefs and behaviors. Rather, the messages and invitations are inherent to the story itself. Since the story is indirect, the listener is less defensive (Warner, 2001) and is free to decide how to incorporate the message of the story into his or her own life (Craddock, 2002).

Designing for Narrative Distance

To better understand the affordances created by narrative distance, I participated in a qualitative study (Taeger & Yanchar, 2019) interviewing six storytelling experts in different fields. This study revealed a variety of principles and practices that help incorporate narrative distance into instruction that is designed to create transformative learning experiences. In this article, I expand further on the implementation of narrative distance in instructional design by offering practical design principles and fictional examples, based on Taeger and Yanchar's work, that are intended to help designers use narrative distance in practice. In addition, most of the examples I discuss illustrate narrative distance in online learning settings. The headings and subheadings below are quoted from the themes and ideas presented in Taeger and Yanchar's research.

Cognitive Space

Granting a learner cognitive space allows the student to interpret aspects of the learning material for themselves. As one participant in Taeger and Yanchar's (2019) study said, "[d]on't teach me a lesson. Tell me a story, and if there's a lesson in it, that'll seep in" (p. 170). This insight mainly applies to learning content meant to create TLEs, as opposed to material that students need to learn more directly (e.g., operating heavy machinery or memorizing physics formulas). It should be noted that granting cognitive space is not primarily expressed in learning activities, but in the learning content itself. For example, asking students to answer questions about a case study is not necessarily using cognitive space, because the case study may have been written in a way that did not invite much room for interpretation.


Use Concrete Material Without Abstract Moralizing

One of the ways to maintain cognitive space is to avoid using language that contains abstract morals, values, or principles during the portions of instruction intended to use narrative distance. For example, a corporate sales training would avoid phrases such as "working your hardest always brings the best results" or "effective salespeople are organized and diligent." Although these phrases may be helpful on some level, they do not fit well with material designed to create narrative distance because the learner is not given space for interpretation. This principle is based on the assumption that to be inspired to change, one must feel some sort of ownership in that process. Simply offering vivid and compelling concrete examples of salespeople who are diligent, organized, and hardworking in training material, without abstract moralizing, allows for more depth in interpretation (Allen, 2008) and gives the learner the cognitive space to apply the content to themselves. This does not necessarily mean that instruction should contain a series of loosely related anecdotes or images. Rather, concrete material can be strategically placed alongside more traditional instruction.

For example, suppose an online course contains five principles for becoming a better leader in the workplace. These five principles are not simply techniques; they stretch and challenge learners in potentially transformative ways. In typical instructional fashion, the course contains definitions, examples, opportunities for practice, and assessments. However, at the end of the online course, a video appears of actors depicting all five of the principles just discussed in the training. Instead of the online training indicating that the learner should look for the five principles in the video or that the learner will be tested on how the five principles were manifested in the video (assuming that adequate assessment has already occurred), simply presenting the video at the end of the training invites the learner to discover the principles for himself or herself. A quality feature film does not warn a viewer what he or she should look for in a scene; it assumes that the viewer can make sense of what is depicted. Since the learner is not told how to interpret this final scene in the training, the learner is invited to do so. As Craddock (2002) argues, "[t]he hearer is free, and yet the response permitted is a response demanded" (p. 106).

A design pattern for implementing this aspect of narrative distance is to (a) identify ways learners could change; (b) create or find concrete illustrations of those changes; (c) refrain from abstract moralizing during this portion of instruction; and (d) insert these concrete illustrations in strategic places where learners will be able to make intended connections without explicit instructions to do so.

Use the Behavior of Characters to Communicate Meaning

In the last section, I contended that explicitly stating the message of an illustration or concrete example can limit cognitive space. Presumably, this moralizing would occur before or after an illustration is offered. Incorporating the technique of using the behavior of characters to communicate meaning helps prevent an instructional designer (ID) from creating an illustration that would inherently violate a learner's cognitive space. In other words, some instructional stories or illustrations so obviously contain a message that the learner is not given the opportunity or motivation to interpret the story.

Suppose a university learning center creates an online training for adjunct professors to introduce them to the various responsibilities associated with working in a higher educational setting. The IDs use direct instruction to explain the university's learning management system, parking instructions, and grading procedures. However, some of the material is designed to be more transformative in its approach because it is intended to inspire the adjunct professors to act professionally as they interact with students. In order to reach this learning objective, the IDs decide to utilize narrative distance by inserting two stories using the same fictional characters at both the beginning and end of this section in the online training module. The fictional example consists of a meeting between an adjunct professor (Dr. Thompson) and a student (Ashley) to review a recently administered exam. As students watch this fictional example, there are no instructions to look for ways that the adjunct professor could have treated the student more professionally (those principles will be covered during the traditional instruction using different examples). Furthermore, cognitive space is also maintained throughout the example because the IDs have the characters communicate indirectly through their behavior. In other words, rather than having a narrator say, "As Ashley entered Dr. Thompson's office, she noticed that he didn't treat her with much respect or kindness," the narrator says, "As Ashley entered the office, Dr. Thompson said, 'Hello, how are you today?' as he continued to type on his computer." Later in the example, instead of the narrator saying, "Dr. Thompson was obviously not listening very intently," the narrator mentions that Dr. Thompson kept glancing at his computer while Ashley was speaking. Since the message is communicated indirectly by showing how Dr. Thompson was acting, the learners have the cognitive space to make sense of this behavior for themselves.

The design pattern for using this aspect of narrative distance consists of (a) asking which attributes, behaviors, or values the learner could acquire; (b) identifying the behaviors that someone with those attributes manifests; (c) finding or creating examples where the characters demonstrate or fail to exhibit those behaviors; and (d) refraining from any material (such as narration, description, etc.) that would describe those attributes in any other way besides behaviorally.

Use Moral Ambiguity

When a character or portion of instruction contains a mixture of moral viewpoints, it is morally ambiguous. For example, an adjunct professor who welcomes a student with a smile into his office, but also keeps glancing at his computer, depicts a complex human being with both admirable and less than commendable characteristics. The previous section emphasized using behavior to communicate meaning to maintain cognitive space. Without using moral ambiguity, however, the behaviors manifested by the character would appear simplistic, and the message would become as obvious as if the narrator said, "Consider how Dr. Thompson treats his student unprofessionally." Moral ambiguity keeps a learner guessing (Lowry, 2001) and invites him or her to make a potentially transformative decision about their own moral framework.

Suppose an ID creates an online course intended for middle school students. The course includes an immersive narrative learning environment (Dettori & Paiva, 2009) in which students can control an avatar through a typical week at school. At various points during the week, the learner encounters other students inviting him or her to participate in undesirable behavior (e.g., underage drinking, cheating, bullying, etc.) and is then trained on how to handle such situations. However, throughout the online training module, these fellow students the learner encounters also demonstrate admirable qualities. For example, the peer who invites the learner to participate in underage drinking is a good student, is a loyal friend, and is genuinely funny. In this example, not only are the students taught various techniques and principles for handling difficult situations, they also are given enough cognitive space to make a decision about the behaviors they are invited to participate in. The moral ambiguity demonstrated by the characters in the training prevents the learner from easily deciding if the characters' behaviors are always acceptable.

The design pattern for using moral ambiguity to create narrative distance could be described as (a) identifying the specific ways that learners could change; (b) finding or creating concrete examples that manifest the change; and (c) giving or highlighting characteristics of these characters that are also admirable in the eyes of the learner.

Emotional Space

Whereas cognitive space allows someone to decide how to make sense of learning material, emotional space is an aspect of narrative distance that gives someone the room to decide how they will experience the material. As opposed to feeling manipulated, the learner feels "respected, valued, safe, and understood" (Taeger & Yanchar, 2019, p. 172). This aspect of narrative distance is important in creating TLEs because learners are less likely to be influenced if they are pressed upon emotionally (Craddock, 2002). As one of the participants in Taeger and Yanchar (2019) said:

[T]he word that most people use is "manipulative." You're just trying to manipulate me... if we try to take shortcuts... then it's the same thing as just using a lot of violins in a musical soundtrack. Sweeping violins to create an emotion that's not really being represented. (p. 173)

Use Authentic Material

McDonald (2009) states, "[a]uthenticity helps viewers feel empathy for characters, and recognize themselves (their emotions and their reactions) in those characters" (p. 117). Goldsworthy and Honebein (2010) also argue that learners can only connect with instructional stories that seem authentic. Inauthentic stories are less likely to influence learners because they appear manipulative. For example, if a character in a safety training movie acts in ways that do not seem authentic, the student can sense that the creators of the film are trying to get a message across as opposed to offering a compelling story.

Although Taeger and Yanchar (2019) suggest multiple principles related to narrative distance that can help IDs create authentic learning material, only three are emphasized here: (a) "avoid simplifying human conflict"; (b) "avoid expressing emotion through dialogue"; and (c) avoid changing "how a character would act in order to serve the story" (p. 178). This section will further explore these three principles for the purpose of illustrating how to create authentic learning material that allows for emotional space during instruction.

Suppose the training department for a large organization is assigned the responsibility to implement an online training regarding new environmental regulations. The IDs not only want to explain the new environmental policies the company has decided to adopt; they also want to help create a culture of environmental responsibility, and thus they need to construct a training that is transformative. In order to take advantage of the affordances of narrative distance, the IDs decide to weave a story consisting of still photos and narration throughout the online training. The story is designed to both illustrate the learning material and inspire the employees to become more environmentally conscious. As the story unfolds, learners are never directly invited to make connections. Instead, they are given the space to make sense of the story for their own situations.

The Journal of Applied Instructional Design, 10(2) 18

Avoid Simplifying Human Conflict

In order to maintain narrative distance so that the learners are not emotionally “taken out” of the story by the obviousness of the message, the IDs seek to use principles of authentic storytelling. For example, one of the scenes depicts two employees standing by a copy machine. The narrator says that one of these employees has noticed over a period of time that the other employee is wasteful in the amount of copies he or she produces. The IDs avoid simplifying the conflict in the story by depicting the nuances involved in asking a coworker to abide by certain work policies. These IDs do not add narration that says something like, “although it was hard for Jessica to ask Miranda to stop making so many copies, she worked up the courage and asked her.” Instead, the narrator describes how Jessica does not want to appear demanding or how she unsuccessfully tried to hint at her concern to Miranda in the past. Perhaps the IDs could even have Miranda verbally agree to change, but then fail to reflect that change in her behavior. Regardless of how the IDs eventually show the resolution to this encounter, they should first show some of the difficult facets of human conflict so that the story feels more authentic. As mentioned above, without that authenticity, it is difficult for learners to be emotionally open enough to be influenced by a story.

Avoid Expressing Emotion Through Dialogue

Continuing with the previous example, it might be difficult for learners to consider the story authentic if Jessica had said, “I become really upset when you make so many copies,” or if Miranda had responded by saying, “I’m so angry because you are telling me what to do.” As argued above, meaning is often communicated most effectively through the behaviors of the characters in a story.

Do Not Change How a Character Would Act in Order to Serve the Story

When stories are used to teach, instructors might feel a need to have the characters act in ways that express the message of the story. This can lead to characters behaving in ways that do not seem consistent with how they should act, and thus the story appears inauthentic. Parrish (2007) argues that “while plot is primary, plot must arise from character and not merely be imposed on characters” (p. 521). If the scene with Jessica and Miranda ended with Miranda saying something like, “Okay, I’ve learned my lesson. I’ll try to be more environmentally conscious,” the learner probably would not have considered the dialogue realistic. Instead, perhaps Miranda would have complied with Jessica’s suggestion, but only reluctantly after administrative pressure. Of course, characters in instructional stories often need to learn lessons, but it should happen in ways that reflect how the characters in the story would authentically change.

A simple design formula for this kind of authentic character change is to (a) identify which lessons a character needs to learn that reflect the learning objectives; (b) consider which types of factors (e.g., logic, experience, social influence, authority) would invite that character to change; and (c) find ways to naturally include those changes in elements of the story.

Avoid Unearned Emotion

Some participants in Taeger and Yanchar (2019) spoke of emotional experience in a story as needing to be “earned” (p. 173). Occasionally, stories attempt to create an emotional experience without enough context for the experience to feel natural. IDs can avoid this problem by giving learners enough time and background to invest in a character and the character’s associated struggles (Rollings & Adams, 2003). When attempting to design an instructional story in this way, it is helpful to remember to keep other aspects of narrative distance central to the design process because context itself will not be enough to prevent a potential moment of unearned emotion.

Suppose IDs for a government agency are developing an online training that instructs new employees regarding their responsibilities in helping those in lower socioeconomic situations apply for and receive government housing. The IDs want the training to cover more than just teaching the new employees the process of obtaining the right information and documents from those needing government assistance. The IDs want the training to be a transformative learning experience. In order to do this, they strategically place documentary-style video clips throughout the training of an immigrant family who first struggle to find employment and then eventually receive assistance for government housing. The clips follow the same family throughout the module so that learners have time to invest in the family’s struggles and the complexities the family faces by coming to a new country, seeking employment, learning a new language, and so forth. When the training is completed, the learners watch a clip of the family entering into new government housing. Since the learners know how meaningful this would have been to this particular family, they have the emotional space to experience this event as they watch the family gratefully occupy the new residence for the first time.

To design for this kind of experience, IDs can (a) identify moments during instruction that are intended to create a powerful emotional effect; (b) identify what necessary background information learners need to fully invest in a story or topic; and (c) locate strategic places to include this kind of information in earlier portions of the instruction.

Invite Change

For narrative distance to help create transformative learning experiences, cognitive and emotional space should be blended with aspects of instruction that also invite the learner to make changes. In this section, I will continue to draw upon Taeger and Yanchar (2019) to offer three ways to invite learners to change that align with the indirect nature of narrative distance.

Help Learners Change Vicariously Along with Characters

Kaufman and Libby (2012) demonstrate how readers can change their behavior to match that of a character in a story if the readers have vicarious experiences with a particular narrative. Similarly, as one of the participants said in Taeger and Yanchar’s (2019) study, “in a novel, the character is the one who changes and the reader changes vicariously with that character” (p. 175). IDs can tap into this transformative aspect of narrative distance by creating or using stories in which characters move from one set of beliefs or values to new ones. When learners are invited to transform in this way, two aspects of narrative distance are manifested alongside the invitation to change. First, learners are given the cognitive space to decide how they will identify with characters. Second, learners are granted emotional space because the characters (real or fictional) first manifest the beliefs or values of the learners and then come to new discoveries or insights. In other words, learners feel understood as they encounter characters who hold their same beliefs, but learners are also invited to change as those characters shift in values.

For example, suppose a consulting company asks a group of IDs to help update an online time management seminar. They use narrative distance by first inserting a video of a former student of the time management seminar emphasizing how poorly he or she previously managed their time. However, rather than just briefly stating how hard it was to organize time, the former student also takes a few minutes to explain different attempts he or she made to improve their time management skills. Furthermore, the student explains the stress that was associated with always being behind in his or her work, the difficulty of losing track of tasks, and the sense that he or she consistently spent time on things that were not valuable. In addition, the IDs ask the former student to emphasize the details of any personal reservations that he or she may have had prior to planning on a regular basis. These reservations could include believing that planning regularly would stifle individual freedom or that it would waste valuable time that could be used working. After the student describes these previous views, the student also begins to describe the process of coming to believe that planning regularly has positive benefits that outweigh any negatives. As learners listen to this first-person narrative, they are invited to vicariously join the interviewee in shifting their own views on the subject.

A simple design pattern for creating vicarious change is to (a) identify what learners currently believe about a given issue; (b) identify what new beliefs or values the learners could adopt; and (c) create or identify an instructional story where a character shifts from the old viewpoints to the new ones.

Define the Learner in a Way That Connects Them to a Larger Narrative

Besides using a specific story to invite change in learners, another indirect way to invite change is by naming or defining learners within a larger narrative. By larger narrative I mean a series of connected stories, events, rituals, or myths that grow out of past historical events and inform one’s identity. For example, someone indirectly identifies himself or herself within a larger narrative when he or she says “I am American,” or “I am an impressionist painter,” or “I am a Boston Red Sox fan.” Each of these terms can imply a larger set of historical cultural practices, stories, and values. IDs can indirectly invite learners to change by naming learners as members of these larger narratives.

Suppose a corporation asks its training department to create a small online training explaining how members of the company can donate to a local charity. After demonstrating the procedures for donating, the online training includes a section that explains how the company has had a long and unique history of donating to charitable causes in ways that exceed the norm. For example, the training depicts an employee saying, “In company x, we do things differently. One of our first priorities is to use our resources to help those who need it most.” The employee (in this fictional example) used a declarative statement as opposed to an imperative one (Taeger & Yanchar, 2019). She did not say, “You should give to this charity because you are a member of this company.” Instead, she simply described what members of that company have had a history of doing. This is an indirect way of inviting learners to see themselves as participants in this corporation’s larger narrative.

It should be noted that learners should want to identify with these larger narratives. This technique will most likely work when the history and tradition of larger narratives matter in meaningful ways to the learners. For example, it may or may not mean very much to a professor to be identified with the history and culture of the university where she works. A design pattern for using this aspect of narrative distance is to (a) identify which larger narrative learners could be given an opportunity to identify with; (b) decide if that larger narrative is already meaningful to learners; (c) if it is not meaningful, seek to find ways to help the larger narrative become meaningful and attractive to learners; and (d) use declarative statements as opposed to imperative ones in giving learners a chance to identify with a larger narrative.

Meaningful Content

Meaningful content is learning material that matters significantly to the student. Without this, narrative distance will not lead to a TLE because learners will not interpret indirect invitations to change as something that is connected to their world.

Create Identification

Identification means that learners see their own thoughts, feelings, and behaviors reflected in the learning material. In addition, identification is connected to the “emotional proximity” one feels towards characters (Dickey, 2006, p. 251). When combined with narrative distance (emotional and cognitive space), identification can occur indirectly (Craddock, 2002).

The following illustration from Warner (1986) about self-deception demonstrates how a story can contain poignant moments of identification which invite students to see themselves in the learning material. The illustration describes the true story of a husband awakened in the middle of the night by his crying baby:

At that moment, I had a fleeting feeling, a feeling that if I got up quickly I might be able to see what was wrong before my wife would have to wake up. I don’t think it was even a thought because it went too fast for me to say it out in my mind. It was a feeling that this was something I really ought to do. But I didn’t do it. I didn’t go right back to sleep either. It bugged me that my wife wasn’t waking up. I kept thinking it was her job. She has her work and I have mine. Mine starts early. She can sleep in. Besides, I was exhausted. Besides that, I never really know how to handle the baby. Maybe she was lying there waiting for me to get up. Why did I have to feel guilty when I’m only trying to get some sleep so I can do well on the job? She was the one who wanted to have this kid in the first place. (p. 39)

When an illustration genuinely reflects the thoughts, feelings, and behaviors of learners, there is no need to directly point out ways they should connect with the learning material. In order to better echo the thoughts and feelings of learners, IDs can use methods of learner analysis that increase their ability to empathize with the learner (for more on these methods, see Parrish, 2006). A simple design pattern for creating identification that maintains narrative distance is to (a) empathize with how learners genuinely think, feel, and act and (b) find ways to indirectly reflect these thoughts, feelings, and behaviors through stories, anecdotes, characters, descriptions, etc.

Do Not Change the Characters with Whom Learners Are Intended to Identify

If it is intended that learners identify with learning content, it can be helpful to have them identify with just one character. Of course, it may be effective to use a story that contains multiple lessons to be learned from various characters in the same story. However, focusing mainly on just one character or viewpoint allows learners to become fully invested in that character and thus can produce a more meaningful emotional or experiential payoff later in instruction. As one participant said in Taeger and Yanchar’s (2019) study, “A good film story is about one person... because if it’s only about one person that gets you to identify emotionally, much more intensively” (p. 176).

Suppose administrators at a hospital hire an instructional design team to create an online training about hospital security. Beyond describing basic security procedures, the IDs also want to create a transformative learning experience that will inspire the employees to catch the vision of the importance of hospital security. Therefore, the IDs decide to interview different exemplary employees about their security practices. However, to create meaningful instruction, they decide to follow one employee closely over a period of time to get many interviews and shots of this particular employee at work. Although the training will ultimately contain different video clips from a variety of employees, the training focuses on weaving one central employee throughout the online learning modules. Ideally, this will invite learners to connect to one central character and thus create a more meaningful experience. A simple design pattern to create this effect is to (a) identify characters or people that are compelling enough to be a central focus of instruction and (b) seek to use this one character to illustrate as many of the learning objectives as possible.


Narrative Distance in a Variety of Settings

Although this article focuses mainly on using narrative distance in online learning settings, it might be helpful to briefly discuss how narrative distance can be used effectively in face to face, synchronous, and blended learning environments. The basic principles for using narrative distance with any delivery method include giving students time to ponder the learning material for themselves and minimizing student opportunities to hear other interpretations of content meant to create narrative distance.

For example, in face to face environments, it might be effective to not have regular classroom discussions on the portions of instruction that contain narrative distance. If students know that other classmates or the teacher will do the hard work of clarifying and applying the learning material, they may feel less motivated to do so themselves (Taeger, 2019). This may be difficult for some IDs because it takes a certain amount of trust that the benefits associated with narrative distance are working. However, if IDs feel that the students should have more time to consider or think deeply about certain portions of instruction containing narrative distance, they can assign individual writing assignments or journal entries. These kinds of activities create some of the benefits of discussion without losing the affordances of narrative distance. In synchronous environments, the use of narrative distance will be very similar to that of face to face settings. Perhaps chat boxes should be turned off during portions of the lesson that contain narrative distance so that students do not prevent one another from interpreting the learning material for themselves through discussion.

Blended learning environments provide unique opportunities for the use of narrative distance. IDs can use moments of narrative distance during in-person class time that students can then discuss online. This allows students time to ponder the learning material for themselves and also gives learners the benefits of formal reflection through a writing activity. In addition, if students are not allowed to see each other’s comments until their own comments are posted, this makes it more likely that the students will interpret the material individually. Moreover, it will also allow students to compare and contrast their thoughts with those of other students.

Conclusion

This paper sought to build upon Taeger and Yanchar’s (2019) inquiry on designing for narrative distance by further illustrating the use of these principles in instructional design practice. These principles were discussed in order to illustrate how one can better create transformative learning experiences (Wilson & Parrish, 2011). Further research into the effectiveness of the principles mentioned above could help IDs understand how to better incorporate narrative distance into instruction. Creating instruments that measure student perception of cognitive space, perception of emotional space, the invitation to change, and the level of meaningfulness in instruction would also help IDs know which methods are best at creating narrative distance. Simple qualitative longitudinal studies based on the TLE framework offered in Wilson and Parrish (2011) could also indicate how transformative the instructional experience was over the long term.

When narrative distance is used for the purpose of creating TLEs, it can mimic the experience of a profound film or inspiring piece of literature (Taeger, 2019). The ideas offered here represent a limited number of ways to design for narrative distance; there are many more potential avenues for creating this effect in instruction. Not all instruction can be transformative, but when learning objectives call for that purpose, narrative distance can help IDs move in that direction.

References

Allen, R. J. (2008). Theology undergirding narrative preaching. In M. Graves & D. J. Schlafer (Eds.), What’s the shape of narrative preaching? (pp. 27-40). Chalice Press.

Akkerman, S., Admiraal, W., & Huizenga, J. (2009). Storification in history education: A mobile game in and about medieval Amsterdam. Computers & Education, 52, 449-459. https://edtechbooks.org/-QZK

Baldwin, S., & Ching, Y. (2017). Interactive storytelling: Opportunities for online course design. TechTrends, 61, 179-186. https://doi.org/10.1007/s11528-016-0136-2

Brothers, M. A. (2003). The role of “distance” in preaching: A critical dialogue with Fred Craddock and postliberal homiletics (Unpublished doctoral dissertation). Princeton Theological Seminary, Princeton, NJ.

Bruner, J. S. (1990). Acts of meaning. Harvard University Press.

Carter, C. W. (2012). Instructional audio guidelines: Four design principles to consider for every instructional audio design effort. TechTrends, 56, 54-58. https://doi.org/10.1007/s11528-012-0615-z

Craddock, F. B. (2002). Overhearing the gospel. Chalice Press.

Dettori, G., & Paiva, A. (2009). Narrative learning in technology-enhanced environments: An introduction to narrative learning environments. In N. Balacheff, S. Ludvigsen, T. de Jong, A. Lazonder, & S. Barnes (Eds.), Technology-enhanced learning: Principles and products. Springer. https://edtechbooks.org/-zXwb

Dickey, M. D. (2006). Game design narrative for learning: Appropriating adventure game design narrative devices and techniques for the design of interactive learning environments. Educational Technology Research and Development, 54, 245-263. https://doi.org/10.1007/s11423-006-8806-y

Goldsworthy, R., & Honebein, P. C. (2010). Learner stories and the design of authentic instruction. Educational Technology, 50, 27-33.

Green, M. C., & Brock, T. C. (2000). The role of transportation in the persuasiveness of public narratives. Journal of Personality and Social Psychology, 79, 701-721. https://doi.org/10.1037/0022-3514.79.5.701

Hirumi, A., Sivo, S., & Pounds, K. (2012). Storytelling to enhance teaching and learning: The systematic design, development, and testing of two online courses. International Journal on E-Learning, 11, 125-151. https://eric.ed.gov/?id=EJ972179

Jonassen, D. H., & Hernandez-Serrano, J. (2002). Case-based reasoning and instructional design: Using stories to support problem solving. Educational Technology Research and Development, 50, 65-77. https://doi.org/10.1007/BF02504994

Kaufman, G. F., & Libby, L. K. (2012). Changing beliefs and behavior through experience-taking. Journal of Personality and Social Psychology, 103, 1-19. https://doi.org/10.1037/a0027525

Lindgren, R., & McDaniel, R. (2012). Transforming online learning through narrative and student agency. Educational Technology & Society, 15, 344-355. https://edtechbooks.org/-GrhX

Lowry, E. L. (2001). The homiletical plot: The sermon as narrative art form. Westminster John Knox Press.

McDonald, J. K. (2009). Imaginative instruction: What master storytellers can teach instructional designers. Educational Media International, 46, 111-122. https://doi.org/10.1080/09523980902933318

Parrish, P. E. (2006). Design as storytelling. TechTrends, 50, 72-82. https://doi.org/10.1007/s11528-006-0072-7

Parrish, P. E. (2007). Aesthetic principles for instructional design. Educational Technology Research and Development, 57, 511-528. https://doi.org/10.1007/s11423-007-9060-7

Polkinghorne, D. E. (1988). Narrative knowing and the human sciences. SUNY Press.

Rollings, A., & Adams, E. (2003). Game design. New Riders.

Rowe, J. P., Shores, L. R., Mott, B. W., & Lester, J. C. (2011). Integrating learning, problem solving, and engagement in narrative-centered learning environments. International Journal of Artificial Intelligence in Education, 21, 115-133. https://doi.org/10.3233/JAI-2011-019

Taeger, S. D. (2019). Using narrative distance to invite transformative learning experiences. Journal of Research in Innovative Teaching & Learning. https://doi.org/10.1108/JRIT-05-2019-0057

Taeger, S. D., & Yanchar, S. C. (2019). Principles and practices of designing narrative distance for transformative learning experiences. Educational Media International, 56(2), 164-181. https://doi.org/10.1080/09523987.2019.1614322

Warner, C. T. (1986). What we are. BYU Studies Quarterly, 26(1), 39-63.

Warner, C. T. (2001). Bonds that make us free: Healing our relationships, coming to ourselves. Shadow Mountain.

Wilson, B. G., & Parrish, P. E. (2011). Transformative learning experience: Aim higher, gain more. Educational Technology, 51, 10-15.

Young, K., & Saver, J. L. (2001). The neurology of narrative. SubStance, 30, 72-84. https://doi.org/10.2307/3685505


Enabling Interactivity through Design: Outcomes from a Gamified Health Insurance Onboarding Course

Nicole Buras, Lauren Merrild, & WooRi Kim

The purpose of this study was to examine new hires’ learning outcomes and perceptions of an interactive gamified e-learning course in a health insurance organization. The conceptual framework drew from literature surrounding the Understanding by Design process and the relationship between engagement and interactivity in e-learning. The researchers also explored the role of course design to support interactivity. This article employed a pre-test, post-test, and a survey administered to 121 new hires to examine learning outcomes and perceptions of a gamified e-learning course. To provide an in-depth understanding of the quantitative data, the authors followed up with 10 semi-structured interviews. Results from this study highlighted that participants experienced high levels of engagement and understanding of foundational insurance information.

Introduction

Asynchronous web-based learning is a prevalent workplace learning strategy used by learning and development professionals (Lester et al., 2013) who turn to the modality to adapt to changing technology and a flexible workforce (Akdere & Conceicao, 2006; Long & Smith, 2003). Within this context, gamification has emerged as a popular tool for enhancing learner engagement in both instructor-led and web-based courses (Alsawaier, 2017; Calvo & Reio, 2018; Jabbar & Felicia, 2015). There is limited research, however, on the effect of gamification on engagement and knowledge gains. Additionally, there is limited research on design strategies to enable engagement in asynchronous gamified courses. To add to the body of literature, the authors explored the impact of a gamified course design on engagement and knowledge gained in an asynchronous gamified course delivered in a health insurance organization.

Gamification is the practice of incorporating elements of game design into a course to increase engagement and motivation (Dichev & Dicheva, 2017). Game elements include, but are not limited to, challenges, points, leaderboards, leveling up, and badges. A course is “gamified” when it incorporates elements of games. For this reason, gamification and gamified courses are treated as companion terms hereafter. The authors employed gamification techniques in a course titled Member Journey that was delivered asynchronously to new hires on their first day of functional job-based training. The gamification techniques used included challenges, achievements, leveling up, rules, and goals.

The course covered the customer life cycle from a customer’s perspective. By the end of the course, learners were expected to be able to define terminology, categorize business units, and order steps in the customer experience. In the course, two guides helped the learners through four levels of challenges with increasing complexity. These challenges were presented as destinations in the Member Journey. Learners received stamps in their passports when a new destination was reached. At each destination, the learners completed a challenge. When they passed, they were able to “level up” to the following destination. Each new destination contained a challenge requiring prerequisite knowledge from the previous destinations. The learner’s goal was to accumulate all stamps in their passports. To explore the impact of course design on engagement and performance, the authors incorporated features to enable the gamification in the course, such as narrative, plot, rules, increasing complexity, and characters.

The conceptual framework drew from literature addressing the relationship between engagement and game elements in e-learning and how course design promotes greater engagement and understanding of course content in gamified courses. This study is beneficial for field practitioners and scholars who are interested in strategies to enhance gamification approaches.

Literature Review

Importance of the Design Process

Technology-based learning represents an ongoing, growing trend in businesses (Ho, 2017); however, critical discussions around the effectiveness of different delivery methods have created several classes of thought (Bell & Federman, 2013). One class of thought identified by Bell and Federman (2013) maintains that the effectiveness of e-learning instruction is dependent upon the rigor of the instructional design process. The perception of this group is that the most appropriate delivery method varies based on the learning context, content, and audience (Bell & Federman, 2013). The authors also believe that pedagogical approaches are an avenue to learning and not learning itself, and that the success of a learning approach is dependent upon a thorough instructional design process that considers organizational objectives, audience, technology, and learning environment.

The Relationship Between Engagement and Interactivity in E-Learning

If e-learning is determined to be an effective approach, it is important to consider participant engagement. Engagement in learning refers to the level of positive emotional response and commitment a learner feels while completing learning tasks (Young, 2010). This is critical because research has indicated that engagement in learning influences critical thinking, determination, and performance (Young, 2010). Performance, in this context, refers to competence in a role or task and determination to complete role responsibilities or tasks.

Research supports the idea that increased learner engagement can lead to increased knowledge gains (Bloom, 1956; Nkhoma et al., 2014). For some researchers, learning engagement has such a significant positive effect on knowledge and performance that the level of engagement influences the breadth and depth of the knowledge gained (Nkhoma et al., 2014). For others, the effects of learning engagement can be captured in a hierarchy of engagement levels and outcomes (Bloom, 1956).

An emergent area of research centers on the need for interactivity to promote engagement in e-learning courses (Zhang & Zhou, 2003). In a comparison of learners in an interactive e-learning environment to those in a traditional classroom environment, Zhang and Zhou (2003) found that interactive courses increased flexibility and engagement and achieved higher levels of knowledge gains and participant satisfaction.

Interactivity in e-learning may refer to any action in which the learner influences the learning program through inputs or actions, such as text entry, clicks, and rollovers. Additionally, interactivity goes beyond physical actions (Hong et al., 2014; Woo et al., 2010). When the definition of interactivity is expanded to include any components of two-way communication, then complex cognitive interactions or online social exchanges can be considered e-learning interactivity (Hong et al., 2014). In fact, intangible interactions such as games, stories, and other strategies were found by Chatterjee (2010) to increase engagement at a greater rate than physical components.

Interactivity in Gamified Courses

Gamification represents an interactive approach that organizations use to engage employees and reduce knowledge gaps (Calvo & Reio, 2018). Underlying game components activate learner engagement through cognitive interactivity, but gamification is distinct in also requiring physical action. For example, physical interactions in an e-learning game include clicks, drag-and-drop, text entry, and other one-way approaches to influencing an e-learning module. Cognitive interactions are reciprocal between the e-learning solution and the learner; examples include problem-solving scenarios and increasingly difficult challenges. By incorporating both mental and physical components of interactivity, gamified e-learning achieves advanced levels of interactivity (Chang et al., 2018).

Course Design to Support Interactivity Through Gamification

In the Member Journey course, storytelling techniques were used to support the gamification. Storytelling has been shown to increase interactivity and engagement in a course (Baldwin & Ching, 2016) and to promote information storage and comprehension (Novak, 2015). These techniques included a plot and support for the overarching plot through characters, a journey, conflict, and resolution (Baldwin & Ching, 2016). Characters in particular assisted in the development of the story and guided learners through plot points, a technique that has been shown to increase engagement (Smeda et al., 2010). The authors examined the impact of using storytelling to enable the interactivity and engagement of gamification in the Member Journey course.

Engagement and Learning Achievement with Gamified E-Learning

Game elements such as challenges and awards are powerful tools for promoting motivation and engagement through interactivity (Alsawaier, 2017; Calvo & Reio, 2018; Jabbar & Felicia, 2015), and interactive storytelling can increase engagement and comprehension (Baldwin & Ching, 2016; Novak, 2015). Gamification also directly contributes to knowledge gains by increasing engagement (Calvo & Reio, 2018). Calvo and Reio (2018) found higher levels of engagement linked to higher knowledge gains in a gamified course administered to professionals in the tourism industry: the more engaged learners were in a gamified course, the greater their knowledge gains.

The Journal of Applied Instructional Design, 10(2) 25

Jabbar and Felicia (2015) found similar results in a systematic review of engagement and knowledge gains in gamification: gamified learning increased knowledge attainment. Gamified learning is most effective when components contributing to emotional and cognitive engagement are present in the course design, and combining multiple types of interactivity with an effective design has the greatest impact on engagement and knowledge gains (Jabbar & Felicia, 2015). The present article also examines the effects of an interactive gamified course design on understanding of content.

Methods

Purpose of the Study

Limited research is available on design approaches to facilitate engagement in asynchronous gamified courses. This study adds to the body of knowledge for scholars and practitioners by exploring the impact of a gamified course design on engagement and understanding of course content in an asynchronous gamified new-hire course titled Member Journey. The course was delivered on the first day of functional job-based training at a health insurance organization.

Through quantitative approaches, this study answers the following research questions:

1. Does the gamified course enable learners to reach training goals?
2. What are the significant factors (health care experience, narration, course activities, and engagement) that predict employee post-test scores?
3. What are the employees' perceptions of the course, gamification, interactive elements, and interactivity enablers?
4. How is the impact of gamification on employee knowledge gains different between lower performers and higher performers?
5. How is the time spent on the pre- and post-test different between lower performers and higher performers?
6. How are the employees' perceptions of the course, gamification, interactive elements, and interactivity enablers different between lower performers and higher performers?

Research Design

The study primarily employed a quantitatively driven approach. The researchers administered a pre- and post-test to measure participants' understanding of course content following the intervention of the gamified course. A survey was used to gain insight into the participants' views of and attitudes toward the instructional approaches. After collecting and analyzing the quantitative data, the authors conducted semi-structured interviews to delve deeper into the topics and provide rich descriptions of participants' attitudes (Creswell, 2014). Participants were assured of anonymity; any names used in the article to enhance the narrative are pseudonyms.

Setting

The United States–based health insurance company employs over 23,000 people; more narrowly, the specific division this project focused on includes 9,000 concierge customer service employees. In 2017-2018, leadership of the concierge customer service division determined that many new hires did not have sufficient knowledge of health insurance basics, such as terminology and insurance claim and inquiry processes. To address these knowledge gaps, instructional designers employed the Understanding by Design (UbD) framework to inform the course design (Wiggins & McTighe, 2011). UbD is an instructional design framework composed of three stages: Stage 1 involves identifying desired results; Stage 2 involves determining assessment evidence; and Stage 3 encompasses planning the learning experience and instruction (McTighe & Wiggins, 2012).

Identify Desired Results

The desired result of the foundational onboarding course was for employees to understand foundational elements of the business and the customer experience. Several instructor-led and e-learning modalities were considered to address this knowledge gap. To determine the most appropriate learning solution, leadership and instructional designers considered several factors. In 2017 and 2018, the health insurance company's instructional designers deployed physically interactive techniques, such as clickables, dragging features, text entry, and system simulations, in the organization's e-learning courses. The courses were positively received by learners, and they supported the evaluation of core competencies. Expanding upon this innovation, the company sought more immersive and cognitively interactive approaches. Additionally, the business wanted to implement a course covering foundational content across multiple lines of business. Stakeholders determined that an instructor-led approach would not be an appropriate fit with available resources, so e-learning solutions were explored.

Determine Assessment Evidence

It was determined that assessment evidence would be gathered through mixed-methods strategies, including a knowledge check on core competencies and a survey centered on the learner experience. The knowledge check allowed a timely measurement of knowledge gained in the course, and the survey provided insight into the Member Journey course relative to other courses and modules in the program. Success in the evaluation of learner performance would be achieved when a learner could demonstrate the effective application of the terminology, concepts, and processes learned in a different scenario and context. The scenarios outlined in the pre- and post-assessment focused on unique situations to gauge learner adaptability in new contexts. The UbD framework considers this level of adaptability to be evidence of knowledge transfer (McTighe & Wiggins, 2012). The questions in the pre- and post-assessment included prerequisite knowledge learners worked through to solve the problem. The result was a holistic assessment approach that considered the comprehension and application of concrete and abstract information and concepts.

Plan Learning Experiences and Instruction

Within the organization, Member Journey represents a complex e-learning course. The organizational standard for complex e-learning includes: (a) audio (including scripting); (b) video; (c) interactive simulations; (d) high interactivity; and (e) multiple animations. The learning solution consisted of a 30-minute Storyline course informed by gamification e-learning principles. Articulate Storyline, authoring software for interactive and personalized online and mobile courses, was used to develop the course (Articulate Global, 2019). The gamified e-learning course reviewed foundational insurance topics from the member's perspective across major topics: the purpose of insurance, retail and group plan enrollment, benefits terminology and claim submission, and appeals. The learning solution presented content sequentially in the context of the overall customer journey, enabling employees to interact with and work through a customer's experience with the company.

Learners completed the course in a computer lab setting as part of an instructor-led new-hire training program that incorporated a variety of learning approaches, such as experiential activities, scenario-based exercises, and e-learning courses. The gamified e-learning course on health insurance basics was self-directed and completed on the first day of the program.

At the beginning of the gamified e-learning course, storytelling elements were introduced when the learners were presented with two narrators, Gil and Gidget. The two narrators explained that during the course they would be going on the customer experience journey by visiting four islands. Each island included a game-based challenge focusing on one of the major topics. Once the challenge was completed successfully, the participant received a badge in the form of a stamp on a passport. At the conclusion of the course, the facilitators reviewed and debriefed the content.

In sum, working through the UbD process revealed that a gamified e-learning approach aligned with the audience and the desired results. The approach allowed employees to work through a customer's experience with the company and allowed the business to assess knowledge gains through an interactive process.

Participants

The researchers collected data between October and December 2018, administering a pre- and post-test and a survey to 121 employees in eight new-hire classes across the United States on the first day of onboarding. The participants were primarily under the age of 39: 57% were under 29 and 22% were between the ages of 30 and 39. Eighty-two percent of participants were female, 43% had some college, and 27% had a high school degree or equivalent. Only 11% of the participants had more than five years of experience in healthcare; in fact, 60% of participants had less than one year of previous experience. Follow-up interviews were performed to add depth to the discussion. Ten participants of the 121 were randomly selected to engage in semi-structured, one-on-one, 60-minute interviews to gain a deeper understanding of their perceptions of the gamified course. The interview protocol consisted of six questions (see Appendix B).

Procedure

The tests and survey were administered in an online format. Participants took the pre-test prior to completing the Member Journey course and responded to the post-test and survey after completing it. The pre- and post-test were each composed of four multifaceted multiple-choice questions. The tests aligned to the topics at the "destinations," including insurance foundations, enrollment, benefits, and claims. The questions were scenario-based and challenged the learners to problem-solve as well as recall knowledge. The pre- and post-test were equivalent in number of questions, difficulty, and subject matter; parallel forms were used to avoid issues inherent in test-retesting.

The survey consisted of nine multiple-choice questions and one open-ended question (see Appendix A). The survey questions centered on demographics, course perceptions, and perceptions of course quality and usability. Previous experience in health care was categorized as less than 1 year, 1-3 years, 3-5 years, or more than 5 years. Perceptions of the narration, course activities, and engagement were measured on a five-point Likert scale. Narration questions centered on whether the narrators assisted participants in understanding their role. The researchers examined whether the activities (e.g., a drag-and-drop exercise, a tic-tac-toe game) helped build understanding of concepts.

Participant responses were housed in the organization's learning management system. The day after the pre-test, post-test, and survey were administered, the researchers began the process of accessing, cleaning, and compiling the data.

Data Analysis

Quantitative data were analyzed using SPSS statistical software. For the first research question, employee pre-test scores and post-test scores were compared using a t-test. To answer the second research question, a multiple linear regression analysis was used. Multiple regression analysis is a statistical technique for exploring the relationship between a dependent variable and any one of the independent variables while the other independent variables are held fixed (Hoffmann, 2010). Employee post-test scores were used as the dependent variable, and employee healthcare experience and perceptions of narration, course activities, and engagement were used as independent variables. Residual analysis, multicollinearity (via tolerance and VIF, or variance inflation factor, values), and the error terms were reviewed to check all assumptions required for the regression model.

Results

Research Question 1. Does the Gamified Course Enable Learners to Reach Training Goals?

The average scores on the pre- and post-test for all employees are displayed in Figure 1. In addressing the first research question, participants demonstrated on average a 9.5-point increase in scores on foundational insurance topics from the pre-test to the post-test. The average score on the pre-test was 65.5 (SD = 24.42) and the average score on the post-test was 75 (SD = 21.65) out of 100. This was a statistically significant difference (t(120) = -4.11, p < 0.001) and supported the design team's completion goal.

Figure 1

Knowledge Gains Comparison in Pre-Test and Post-Test

A Bar Graph Showing a Comparison of Knowledge Gains in Pre-Test and Post-TestScores

Research Question 2. What Are the Significant Factors (Healthcare Experience, Narration, Course Activities, and Engagement) That Predict Employee Post-Test Scores?

To answer the second research question, Pearson's correlation analysis was conducted prior to the multiple linear regression analysis (see Table 1). Participants' post-test scores had a positive correlation with health care experience (r = .20, p < .05), narration (r = .25, p < .01), course activities (r = .22, p < .01), and engagement (r = .22, p < .01). Course activities had a strong positive correlation with narration (r = .69, p < .001) and engagement (r = .68, p < .001). Narration had a strong positive correlation with engagement (r = .83, p < .01). Tolerance and VIF values were checked, and there were no issues with multicollinearity in this study. Table 2 shows that health care experience (β = .21, p < .05) was a significant factor impacting participants' post-test scores. The regression model was statistically significant, with the four independent variables explaining 10.7% of the variance in post-test scores (R² = .107).

Table 1

Correlations Among Variables

                        Post-Test   Healthcare
                        Score       Experience   Narration   Course Activities
Post-Test Score         -
Healthcare Experience   .195*       -
Narration               .246**      -.044        -
Course Activities       .217**      -.001        .690***     -
Engagement              .217**      -.058        .830***     .680***


Note. *p < .05, **p < .01, ***p < .001
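The pairwise coefficients and significance flags in Table 1 can be reproduced with Pearson correlations. A minimal sketch in Python with scipy, on synthetic data (column names and values are illustrative, not the study's data):

```python
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(42)
n = 121

# Synthetic scores and Likert-style ratings standing in for the study's variables.
df = pd.DataFrame({
    "post_test": rng.normal(75, 21, n),
    "experience": rng.integers(1, 5, n),
    "narration": rng.integers(1, 6, n),
    "activities": rng.integers(1, 6, n),
    "engagement": rng.integers(1, 6, n),
})

def star(p):
    """Map a p-value to the significance flags used in Table 1's note."""
    return "***" if p < .001 else "**" if p < .01 else "*" if p < .05 else ""

# Lower triangle of the correlation matrix, mirroring Table 1's layout.
cols = list(df.columns)
for i, a in enumerate(cols):
    for b in cols[:i]:
        r, p = stats.pearsonr(df[a], df[b])
        print(f"{a} x {b}: r = {r:.3f}{star(p)}")
```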

Table 2

Predicting Factors for Post-Test Score

                        B       Std. Error   β      t       Sig.
Constant                33.40   14.16               2.36*   .020
Healthcare Experience    4.29    1.84        .21    2.33*   .022
Narration                4.75    4.36        .18    1.09    .278
Course Activities        1.99    3.42        .07     .58    .561
Engagement                .95    5.31        .03     .18    .858
R²                       .107
Adjusted R²              .076
F                       3.463*

Note. *p < .05

Research Question 3. What Are the Employees' Perceptions of the Course, Gamification, Interactive Elements, and Interactivity Enablers?

Regarding the third research question, participants expressed positive perceptions of the interactivity and gamification strategies. The employees' ratings of the course are presented in Figure 2. Ninety-two percent of participants rated the course as excellent or good, and 93% of learners expressed that engagement through the course promoted their understanding of course content.

Participants also expressed positive perceptions of the activities and narrators. Ninety-one percent reported they strongly agreed or agreed that the activities helped their understanding of concepts, and 89% shared they strongly agreed or agreed that the narrators enhanced their understanding of their role. A majority (87%) of participants strongly agreed or agreed that they preferred to learn content through the learning game approach rather than other approaches (e.g., lecture, instructional videos, assigned reading).

Figure 2

Employees' Perceptions of the Course and Gamified Elements

A Pie Chart Showing Employees' Perceptions of the Gamified Elements in theCourse

Research Question 4. How Is the Impact of Gamification on Employee Knowledge Gains Different Between Lower Performers and Higher Performers?

To answer this research question, the differences in knowledge gains from the pre-test to the post-test were compared between the lower performers (post-test score mean ≤ 50 out of 100) and the higher performers (post-test score mean > 75 out of 100).

The scores on the pre- and post-test for the two groups are presented in Figure 3. Higher performers' average score on the pre-test was 80.26 (SD = 20.27), while lower performers' average score was 44.35 (SD = 10.63), a 35.91-point gap and a statistically significant difference (t(67) = -4.81, p < .001). Higher performers' average score on the post-test was 100.00 (SD = 0.00), while lower performers' average score was 54.84 (SD = 23.65), a 45.16-point gap, again a statistically significant difference (t(67) = -32.34, p < .001). Higher performers came closer to the training goals than lower performers.
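The group split and between-group comparison described above can be sketched as follows. This is a hedged illustration on synthetic data (the thresholds come from the paper; the data, column names, and an independent-samples t-test as the comparison method are assumptions for the sketch):

```python
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(7)
n = 121

# Synthetic pre/post scores standing in for the study's data.
scores = pd.DataFrame({"pre": np.clip(rng.normal(65, 24, n), 0, 100)})
scores["post"] = np.clip(scores["pre"] + rng.normal(10, 15, n), 0, 100)

# Performance bands from the paper: lower = post-test <= 50, higher = post-test > 75
# (participants scoring 51-75 fall in neither band).
lower = scores[scores["post"] <= 50]
higher = scores[scores["post"] > 75]

# Independent-samples t-test on pre-test scores between the two groups.
t, p = stats.ttest_ind(lower["pre"], higher["pre"])
gap = higher["pre"].mean() - lower["pre"].mean()
print(f"pre-test gap = {gap:.2f}, t = {t:.2f}, p = {p:.4f}")
```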

Figure 3

Knowledge Gains Comparison on the Pre-Test and Post-Test Between Lower Performers and Higher Performers


A Bar Graph Showing Differences in Lower and Higher Performers Pre- and Post-Test Performance

Research Question 5. How Is the Time Spent on the Pre- and Post-Test Different Between Lower Performers and Higher Performers?

Time spent on the pre- and post-test for the two groups is displayed in Figure 4. Higher performers' average time spent on the pre-test was 135.03 seconds (SD = 112.94), while lower performers' average was 93.53 seconds (SD = 32.38), a 41.50-second gap and a statistically significant difference (t(67) = 2.16, p < .05). Higher performers' average time spent on the post-test was 74.81 seconds (SD = 34.20), while lower performers' average was 71.45 seconds (SD = 43.67), a 3.36-second gap that was not statistically significant.

Figure 4

Time Spent Comparison Between Lower Performers and Higher Performers

A Bar Graph Showing Time Spent by Lower and Higher Performers on TestingActivities

Research Question 6. How Are the Employees' Perceptions of the Course, Gamification, Interactivity, and Interaction Enablers Different Between Lower Performers and Higher Performers?

The employees' perceptions of the course rating and its components are compared in Figure 5. Overall, the employees' perceptions of the course and the interactive components in the gamified course were statistically significantly different between the lower performers and the higher performers. Higher performers rated the following higher than lower performers: the course overall (higher performers = 4.68, lower performers = 4.06), how much the activities helped their understanding (4.68 vs. 4.19), how much the narrator helped their understanding (4.50 vs. 4.03), how much engagement helped their understanding (4.71 vs. 4.32), and how much the interactive gamified approach helped their understanding (4.57 vs. 4.16).

The gaps between the two groups were statistically significant for course rating (t(67) = -3.49, p < .01), activities (t(67) = -2.59, p < .05), narrator (t(67) = -2.11, p < .05), and engagement (t(67) = -2.23, p < .05). The gap in ratings of the interactive gamified approach was not statistically significant. In sum, for the high performers, the interactive gamified course design facilitated engagement and promoted knowledge gains.

Figure 5

Employees' Perceptions of the Course and Gamification Between Lower Performers and Higher Performers

A Bar Graph Comparing Lower and Higher Performers' Perceptions of GamifiedActivities

Discussion

The researchers' goal in conducting this study was to examine new hires' learning outcomes and perceptions of an interactive gamified e-learning course in a health insurance organization. The researchers found that the gamified e-learning course was well received by participants and assisted them in understanding foundational information on insurance terminology, processes, enrollment, benefits, and claims.


The learners' positive perceptions of the course and understanding of the course content resulted from several factors. The instructional designers employed the UbD process to determine whether gamification was a good fit for the audience and appropriate for meeting the organization's goals. Additionally, the gamified elements and engagement enablers, such as characters, increasingly complex challenges, and an overarching narrative, supported high levels of interactivity, which in turn enhanced learner engagement and understanding of content.

McKimm, Jollie, and Cantillon (2003) explained that e-learning courses are often incorporated into blended learning programs. Varying learning approaches can enhance learner engagement by challenging learners in different ways, such as through visual demonstrations, problem-solving scenarios, or interactive exercises. The Member Journey course mirrors this trend. In the new-hire program, the gamified Member Journey course was a component of a larger program including instructor-led lessons, videos, and e-learning courses with simple to average amounts of interactivity and animation. It is critical to align the best learning solution to the content and larger experience; interestingly, several of the participants were also attuned to this. For example, Emily echoed this sentiment:

I don't think it'd be beneficial to do all games during the learning course. So, to change it up. So, some of them are reading. Some of them were typing. Some of them were playing a game keeps… you on your toes and keeps it different.

The results supported the rigor involved in working through an instructional design framework and revealed that the interactive courses resonated with participants. Beyond the Member Journey course alone, nine out of ten learners interviewed found the most interactive e-learning courses the most effective. Lee highlighted that "the ones that are interactive, which you have to click on to advance or to try to figure out the scenarios… helps me the most." Ninety-two percent of participants rated the Member Journey course as excellent or good, and participants demonstrated on average a 9.5-point increase in scores from the pre-test to the post-test. In the follow-up interviews, nine out of ten learners expressed positive impressions of the entertaining course structure compared to the informational and knowledge-test-based courses. Aiden shared, "I had a little more fun with this, it wasn't as stressful as if I go … cramming for an exam." In this context, the researchers' findings confirmed that the solution was a good fit based on the thorough consideration of the learning context, audience, and business need.

The learners were perceptive about the importance of considering the learning context. The results revealed that the game components in the course were beneficial. In the follow-up interviews, participants discussed finding the course layout, which involved navigating to new destinations after completing challenges on each island, engaging to interact with. The challenges involved game-based interactions such as spinning a wheel, flipping over cards, and matching content. Emily expressed her reaction to completing the game-wheel challenge, noting, "…it keeps the fun aspect of it like I'm still playing a game. But I mean you're playing the game and you're learning." This result aligns with the literature demonstrating the relationship between gamification and engagement (Alsawaier, 2017; Calvo & Reio, 2018; Jabbar & Felicia, 2015).

This study echoes the position that rigorously employing an instructional design model (e.g., ADDIE, SAM, competency-based models, UbD, the Critical Events Model, or others) will align a learning modality to desired learner perceptions and outcomes (Bell & Federman, 2013). Once instructional designers determine the most effective approach for the learning context, content, and audience, they can develop strategies for enhancing engagement (Young, 2010). Similar to Chang et al.'s (2018) findings, the instructional designers of the Member Journey course implemented interactivity by employing gamification and course design features that enable interactivity, which, in turn, increased engagement. The interplay between the gamification and the enabling features is represented by Gil and Gidget navigating participants through the plot and providing participants with the rules of each island challenge.

The underlying stories in the course mirrored recommendations discussed in the literature, which provided learners the opportunity to connect to past experiences and apply them to the content (Smeda et al., 2010). During the follow-up interviews, participants recalled specific examples of Gil and Gidget presenting challenging scenarios. Bill excitedly recounted one challenge in the Member Journey course that resonated due to its unique storytelling approach. He compared it to a "Saturday morning cartoon," recalling that "the circle fellow there... he climbed a tree and got hit by a coconut." He further explained, "… if someone would have told me that was going to be one of the trainings, I would have given them one of the most confused whatever." He added that this plot device "…prompted the next part of the journey, which was he had to go to the doctor. Because he fell off the coconut tree … it was entertaining, and it still made sense in the grand scheme which is impressive." In the Member Journey course, the pairing of digital storytelling with complex interactions, including audio, animations, and interactive simulations, resonated with the learners and enhanced their positive perception of the course in comparison to other approaches.

As a whole, 89% of participants shared they strongly agreed or agreed that the narrators enhanced their understanding of their role. The challenges required participants to click, drag, or enter content while they engaged in problem solving through the scenarios Gil and Gidget presented. The researchers found that 91% of participants reported they strongly agreed or agreed that the activities helped their understanding of concepts. This evidence demonstrates that learner engagement can lead to increased understanding of course content (Bloom, 1956; Nkhoma et al., 2014).

According to the results, most participants reported that the course was engaging. Although there was a statistically significant group difference, the gamified course was engaging for both lower performers and higher performers. This suggests a positive correlation between engagement and understanding of course content, echoing literature in which higher learning outcomes were the result of learners experiencing greater engagement with the games and overall enjoyment of the course (Nkhoma et al., 2014).

After completing the Member Journey course, 93% of the learners expressed that engagement through the course promoted their understanding of course content. The participants found that the interactive gamified components were fun and motivated their learning. Emily explained, "Because I'm a hands-on learner, I like to be able to click on things and, you know, type things in, as opposed to just reading and answering questions. It seems to stick better with me that way." These findings add to the understanding of how interactivity promotes engagement in e-learning (Zhang & Zhou, 2003). They also add to the dialogue on strategies that increase learner engagement, which will, in turn, increase understanding of the content (Bloom, 1956; Nkhoma et al., 2014).

Time spent on the pre- and post-test revealed a statistically significant gap on the pre-test between higher performers and lower performers, but no significant gap on the post-test. The researchers noted that higher performers had prior knowledge from their comparatively greater field experience and needed less time to answer the pre-test questions. After completing the course, both groups showed reduced time spent on the post-test. These results parallel the pattern in which learners with low prior knowledge allocate more time to understanding learning materials and processing information (Amadieu et al., 2009).

Suggestions for Future Research

The results informed learning solutions at the organization where the study took place, and the findings drove the adoption of highly interactive gamification approaches as an initiative for 2019. The findings supported the ideas that interactivity supports higher levels of engagement (Zhang & Zhou, 2003) and that higher levels of engagement can lead to better learning outcomes (Bloom, 1956; Nkhoma et al., 2014). Regarding enabling features, participants rated the storytelling elements highly and said they increased their interest in the content, their motivation, and their connection making (Baldwin & Ching, 2016; Smeda et al., 2010). Overall, the study supported the hypothesis that games increase engagement (Alsawaier, 2017; Calvo & Reio, 2018; Jabbar & Felicia, 2015). A limitation of the study is that a single course was administered instead of multiple gamified courses. Further studies examining the impact of highly interactive and engaging gamified e-learning could be beneficial in understanding this type of course design in the workplace. Furthermore, additional exploration of varied gamification approaches is needed to identify consistent results regarding the positive effects of gamified courses in corporate settings.

This study examined 121 employees in eight classes, a small portion of the approximately 1,148 concierge customer service new hires brought into the organization each year. This larger population provides the opportunity for a longitudinal study to holistically examine new hires' learning outcomes and perceptions of the game-based e-learning course and its design. The participants in the course have the opportunity to reinforce their knowledge during their daily work. The researchers recommend surveying the participants after six months to assess their perceptions of how the course and its gamification strategies prepared them for work in comparison to other approaches experienced in their first six months of onboarding.

This study took place at a health insurance organization. Even though there is limited research on the impact of gamified learning in this setting, enabling interactivity through course design is not a new phenomenon in workplace e-learning. There is opportunity for further examination of gamified e-learning across the healthcare industry.

References

Akdere, M., & Conceicao, S. (2006). Integration of human resource development and adult education theories and practices: Implications for organizational learning. Online Submission. https://edtechbooks.org/-eMp

Alsawaier, R. S. (2018). The effect of gamification on motivation and engagement. The International Journal of Information and Learning Technology, 35(1), 56-79. https://edtechbooks.org/-pmR

Amadieu, F., van Gog, T., Paas, F., Tricot, A., & Mariné, C. (2009). Effects of prior knowledge and concept-map structure on disorientation, cognitive load, and learning. Learning and Instruction, 19, 376–386. https://edtechbooks.org/-HIj

Articulate Global. (2019). Storyline 3. https://articulate.com/

Baldwin, S., & Ching, Y. H. (2017). Interactive storytelling: Opportunities for online course design. TechTrends, 61(2), 179-186. https://edtechbooks.org/-vMin

Bell, B., & Federman, J. (2013). E-learning in postsecondary education. The Future of Children, 23(1), 165-185. https://edtechbooks.org/-YUM

Bloom, B. S. (1956). Taxonomy of educational objectives, Handbook I: The cognitive domain. David McKay Co Inc.

Calvo, L. C., & Reio, T. (2018). The relationship between engagement and knowledge attainment in a computer-based training game and job performance of travel agents. The Journal of Management Development, 37(5), 374-384. https://edtechbooks.org/-xmAf

Chang, C., Warden, C., Liang, C., & Lin, G. (2018). Effects of digital game-based learning on achievement, flow, and overall cognitive load. Australasian Journal of Educational Technology, 34(4), 155-167. https://edtechbooks.org/-fFy

Chatterjee, P. (2010). Entertainment, engagement and education in e-learning. Training & Management Development Methods, 24(2), 601-621. https://edtechbooks.org/-rfK

Creswell, J. W. (2014). Research design: Qualitative, quantitative, and mixed methods approaches (4th ed.). Sage.

Dichev, C., & Dicheva, D. (2017). Gamifying education: What is known, what is believed and what remains uncertain: A critical review. International Journal of Educational Technology in Higher Education, 14(9). https://edtechbooks.org/-pjKBK

Ho, M. (2017). Learning investment and hours are on the rise. TD Magazine. https://www.td.org/magazines/td-magazine/learning-investment-and-hours-are-on-the-rise

Hoffmann, J. P., & Shafer, K. (2015). Linear regression analysis: Assumptions and applications. NASW Press.

Hong, Y., Clinton, G., & Rieber, L. (2014). Designing creative user interactions for learning. Educational Technology, 54(2), 20-25. https://edtechbooks.org/-fvvp

Jabbar, A., & Felicia, P. (2015). Gameplay engagement and learning in game-based learning: A systematic review. Review of Educational Research, 85(4), 740-779. https://doi.org/10.3102/0034654315577210

Lester, J. C., Ha, E. Y., Lee, S. Y., Mott, B. W., Rowe, J. P., & Sabourin, J. L. (2013). Serious games get smart: Intelligent game-based learning environments. AI Magazine, 34(4), 31-45. https://doi.org/10.1609/aimag.v34i4.2488

Long, L. K., & Smith, R. D. (2003). The role of web-based distance learning in HR development. Journal of Management Development, 23(3), 270-284. https://doi.org/10.1108/02621710410524122

McTighe, J., & Wiggins, G. (2012). Understanding by design framework. https://edtechbooks.org/-fVI

Nkhoma, M., Sriratanaviriyakul, N., Cong, H. P., & Lam, T. K. (2014). Examining the mediating role of learning engagement, learning process and learning experience on the learning outcomes through localized real case studies. Education and Training, 56(4), 287-302. https://edtechbooks.org/-Erie

Novak, E. (2015). A critical review of digital storyline-enhanced learning. Educational Technology Research and Development, 63(3), 431-453. https://edtechbooks.org/-qKfH

Smeda, N., Dakich, E., & Sharda, N. (2010). Developing a framework for advancing e-learning through digital storytelling. Proceedings from the IADIS International Conference e-Learning. Freiburg, Germany.

Woo, T., Wohn, K., & Johnson, N. (2011). Categorisation of new classes of digital interaction. Leonardo, 44(1), 90-91. https://edtechbooks.org/-mjr

Young, M. R. (2010). The art and science of fostering engaged learning. Academy of Educational Leadership Journal, 14(1), 1–18. https://edtechbooks.org/-qFrH

Zhang, D., & Zhou, L. (2003). Enhancing e-learning with interactive multimedia. Information Resources Management Journal, 16(4), 1-14. https://doi.org/10.4018/irmj.2003100101


Appendix A: Survey

1. What is your age?

Less than 20
20-29
30-39
40-49
50-59
Greater than 60
I would prefer not to answer

2. What is your gender?

Male
Female
I would prefer not to answer

3. What is the level of school you have completed? If currently enrolled, please select your highest degree received.

Less than high school degree
High school degree or equivalent (e.g., GED)
Some college but no degree
Associate degree
Bachelor degree
Graduate degree

4. How long have you worked in health care?

Less than 1 year
1 year to less than 3 years
3 years to less than 5 years
5 years to less than 10 years
10 years or more

5. Overall, I would rate this Member Journey course as…

Excellent
Good
Fair
Poor
Very poor

6. Please rate how much you agree or disagree with the following statement: The activities (e.g., drag and drop, tic-tac-toe, etc.) in the Member Journey course helped build my understanding of foundational concepts.

Strongly agree
Agree
Undecided
Disagree
Strongly disagree

7. Please rate how much you agree or disagree with the following statement: Gil and Gidget enhanced my understanding of my role in the customer experience.

Strongly agree
Agree
Undecided
Disagree
Strongly disagree

8. Please rate how much you agree or disagree with the following statement: This Member Journey course made me engaged in learning.

Strongly agree
Agree
Undecided
Disagree
Strongly disagree

9. Please rate how much you agree or disagree with the following statement: I’d prefer to learn this content through the learning game approach than other approaches (e.g., lecture, instructional videos, assigned reading, etc.).

Strongly agree
Agree
Undecided
Disagree
Strongly disagree

10. Do you have any suggestion(s) for improving the Member Journey course?

Appendix B: Interview Protocol

1. Tell me about your understanding of health insurance prior to attending the first day of training.

2. Describe the learning activities in the training program that stood out to you.

3. Discuss your perceptions of the Member Journey course in terms of understanding how your role facilitates the member experience. (Show image of course.)

4. Talk to me about your impressions of the Member Journey course in comparison to other e-learning courses, such as the scenario-based courses. (Show images of different courses, such as a piece of the introduction section and/or a key topic section.)

5. In the Member Journey course, you completed a pre-test, games/exercises, and a bonus round; how did you feel when you reached the bonus round?

6. Do you have any suggestions to improve the Member Journey course?


Chefs in Training! Engaging Pharmacy Students through Course Gamification

Melissa J. Ruble, Jaclyn D. Cole, & Beth E. Jordan

Gamification is defined as the “use of game design elements in non-game contexts” (Deterding et al., 2011, p. 10) with the goal of promoting user engagement. Didactic courses that incorporate game elements such as rulebooks, elements of surprise, levels, challenges, and rewards provide intrinsic motivation through immediate feedback, goal setting, opportunity for mastery, and autonomy. In this design case, course coordinators use a modified ADDIE process in collaboration with an instructional designer (ID) to effectively integrate gamification into an elective course to challenge students while providing activities that promote engagement and retention of information.

Introduction

Gamification has been applied to instruction in both didactic and experiential training to enhance learner engagement and motivation (Sera & Wheeler, 2017). As noted by Sardi et al. (2017) in their systematic review, a number of related terms are being used to refer to the application of game design concepts, including game-based learning, gamification, educational gaming, and serious gaming. As education and the integration of technology continue to evolve, so do the definitions of such terms, and more terms will certainly be coined in the coming years. Consensus on the use and definition of terms would help guide both researchers and practitioners as they share their work publicly; however, reaching that consensus is not the intent of this article. Instead, we use a widely accepted, broad definition: gamification is the “use of game design elements in non-game contexts” (Deterding et al., 2011, p. 10).

Gamification in Healthcare Education

Across disciplines within healthcare education, students report positive perceptions of gamification in some form or another. Nursing programs have reported the incorporation of gamification in education through digital badges (White & Shellenbarger, 2018), and nursing orientation gamification has been used to improve both productivity and retention of knowledge (Brull et al., 2017). Medical education has incorporated gamification within residency training to support credentialing, examination scores, surgical technique, and clinical decision-making skills. Gamification of online study tools that supplement traditional classroom education for exam preparation had a positive impact on otolaryngology training examination scores for otolaryngology residents (Alexander et al., 2018). Time spent gaming resulted in significant improvement in performance on a laparoscopic simulator compared to residents who practiced with the simulator alone (Adams et al., 2012). In addition to surgical technique, surgeons must have excellent decision-making skills. Lin et al. (2015) found evidence to support the validity of a web-based gaming platform for training and assessment of surgical decision-making. These studies and many others over the past decade have shown a positive impact of gamification on learning experiences and assessments across many programs in healthcare education.

Gamification in Pharmacy Education

Similarly, pharmacy education has noted positive outcomes from the use of gamification within the pharmacy curriculum. The American Association of Colleges of Pharmacy (AACP) introduced charges to its 2013-2014 Academic Affairs Committee to explore the role of gamification in pharmacy education, explore how colleges can support implementation, and identify areas where the impact would be most promising. Based on their research, the Committee recommended that colleges of pharmacy integrate serious games into their core curriculum for learning and professional development. Given the importance of student learning, the Committee also recommended that AACP, as an organization, develop serious games for institutions to utilize despite the cost and complexity. The Committee recommended that faculty collaborate with experts in the field of instructional design to facilitate quality game development (Cain et al., 2014). Since AACP's Academic Affairs Committee's report in 2014, various colleges of pharmacy have worked to leverage the benefits of gamification within both the didactic and experiential curriculum. Some benefits include increased student knowledge in topics covered, enhanced empathy towards patients, and increased student engagement (Aburahma & Mohamed, 2015). For example, the University of Pittsburgh School of Pharmacy developed a mock investment game to educate students on important aspects of the pharmacy industry. Students showed significant improvement in 19 domains assessed in the study (Wolf et al., 2018). After a faculty development workshop at one university, pharmacy faculty reported eagerness to implement a variety of active learning strategies featuring gamification (Barone et al., 2018). In response to dwindling applicant pools, several colleges collaborated to design and implement new components (e.g., educational games, guest speakers, team-based learning) into the admissions process. This resulted in an increase in applications (Salazar et al., 2018). In this article we explore a slightly different scenario: gamification of a graduate-level pharmacy course using a gameshow format with guidance from the college's instructional designer (ID).

Gamification Integration into College of Pharmacy

Faculty profiles collected by the Office of Faculty Affairs have identified that faculty in the Taneja College of Pharmacy (TCOP) have doctoral degrees in pharmacy and related fields and not in education; thus, the college chose to incorporate a Technology, Instruction, Evaluation, Design (TIED) team within the Office of Academic Affairs (OAA). The TIED team's role is to provide faculty with guidance and support in developing pedagogical skills, implementing educational innovations, and evaluating implementation of innovations. Additionally, the TIED team partners with faculty in scholarly endeavors associated with these collaborations. The TIED team is made up of one ID and one Learning and Development Manager. The ID on the team has experience and training in the design and development of online educational games. The positive results the ID has seen from effective game design within reading software motivated the ID to incorporate such game elements into other instructional areas. Faculty development and educational sessions by the ID have allowed faculty to successfully integrate gaming elements including escape rooms, Kahoot, rap battles, competitions, and more into their courses. When preparing to attend the 41st annual meeting of the Association for Educational Communications & Technology (AECT) in 2018, the ID enrolled in a workshop on gamifying courses. As part of the pre-workshop, the ID presented an innovative idea to the pharmacy faculty to solicit volunteers to embark on the endeavor together. Two eager faculty members agreed to pilot the gamification of a course, a strategy which is not commonly used in pharmacy education. The faculty members shared their syllabi and current course outlines with the ID. During the workshop, “Redesign Courses into a Competition-Based Game-Show Format” by Kiran Budhrani (2018), the ID gathered resources and planning tools in preparation for gamifying the PHA6603C Internal Medicine Elective to be offered for the first time in the following academic year. The workshop facilitator used Deterding's (2011) definition of gamification in describing the ID's efforts at gamifying a course in multimedia instructional design. In this article, we describe the systematic process used to analyze the course, and to design, develop, implement, and evaluate gamification using a gameshow format. We include the selection of game elements and how we applied them to the non-game context of a graduate-level pharmacy course.

Methods

Although designers in various fields (e.g., engineering, software development, education) set about their assigned tasks along different pathways, these pathways share several common elements. Some kind of analysis of the problem, need, or situation takes place. The analysis reveals what needs should be addressed. The needs drive the design plan, which is developed and implemented. Evaluating the product, of course, must also occur to ensure the outcome meets the need. When appropriate, the analysis of the evaluation results may drive another iteration of the design process to improve the product. These design components make up the ADDIE model: Analysis, Design, Development, Implementation, and Evaluation (Branch & Kopcha, 2014; Clark, 2015; Kurt, 2017).

The ADDIE instructional design model was developed in the 1970s for designing military training. In its purest sense, the ID moves through the phases linearly with no modifications to the product until after the evaluation phase is completed. While the seemingly strict parameters served their purpose for the military's needs, the ADDIE model without modification is not a perfect fit for all training development (Clark, 2015; Kurt, 2017). An alternative Instructional Systems Design (ISD) approach is the agile model. Agile models focus on speed, flexibility, collaboration, and efficiency. They typically implement minimally viable products that are developed in a perpetual beta, or a constant state of design and redesign. Both ISD models have a range of modified approaches. Variations on the agile model include the Successive Approximation Model (SAM) and more (Instructional Design Central, n.d.). Instructional designers typically modify ADDIE by incorporating mini-iterations in various stages of the process (Clark, 2015; Kurt, 2017).

Modified ADDIE: ADDIE in the Real World

Effective educators in a classroom scan the students' body language, ask questions to informally assess understanding, and engage students in learning experiences. Then they adjust their teaching to meet the students' needs. After a final evaluation, they may reflect on their results and make changes for the next semester. They have executed all phases of ADDIE even without awareness of any process model. They repeat these phases in each class, at the module level, and again at the end of the course for an overall analysis driving modifications for the next term. Action research is a more formal method than simply following one's intuition as an educator. It involves the same tasks just described with deliberate data collection and analyses. Action researchers typically conduct their study within their own classrooms and may not share results beyond the colleagues in their own institutions. Design-based research, on the other hand, elicits additional stakeholders in the instructional design process. A noted benefit of design-based research, or formative experiments, is the collaboration in authentic settings (Bradley et al., 2012; Reinking & Bradley, 2008). Of course, design cases can be conducted using an array of ISD models and their modifications. For this design case, the team chose a modified ADDIE model.

Rationale for ADDIE

In reviewing the history of instructional design, Reiser (2001) points out that “most of the models include design, development, implementation and evaluation of instructional procedures and materials intended to solve those problems” (p. 58). Branch and Kopcha (2014) also state that “all instructional design processes consist of at least five major activities” (p. 80), which they list as analysis, design, development, implementation, and evaluation. As is evident in most, if not all, instructional design models, the TIED team at TCOP uses the ADDIE model as a basic framework for the instructional design process. In fact, the steps of ADDIE are part of the ID's job description. The ADDIE model is often criticized as inflexible, linear, complex, and inefficient; these valid critiques revolve around the purist, original implementations of the process model (Clark, 2015; Kurt, 2017). Within our college, the model is used more as guidance to ensure all appropriate input is considered in designing an array of outcomes, including program curriculum, course design, assessments, and learning experiences within a course. In the case of designing learning experiences within an Internal Medicine Elective, there was no need for speed or rapid prototyping. The instructors were effective educators who intuitively reflect and seek to improve their course regularly. Adding the college's ID and following a modified ADDIE model afforded the instructors the opportunity for more effective course changes. Collaboration in authentic settings was both a benefit and a challenge. Given that the ADDIE model was commonly used in the college already, using it as a guide in this design case provided both the instructors and ID with a shared vocabulary to increase effective communication throughout the process. Below we describe the actions taken in each phase of ADDIE, recognizing that some portions of each phase may overlap other phases.

Analysis

In the analysis phase of ADDIE, the ID analyzes data to determine the problem to solve, gap to fill, or need to be met. For instructional design, this stage often involves analyzing available resources, the particular classroom setting, demographics of the students, learning objectives, and pedagogical goals (Instructional Design Central, n.d.; Kurt, 2017). In this design case, the team began with the pedagogical goal of increasing motivation within an elective course. Then they conducted an analysis of the context and needs associated with this case.

Context Analysis

Colleges of Pharmacy are accredited by the Accreditation Council for Pharmacy Education (ACPE), which provides standards and guidance documents (ACPE, 2015) to ensure pharmacy students are “practice ready” and can contribute as valuable members of the healthcare team (Beatty et al., 2014). Successful matriculation throughout the didactic curriculum sets the stage for the foundational knowledge and skills necessary for clinical application. Courses are developed to support these standards and are mapped to competencies with topics that follow both horizontal and vertical alignment. Gaps in competencies and subject matter are assessed, then topics are placed accordingly into the didactic and experiential curriculum.

The PHA6603C Internal Medicine Elective course was created after the curriculum was built, with the goal of further preparing third-year pharmacy students (PY3) for the rigors and expectations of their Advanced Pharmacy Practice Experience (APPE) clinical rotations. This face-to-face, three-hour lecture and lab course would aim to support enhanced student learning and emphasize critical thinking and clinical application. Students taking this elective course would have foundational knowledge of the topics with limited individualized application and assessment in a clinical setting. A maximum of twenty students could enroll in the three-credit elective course, which ran during the PY3 spring semester (the students' last didactic semester) for a total of 16 weeks. The maximum number of students was set to allow for individualized assessment and opportunities for one-on-one clinical simulations. Students were randomized to select their elective courses based upon their last name on a first come, first served basis. Historically, students had selected electives based on the area of practice, degree of difficulty, and faculty participating in the course. Electives are two to three credit hours, and classes take place once per week. Student workload for their PY3 year ranges between 18 and 20 credit hours per semester.

Needs Analysis

Content-Gaps

Decreasing national and college trends in student performance on the North American Pharmacist Licensure Examination (NAPLEX) and the increase in student expectations for practice-readiness upon graduation created the need for an internal assessment to better prepare our students for success in these areas. The TCOP Curriculum Committee, along with the TIED team, reviewed the current curriculum, including the progressive mapping of professional competencies. Discussions on self-assessment of the curriculum through enhanced simulations and course design occurred at the same time as course coordinators were petitioning to start an Internal Medicine Elective as part of the pharmacy curriculum. Creation of the course would need to focus on individualized application of knowledge and assessment of critical thinking and clinical reasoning. Establishing a means for evaluating critical thinking and providing scenarios where students felt comfortable being uncomfortable in both low- and high-stakes environments was essential to the success of the course.

Learning-Gaps

Core courses (not electives) occupy the majority of the students' time, leaving coordinators of electives continually searching for ways to motivate and encourage student participation and attention during and after class. Incorporating gamification principles aimed to help elevate student engagement for efficient learning and build intrinsic motivation.

Faculty Workload

Integrating activities with individualized assessment and detailed feedback can be time intensive and overwhelming for faculty to incorporate. The necessity for immediate feedback adds to the stress and time-intensive nature of such activities. Creating assessments that limit faculty workload while aligning with course and activity objectives and providing timely feedback is essential.

Design

The design phase of ADDIE involves brainstorming and making decisions regarding the delivery method, learning objectives, lesson planning, and determining the resources (Instructional Design Central LLC, n.d.; Kurt, 2017). The collaborators in this case determined the goal of gamifying the elective course, selected appropriate game elements, created the rulebook, selected the assignments to be gamified, and created the rubric to assess the assignments.

Goal Setting

Course coordinators enlisted the help of the TIED team once the course was pre-approved by the College's Curriculum Committee. The coordinators and TIED team met for a total of 10 hours to design and develop the overall course platform. During their first meeting, coordinators described their vision and goals of the course to the ID, with further discussion on course design and implementation. During this design phase, the vision and recommendations of the ID were shared with the pharmacy faculty to align with the gamification workshop the ID was attending. The coordinators quickly decided to take advantage of this opportunity, as this method directly aligned with the needs addressed above (Budhrani, 2018).

Game Elements

After attending Budhrani's workshop at the 41st Annual Meeting of the Association for Educational Communications & Technology (2018), TCOP's ID met with the course coordinators to review the guidance materials provided, including a document on game elements. Course coordinators reviewed the proposed elements and the elements of the television (TV) show and discussed ideas for purposeful integration. Elements were chosen based upon the course needs and applicability in this setting. A consensus was reached to model the course after the competitive cooking reality TV show, Master Chef. Competitors in the show encounter various key elements that encourage intrinsic motivation and challenge the participants to be creative and innovative as they prepare their dish. Surprise ingredients introduced throughout the competition also allow for “on the spot” thinking and an opportunity for enhanced critical thinking.

By applying a gameshow format, the Internal Medicine Elective course aimed to encourage friendly competition with activities related to several key internal medicine topics. When reviewing similarities between the show and the course, the course coordinators were able to link “cooking” and “healthcare” in that chefs (pharmacists) must use the correct ingredients (medications) to create the best meal (patient care) possible. Chef terminology (i.e., layered learning or scaffolded learning) offered a direct correlation with student expectations and accreditation standards. Introducing these concepts to the students provided them with clear expectations and encouraged life-long learning as required by “Standard 9” (ACPE, 2015). Course coordinators created a rulebook as part of their syllabus to set the stage for expectations and to tie in the Master Chef theme.

The Journal of Applied Instructional Design, 10(2) 38

Assignment Selection

There were three main assignments in the course where the coordinators worked with the ID to implement gamification principles and ensure alignment with the Master Chef theme. Assignments were chosen based on complexity and workload, encompassing 80% of the total course grade. Assignments included a topic presentation (individual assessment), journal club debate (student pairs), and case simulation (six total individual assessments). Activities were created to evaluate higher levels of learning that correlated with the expectations of students as they prepare for clinical practice. Assignments were scaffolded and encouraged friendly competition and autonomy of the learner.

Assessment Instrument

Course coordinators also incorporated entrustable professional activities (EPAs) into course assessments to ensure that professional education expectations for transitioning from learner to clinician were not overshadowed by game theme elements (see Table 1). The purposeful integration of both EPAs and various chef roles into course rubrics supported course gamification without minimizing the significance of patient care. An example of an assignment rubric is provided in Appendix A. “Secret ingredients” in the form of new lab values or diagnoses were incorporated into the case simulations to provide a quick assessment and require students to apply clinical reasoning as they adjusted their recommendations. Students were also encouraged to utilize gaming principles as they created their own active learning activities during their topic presentations. This allowed for autonomy and creativity for enhanced motivation and retention of information.

The team reviewed each of the assignments to make sure they aligned with the objectives, that assessment followed the gamification principles, and that they encouraged motivation, participation, and higher levels of thinking. All activities allowed students to assess their progress and enhance reflection and goal setting. Assignments were created to limit faculty workload and provide an efficient way for assessment and feedback.

Table 1

Entrustable Professional Activity (EPA) Milestone Level Descriptions

Level 1
Description: I trust the student, with specific direction and direct supervision, to initiate a preliminary assessment of common conditions seen within the practice setting. The student requires significant correction for performance improvement.
Pharmacy Practice Modified Description: Observe only, even with direct supervision.

Level 2
Description: I trust the student, with direct supervision and frequent correction, to assess common chronic conditions seen within the practice setting. The student accepts feedback for performance improvement.
Pharmacy Practice Modified Description: Perform with direct, proactive supervision.

Level 3
Description: I trust the student, with limited correction, to assess common chronic conditions seen within the practice setting. The student is self-directed and seeks guidance as necessary.
Pharmacy Practice Modified Description: Perform with reactive supervision (i.e., on request and quickly available).

Level 4
Description: I trust the student to completely and accurately assess common chronic conditions seen within the practice setting as an independent practitioner (upon licensure).
Pharmacy Practice Modified Description: Supervise at a distance and/or post hoc.

Level 5
Description: I trust that the student has mastered the ability to completely and accurately assess common conditions seen within the practice setting as an independent practitioner (upon licensure). The student is qualified to give meaningful feedback to other learners.
Pharmacy Practice Modified Description: Supervise more junior colleagues.

Development

The development phase includes production and testing of the content planned in the previous stages. In this phase the ID puts the plan into action through three steps: drafting, production, and evaluation (Instructional Design Central LLC, n.d.; Kurt, 2017).

Coordinators met on a weekly basis to draft and produce course content, including lecture slides and outlines, clinical cases, rubrics, knowledge checks, and active learning scenarios. Since the course was an elective, and most, if not all, material had been previously taught, emphasis was placed on the key topics of patient safety and complexity when creating assignments and cases. Assignments mimicked those expected during student experiences on clinical rotations to scaffold expectations and provide additional opportunities for didactic application before experiential learning. A prototype was created in the learning management system Canvas, and coordinators used this prototype to develop and update their course prior to implementation. Coordinators worked closely with the ID on a biweekly basis to apply best practices and purposeful integration of techniques into the course in preparation for the inaugural offering. Coordinators enlisted the help of five current fourth-year pharmacy students who were already on rotations to ensure learners could identify the purpose and application of gamification. Rotation students also helped to identify areas of confusion and topics of interest based on current experiences. Coordinators used Canvas to post all assignment details, rubrics, and the leaderboard prior to the start of class. Additional gamification was provided in real time through cases, verbal defenses, and an escape room activity.

Implementation

In an iterative design process, the implementation phase is where course materials are shared with students through a learning management system and delivery of instruction takes place. Course design elements are implemented along with assessment of student learning (Instructional Design Central LLC, n.d.; Kurt, 2017). The ID actively works to ensure the course runs efficiently and gathers real-time feedback for immediate redesign when necessary.

The first iteration of this gamified Internal Medicine Elective course received overwhelming interest from the PY3 students. Class enrollment reached its maximum quickly (N = 20), and students continued to ask if the coordinators were able to add students to the course. Students enrolled in the course varied in scholastic ranking, with grade point averages ranging from the lowest 10% up to the top performers. Students were excited about the opportunity to prepare for their clinical rotations and to enhance their clinical skills.

The course itself ran smoothly, and the coordinators provided clear expectations for students regarding deadlines, assignments, and grading rubrics that aligned with gamification principles. On the first day of class, the coordinators explained the different performance levels used for the “Topic Presentation and Patient Case Simulation” activities regarding performance and expectations and how they aligned with the theme of the course:

“Unsatisfactory: Line Cook” (i.e., first-year level);
“Below Expectations: Fish/Vegetable/Meat Cook” (i.e., second-year level);
“Meets Expectations: Station Chef” (i.e., third-year level/current level); and
“Exceeds Expectations: Sous Chef” (i.e., fourth-year level).

Coordinators also emphasized that although these rubrics were built on a scale of 60 total possible points, students would be graded out of 45 or 50 points in order to align with their level of training in the program (third year). The theme of the course was carried forward with food-themed badges (cupcake and corn necklaces), which were awarded to the winners of activities and assignments such as our journal club debate. Students were incredibly enthusiastic about the honor of wearing the necklaces as a point of pride in their work until the next competition took place. Finally, a Master Chef escape room was used to provide insight on how to interpret and apply patient information for appropriate therapy assessment and management.

Evaluation

Evaluation within the ADDIE process includes both formative evaluation (i.e., throughout the whole process) and summative evaluation after the implementation phase (Instructional Design Central LLC, n.d.; Kurt, 2017). For this particular design case, the focus during the evaluation phase was not on formal evaluation of the intervention itself. Instead, the ID worked with the coordinators to determine whether the course and programmatic objectives were met and to further review whether identified gaps were filled. Gaming elements were added to the course in the hope of increasing student motivation, with a main emphasis on retention of information given the complexity and importance of effective application.

In the first iteration of the course, coordinators relied on course evaluations to evaluate the interventions. Student comments in these evaluations provided further detail for an overall critique of the gamification elements of the course. Informal verbal student feedback was overwhelmingly positive regarding the gamification of the course and the contribution of its content to their pharmaceutical education. In the formal course evaluations through the college, all 20 students reported a score of 5 out of 5 on the statements, “I understand how this course will benefit me as a pharmacist” and “The course helped increase my knowledge and competence.” High average scores were also received for clearly stated learning outcomes (4.9); content aligned with stated objectives (4.9); expectations were clear (4.8); and the required assignments and activities enhanced my learning experience (4.8). Free responses indicated that students felt this course would benefit all students and that it promoted their learning in a way that could translate into clinical practice during their fourth-year pharmacy rotations. Some representative comments from course evaluations included, “The structure of this course is great!,” “Every single assignment that is done in this course helps you refresh what you have learned so far in other pharmacotherapeutics courses,” and “All of the activities felt like they enhanced my learning, and I thought the overall course format was very conducive to reinforcing knowledge we have already come across.” Course coordinators and the ID met to review student evaluations of the course and to brainstorm ideas for future offerings. There were some comments about harsh grading, but these related not to the gamification of the course but rather to the expectations for patient care as compared with previous courses. Unfortunately, because this was the inaugural offering of the course, the coordinators did not have prior course grades for comparison.

Performance and Future Plans

While there were no quantifiable comparison data to drive decision making, our team decided to include self-reflections of the faculty, the ID, and the students as performance indicators of the course. Discussion with the ID led to ideas for future evaluation through a pre- and post-perception analysis of the EPAs (see Table 2), analysis of overall graded performance on clinical rotations, and review of specific grading for the Internal Medicine APPE clinical rotation. This would allow both the coordinators and the ID to quantify immediate and longitudinal improvements in student knowledge and retention. Additionally, course coordinators will continue to work with the ID to add opportunities for students to level up by advancing through the food pyramid and gaining access to additional assessments for added points toward their final grade.

Conclusion

The positive feedback from the learners who participated in the gamified Internal Medicine Elective course supports the use of gamification for professional health education. It can be inferred that the use of gamification for educational purposes does not seem to reduce the credibility of the information presented, but instead supports learner engagement, retention, and application of skills. Incorporating gamification into the classroom was time intensive and initially increased faculty workload, which ran counter to the original goal of this structure. Most of the time was spent creating the activities and rubrics to assess student performance while aligning with the overall theme of the game show. Several hours were spent discussing course design with the ID and meeting prior to each activity to ensure alignment with the objectives and appropriate execution. The coordinators are hopeful that this workload will decrease with each offering. In the future, the coordinators plan to enhance clinical case simulations to align with the course theme and encourage individualized assessment. With the feedback provided by students, aspects of the course design have been added to core skills-based courses to continue to build upon intrinsic motivation and provide avenues for individualized clinical reasoning and application for all PY3 students.

Using student feedback, coordinators have incorporated additional concepts, including a written consult with game design elements, into the elective to allow for individualized assessment of both verbal and written communication methods. With the second offering, students were provided examples of ways to incorporate gamification into their topic presentations, and this has been added to the assignment description. As the TCOP develops an updated modified block curriculum, future and continuous involvement of the TIED team will be essential to follow pedagogical advances while at the same time providing efficient and effective assessment methods for faculty with respect to workload.

Implications

Student engagement and self-directed learning are vital for higher education, especially in medical education. Faculty continue to struggle to balance creating options for individualized student assessment and engagement against the workload associated with such endeavors. Gamification has proven to be an effective means to challenge students with autonomy and opportunities for mastery while at the same time limiting faculty workload and time spent on such assessments (Barone et al., 2018). While incorporating such elements into a course can be time consuming and overwhelming, teaming up with an ID can alleviate the stress and allow for well-implemented gamification.

References

Aburahma, M. H., & Mohamed, H. M. (2015). Educational games as a teaching tool in pharmacy curriculum. American Journal of Pharmaceutical Education, 79(4), 59. http://dx.doi.org/10.5688/ajpe79459


Accreditation Council for Pharmacy Education (ACPE). (2015). Accreditation standards and key elements for the professional program in pharmacy leading to the doctor of pharmacy degree. Accreditation Council for Pharmacy Education. https://www.acpe-accredit.org/pdf/Standards2016FINAL.pdf

Adams, B. J., Margaron, F., & Kaplan, B. J. (2012). Comparing video games and laparoscopic simulators in the development of laparoscopic skills in surgical residents. Journal of Surgical Education, 69(6), 714-717. https://doi.org/10.1016/j.jsurg.2012.06.006

Alexander, D., Thrasher, M., Hughley, B., Woodworth, B. A., Carroll, W., Willig, J. H., & Cho, D. Y. (2018). Gamification as a tool for resident education in otolaryngology: A pilot study. Laryngoscope, 129(2), 358-361. https://doi.org/10.1002/lary.27286

Barone, J. A., Aleksunes, L., & Hermes-DeSantis, E. R. (2018, July). An interactive and integrated pharmacotherapy and skills-based curriculum: Bridging basic and clinical science [School poster]. 119th Annual Meeting of the American Association of Colleges of Pharmacy, Boston, Massachusetts, United States.

Beatty, S. J., Kelley, K. A., Ha, J., & Matsunami, M. (2014). Measuring pre-advanced practice experience outcomes as part of a PharmD capstone experience. http://dx.doi.org/10.5688/ajpe788152

Bradley, B. A., Reinking, D., Colwell, J., Hall, L. A., Fisher, D., Frey, N., & Baumann, J. F. (2012). Clarifying formative experiments in literacy research. In P. J. Dunston, S. K. Fullerton, C. C. Bates, K. Headley, & P. M. Stecker (Eds.), Sixty-first yearbook of the Literacy Research Association (pp. 410-420). National Reading Conference.

Branch, R. M., & Kopcha, T. J. (2014). Instructional design models. In J. Spector, M. Merrill, J. Elen, & M. Bishop (Eds.), Handbook of research on educational communications and technology (pp. 77-87). Springer.

Brull, S., Finlayson, S., Kostelec, T., MacDonald, R., & Krenzischeck, D. (2017). Using gamification to improve productivity and increase knowledge retention during orientation. Journal of Nursing Administration, 47(9), 448-453. https://doi.org/10.1097/nna.0000000000000512

Budhrani, K. (2018, October). Redesign courses into a competition-based game-show format [Conference workshop]. 41st Annual Meeting of the Association for Educational Communications & Technology, Kansas City, MO, United States.

Cain, J., Conway, J. M., DiVall, M. V., Erstad, B. L., Lockman, P. R., Ressler, J. C., Schwartz, A. H., Stolte, S., & Nemire, R. E. (2014). Report of the 2013-2014 Academic Affairs Committee. American Journal of Pharmaceutical Education, 78(10), S23. https://doi.org/10.5688/ajpe7810S23

Clark, D. R. (2015). ADDIE model. The Performance Juxtaposition Site. https://edtechbooks.org/-XAT

Deterding, S., Dixon, D., Khaled, R., & Nacke, L. (2011). From game design elements to gamefulness: Defining “gamification.” MindTrek, 9-15. https://edtechbooks.org/-Rmdf

Instructional Design Central, LLC. (n.d.). Instructional design models. Instructional Design Central. https://www.instructionaldesigncentral.com/instructionaldesignmodels

Kurt, S. (2017). ADDIE model: Instructional design. Educational Technology. https://edtechbooks.org/-vDu

Lin, D. T., Park, J., Liebert, C. A., & Lau, J. N. (2015). Validity evidence for surgical improvement of clinical knowledge ops: A novel gaming platform to assess surgical decision making. The American Journal of Surgery, 209(1), 79-85. https://doi.org/10.1016/j.amjsurg.2014.08.033

Reinking, D., & Bradley, B. A. (2008). On formative and design experiments: Approaches to language and literacy research. Teachers College Press.

Reiser, R. A. (2001). A history of instructional design and technology: Part II: A history of instructional design. Educational Technology Research and Development, 49, 57-67. https://edtechbooks.org/-nghy

Salazar, M., Pattipati, S. N., Havard, P., Ofstad, W., Duncan, W., Hussain, D., Fuentes, D. G., & Hughes, J. (2018, July). Collaboration across departments and university stakeholders reduces impact of the decreasing PharmD applicant pool [School poster]. 119th Annual Meeting of the American Association of Colleges of Pharmacy, Boston, Massachusetts, United States.

Sardi, L., Idri, A., & Fernandez-Aleman, J. L. (2017). A systematic review of gamification in e-health. Journal of Biomedical Informatics, 31. https://doi.org/10.1016/j.jbi.2017.05.011

Sera, L., & Wheeler, E. (2017). Game on: The gamification of the pharmacy classroom. Currents in Pharmacy Teaching and Learning, 9(1), 155-159. https://doi.org/10.1016/j.cptl.2016.08.046

White, M., & Shellenbarger, T. (2018). Gamification of nursing education with digital badges. Nurse Educator, 43(2), 78-82. https://doi.org/10.1097/nne.0000000000000434

Wolf, C., Bott, S., Hernandez, I., & Grieve, L. (2018, July). Teaching about the healthcare industry through gamification [School poster]. 119th Annual Meeting of the American Association of Colleges of Pharmacy, Boston, Massachusetts, United States.

Appendix: Internal Medicine Elective Simulation Rubric

Rating levels (each criterion scored out of 20 points):

Line Cook (EPA Level 1): Significant Harm Done (Major), 0 points
Fish/Vegetable/Meat Cook (EPA Level 2): Not Yet Competent (Minor Harm Done), 10 points
Station Chef (EPA Level 3): Competent, 15 points
Sous Chef (EPA Level 4): Best Practice (New Practitioner/Residency Level), 20 points

Patient Presentation ___/20
(Accurately reviews patient’s status [CC, HPI, PMH, ROS, PE, vitals, home medications, allergies, and pertinent labs/tests]. Details chronological course effectively. Discusses relevant signs and symptoms and pertinent sequelae for the disease or clinical issue. Provides data needed for accurate assessment.)

Line Cook: Missing important components of patient review and assessment. Missing information or misinterprets information in a manner that would cause significant patient harm or death.
Fish/Vegetable/Meat Cook: Missing or inappropriate assessment of less than three components of patient review. Missing or inappropriate assessment causes minor harm to the patient and/or does not significantly change recommendations.
Station Chef: Missing less than two components of patient review. Missing assessment does not significantly change recommendations and is not detrimental to the patient’s health.
Sous Chef: Accurately identifies all key elements from a patient’s assessment with supporting data.

Knowledge ___/20
(Discusses appropriate drug therapy management for the disease state based on current practice guidelines or standards of care. Recommendations are patient specific. Effectively summarizes and applies information from primary literature as it relates to the patient case. Discusses the patient’s current drug therapy, including appropriateness, potential ADRs, dosing with pharmacokinetic and pharmacodynamic parameters, and duration of therapy. Uses appropriate parameters to assess endpoints of therapy including drug efficacy and/or toxicity. Provides important counseling points for the patient where appropriate. Provides details on monitoring and follow-up.)

Line Cook: Missing key concepts related to patient care. Inappropriate or missing recommendations that result in significant patient harm or death. Problems not accurately prioritized.
Fish/Vegetable/Meat Cook: Missing key concepts related to patient care. Inappropriate or missing recommendations that would not result in significant patient harm or death but could cause minor harm to the patient. Problems not accurately prioritized.
Station Chef: Missing few concepts related to patient care. Provides relevant and appropriate recommendations based on current practice guidelines for most problems. Accurately prioritizes problems.
Sous Chef: Identifies all concepts related to patient care. Provides relevant and appropriate recommendations based on current practice guidelines for all problems. Accurately prioritizes problems.

Communication Skills ___/20
(Identifies self. Follows required format. Voice is clear and audible with appropriate pace. Provides recommendations in a confident manner. Uses open-ended questions. Answers questions accurately, completely, and confidently. Interacts in a professional manner.)

Line Cook: Unable to communicate recommendations effectively. Miscommunication results in significant patient harm/death. Unable to answer questions, or questions are answered incorrectly, causing significant patient harm or death.
Fish/Vegetable/Meat Cook: Unable to communicate recommendations effectively. Miscommunication may result in minor patient harm but does not result in significant patient harm/death. Unable to answer questions, or questions are answered incorrectly, causing minor harm to the patient. Heavily reliant on notes with limited eye contact.
Station Chef: Communicates effectively and provides recommendations mostly in a confident manner. Answers all questions accurately and mostly completely. References notes but has appropriate eye contact.
Sous Chef: Communicates effectively and provides recommendations in a confident manner. Answers all questions accurately and completely. Limited use of notes with excellent eye contact.

Total: ___/45 (Expectation of student at this time)

Note: The rubric above is adapted from the MWU Chicago College of Pharmacy IPPE Case Presentation Evaluation Form.


A Study of Instructional Design Master's Programs and Their Responsiveness to Evolving Professional Expectations and Employer Demand

Ingrid Guerra-López & Ruta Joshi

As world and job market trends continue to evolve, so does our definition of the instructional design field. Recent definitions, professional standards, and employer demand suggest that beyond designing and developing instructional solutions, other competency domains from needs assessment to evaluation are essential for improving learning and performance in a variety of educational and workplace settings. This study sought to elucidate the competencies that instructional design master’s programs consider essential as illustrated by their core curriculum. Findings suggest that while there is strong alignment in some competency areas, there is insufficient evidence of alignment in others.

Evolving Definition of Instructional Design

As the world and job market continue to evolve, the instructional design field has had its share of transition. One indication of its evolution has been the various adaptations of its definition over the years, which have underscored a shift from instructional media (Ely, 1963) to a focus on solving instructional problems in educational settings (Cassidy, 1982; Silber, 1970; Gentry, 1995). While some of these definitions referred to some variation of systemic and/or systematic, the focus was typically on instructional systems and in some cases (e.g., AECT, 1994) included stages of a systematic process such as design, development, utilization, management, and evaluation of instruction. In his definition, Silber (1970) offered a more concrete system orientation, suggesting elements such as research followed by design, production, evaluation, support-supply, and utilization, and also including the management of system components such as messages, people, materials, devices, techniques, and settings. Over the years, others have written on the essential expansion of our professional lens to the whole performance system and its outcomes (Clark & Estes, 2008; Guerra-López, 2008; Guerra-López & Hicks, 2017; Kaufman, 1996, 2019; Rummler, 2007). More recently, Reiser and Dempsey (2018) have offered the following definition:

The field of instructional design and technology (also known as instructional technology) encompasses the analysis of learning and performance problems, and the design, development, implementation, evaluation and management of instructional and non-instructional processes and resources intended to improve learning and performance in a variety of settings, particularly educational institutions and workplace (p. 4).

This recent definition suggests recognition that the scope of instructional design professionals extends beyond designing and developing instructional solutions, and that there is an expectation that they incorporate a variety of processes, from diagnosis to evaluation, in order to measurably improve learning and performance in a variety of performance settings.

The Evolving Role of the Instructional Designer

The interdisciplinary roots of instructional design have contributed to the diversity of our epistemologies and allowed us to draw from a variety of theories, empirical evidence, and models. They have also contributed to professionals in the field working across a wide range of job titles, roles, and responsibilities. Instructional design professionals work under a variety of job titles, including instructional designer or technologist, curriculum developer, training manager, learning and development specialist, and performance improvement consultant, to name a few. Klein and Kelly (2018) noted the difficulty of identifying professionals in the field due to its changing nature and, in their review of 393 job announcements for instructional design jobs, found 35 different job titles across 28 industries. Evidence of this wide array of job titles can also be found in earlier studies. Furst-Bowe (1996) included 147 instructional design practitioners and found that 40 different job titles were used to describe those professionals. Adding another dimension of variability, Larson and Lockee (2008) observed that instructional design professionals work in a range of work environments such as corporate, higher education, K-12, government, military, non-profit, and healthcare. Work settings, job roles, and varied functions contribute to the wide range of job titles used for instructional design professionals (Liu, Gibby, Quiros, & Demps, 2002).

The diverse and multifaceted nature of work performed by instructional designers across job roles, contexts, areas of specialization, and career trajectories represents a particular challenge for university programs, which are trusted to effectively prepare instructional design graduates to meet evolving professional expectations and employer needs. Thus, this mixed-methods study seeks to elucidate the foundational competencies prioritized by instructional design master’s programs. While instructional design programs vary in the range of core and elective courses they offer their students to prepare for the workforce, for the purposes of this study, core courses are used as an indication of the competencies programs consider essential, and electives are treated as optional competencies. This study therefore focuses on an analysis of core courses.

Professional Expectations and Standards

A professional organization supports a particular profession and works to advance the quality, skill level, and interests of professionals working in that specific field. A variety of professional organizations have used professional standards as a framework for guiding professional practice and judging the competence of professionals across key areas. The International Society for Performance Improvement (ISPI) (2016) has proposed the following ten professional standards: (a) focusing on results or outcomes, (b) taking a systemic view, (c) adding value, (d) working in partnership with clients and stakeholders, (e) determining the need or opportunity, (f) determining the cause, (g) designing the solution (including implementation and evaluation), (h) ensuring solutions’ conformity and feasibility, (i) implementing solutions, and (j) evaluating the results and impact.

Others have sought to propose professional competencies as a way to characterize the work of instructional designers. Competencies can be defined as a set of observable knowledge, skills, attitudes, and behaviors (McLagan, 1997; Gupta, Sleezer, & Russ-Eft, 2007) that enable one to effectively perform the activities of a given occupation or function to the standards expected in employment (Koszalka, Russ-Eft, & Reiser, 2013; Richey, Fields, & Foxon, 2001). Similarly, the International Board of Standards for Training, Performance and Instruction (IBSTPI, 2016) defines a competency as “a knowledge, skill, or attitude that enables one to effectively perform the activities of a given occupation or function to the standards expected in employment.” The identification of key competencies is particularly useful for educational institutions because they provide guidelines for curriculum development, program revision, and accreditation (Byun, 2000; Klein & Jun, 2004).

In 2012, IBSTPI described the work of instructional designers through 22 competencies categorized into the following domains: professional foundations, planning and analysis, design and development, evaluation and implementation, and management. Meanwhile, the Association for Talent Development (ATD) has defined a set of foundational competencies that include business skills such as global mindset, industry knowledge, interpersonal skills, personal skills, and technology literacy. Areas of technical expertise include instructional design, learning technologies, performance improvement, evaluation, training delivery, managing learning programs, coaching, knowledge management, and change management (ATD, 2016). Additionally, the Canadian Society for Training and Development (CSTD) (2011) has proposed five competency areas: assessing performance needs, designing training, facilitating training, supporting the transfer of learning, and evaluating training.

Finally, while the above competency frameworks have been focused on the skills of instructional design professionals and closely related roles, in 2012 the Association for Educational Communications and Technology (AECT) adopted six key standards directed at education programs and provided them with a framework by which programs could determine student competence across six key dimensions: (a) Content knowledge, where students demonstrate the knowledge necessary to create, use, assess, and manage theoretical and practical applications of educational technologies and processes; (b) Content pedagogy, where students develop as reflective practitioners able to demonstrate effective implementation of educational technologies and processes based on contemporary content and pedagogy; (c) Learning environments, where students facilitate learning by creating, using, evaluating, and managing effective learning environments; (d) Professional knowledge and skills, where students design, develop, implement, and evaluate technology-rich learning environments within a supportive community of practice; and (e) Research, where students explore, evaluate, synthesize, and apply methods of inquiry to enhance learning and improve performance.

Prior research has looked at the alignment between instructional design program curricula and what professionals, including faculty, consider to be important skills. Fox and Klein (2003) surveyed 24 instructional design faculty members in 11 U.S. universities as well as 45 ISPI and ATD members and asked participants to rate the importance of having human performance technology (HPT) competencies for graduates of instructional systems/design/technology programs. The competencies listed in the survey instrument used by the authors were based on the Handbook of Human Performance Technology (Stolovitch & Keeps, 1999) and HPT courses offered by instructional design programs. Respondents indicated that instructional design graduates should have a broad knowledge of HPT and the performance improvement process, with training needs assessment rated as one of the top competencies. Fox and Klein (2003) also noted that most of the highly rated competencies do not receive extensive coverage in most ID programs. Specifically, they pointed out that many instructional design programs do not include human performance as a core course, and that there are numerous other discrepancies between instructional design program curricula and the competencies their respondents considered important.

Employer Expectations

Prior research has also sought to improve our understanding of employers’ expectations (Byun, 2000; Fox & Klein, 2003; Klein & Fox, 2004; Klein & Jun, 2014; Klein & Kelly, 2018; Larson & Lockee, 2007, 2008; Richey, Fields, & Foxon, 2001; Sugar, Hoard, Brown, & Daniels, 2012). Job announcement analysis has been one of the common methods for identifying competencies expected from professionals of a field (Kang & Ritzhaupt, 2015; Ritzhaupt, Martin, & Daniels, 2010; Moallem, 1995; Ritzhaupt & Martin, 2013; Sugar et al., 2012). An advantage of using job announcements is first-hand information on which competencies are in demand in the market (Klein & Kelly, 2018), which benefits not only university programs but also human resource consultants and recruiters who want to find out what skills employers are seeking (Sodhi & Son, 2009).

Ritzhaupt, Martin, and Daniels (2010) studied 205 job postings and 231 survey responses from instructional design professionals and found that the top four instructional design-related competencies included the ability to create effective instructional products (72%), apply sound instructional design principles (34%), conduct needs assessment (33%), and conduct evaluation (32%). Sugar et al. (2012) analyzed 615 job postings for instructional design and technology positions and identified skills expected of instructional design program graduates. The most sought-after competency areas were knowledge of instructional design and ADDIE, which appeared in more than 90% of the listings. The research identified evaluating the effectiveness of training programs and conducting needs analysis as a priority for more than 50% of employers. When disaggregated into corporate and higher education sectors, the researchers found that while more than 61% of corporate jobs required assessment skills, the number was far lower for higher education jobs at 33%.

E-learning, evaluation, and project management weresome other competencies highlighted by the study.

Carliner et al. (2015) explored the competencies needed by performance consultants. Studying 56 anonymized job descriptions for the position of performance consultant in Canada, they used qualitative content analysis techniques to create a profile of the performance consultant position. In doing so, they also compared the profile to the Training and Development Professionals competency model from the CSTD. They observed that project management competencies were in demand by employers but missing from the CSTD model.

Along with being included in the Competencies for Training and Development Professionals framework, needs assessment appeared as a priority area in 42 out of 56 (or 75%) job descriptions, making it a core area of responsibility. Design and development of programs and managing client relationships were other competencies prioritized in the job listings. Kang and Ritzhaupt (2015) analyzed 400 job postings for 43 related job titles in the instructional design field and found that along with knowledge of instructional design models and principles (55%), the top competency areas included learning management systems (33.75%) and assessment methods (27.5%).

Most recently, Klein and Kelly (2018) analyzed 393 job announcements and interviewed 20 instructional design managers to identify the instructional designer competencies expected by employers. The required competencies varied according to job sector (e.g., corporate, higher education, healthcare, and consulting). Effective collaboration (75%) and application of learning theory and principles (60%) showed up in all sectors. Additionally, ADDIE (67%), e-learning (64%), and needs assessment (53%) were also prominent competencies across the sectors. From these employer demand studies, we can see a pattern that typically, and not surprisingly, places instructional design, needs assessment, evaluation, and technology-related competencies as the top competency domains.

Methodology

The study sought to answer: What foundational competencies are prioritized by instructional design master's programs? Content analysis techniques were used to systematically review and code existing information available through instructional design program websites (Boettger & Palmer, 2010). Frequency counts were used to estimate the occurrence of particular instances and themes. This study used core courses as an indication of essential or prioritized program competencies because of the expectation that all students take these courses as part of their preparation.

The Journal of Applied Instructional Design, 10(2) 46

Competencies met through electives are optional, and students are not guaranteed to walk away with the competencies developed through those courses. The researchers reviewed core courses offered by 40 instructional design (or comparably named) master's programs across 34 universities in the United States (U.S.). Given that universities offer programs in educational technology related fields under different program titles based on their program emphasis and course structure (Almaden & Ku, 2017), an instructional design or comparable master's program was defined as one having some variation of the following titles: "Learning Design and Technology", "Educational Technology", "Instructional Design and Technology", "Instructional Systems Design", "Curriculum Development and Technology", "Organizational Performance and Workplace Learning", etc. This study did not include programs from related fields such as human resource development or organizational development because, while there may be overlap in courses, those programs are rooted in different disciplinary and theoretical perspectives. Programs in our sample were selected from the information found on the list of Best Masters in Educational Technology Programs on besteducationdegrees.com and by using online search engines such as Google to search for instructional design programs offered in the U.S.

The researchers reviewed all core courses across the 40 programs (a total of 249 courses) using content published on program websites. In most cases, core courses formed half of the total credits required to graduate. For each program, the researchers gathered data about the degree offered, name of the program, total credits, core courses, course requirements, mode of study (online or on campus), and the course objectives. Researchers also captured additional requirements for course completion such as thesis, internship, portfolio, etc.

The researchers analyzed published program descriptions in order to provide context for the analysis of courses. We noted total credits and their breakdown into core, electives, and specialization for each program as part of our study database. We used a grounded theory approach to guide the process of coding initial themes, clustering related themes into subcategories, and finally building research consensus through in-depth discussions. We developed a high-level map with identified priority competency areas in order to facilitate comparisons (Miles, Huberman, & Saldaña, 2014). The research team collectively reviewed and finalized the codes to be used.
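The coding-and-counting step described above can be sketched as a simple tally once codes are assigned. The six domain names below come from the study's findings, but the keyword map, the `code_course` helper, and the sample course titles are hypothetical illustrations, not the authors' actual codebook:

```python
from collections import Counter

# Hypothetical keyword-to-domain map (illustrative only, not the authors' codebook)
DOMAIN_KEYWORDS = {
    "Foundations of ID, Design and Development": ["instructional design", "design thinking"],
    "Research and Evaluation": ["evaluation", "research"],
    "Technology": ["technolog", "e-learning"],  # "technolog" matches technology/technologies
    "Teaching and Learning": ["learning theory", "adult learner"],
    "Management and Performance": ["project management", "performance"],
    "Needs Assessment": ["needs assessment", "strategic assessment"],
}

def code_course(title: str) -> str:
    """Assign a course title to the first domain whose keyword it contains."""
    t = title.lower()
    for domain, keywords in DOMAIN_KEYWORDS.items():
        if any(k in t for k in keywords):
            return domain
    return "Uncoded"

# Hypothetical sample of core-course titles
courses = [
    "Introduction to Instructional Design",
    "Evaluation of Learning and Performance",
    "Emerging Technologies",
    "Instructional Needs Assessment and Analysis",
]
counts = Counter(code_course(c) for c in courses)
```

The actual qualitative coding relied on researcher judgment and team consensus rather than keyword matching; the sketch only illustrates how per-domain frequency counts can be tallied once each course has been coded.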

Findings

Program Profiles

Various types of master's degrees are offered. Our sample included 15 Master of Science (MS), 10 Master of Education (M.Ed.), four Master of Arts (MA), three Master of Science in Education (MSEd), two Master of Arts in Education (MAEd), two Master of Educational Technology (MET), two Master of Science in Educational Technology (MSET), one Master of Arts in Educational Technology (MAET), and one Master of Learning Technology (MLT).

While program names varied, they included terms such as instructional, technology, design, systems, learning, and educational. As Figure 1 illustrates, technology/technologies were the most commonly used terms in program names (used in 33 of 40 program names). "Instructional", "design", and "learning" were also frequently used terms, followed by systems and education. Less frequently used terms included curriculum, performance, training, improvement, development, innovation, psychology, sciences, library media, and entrepreneurship.

Figure 1

Frequency of Terms found in Program Names


As Table 1 illustrates, 30 of the programs indicated preparing students to develop effective instruction (including design, development, and implementation) as their focus. Other commonly cited focus areas included preparing students to apply skills in a corporate environment (26 programs) and technology integration (24 programs), followed closely by application of skills in academic environments (22). Other focus areas included evaluation of effective instruction (17), improved performance (10), development of learning environments/systems (8), leadership (6), improved learning processes (4), and specific emphasis on K-12 environments (4).

Table 1

Focus Area Based on Published Descriptions


Program purpose as stated in program descriptions                    Count
1. Effective instruction development (Design, Develop, Implement)       30
2. Apply skills in corporate environment                                26
3. Technology integration                                               24
4. Apply skills in academic environment                                 22
5. Evaluate effectiveness of instruction                                17
6. Performance Improvement                                              10
7. Development of learning environments/systems                          8
8. Leadership                                                            6
9. Learning processes                                                    5
10. K-12                                                                 3

In terms of competency areas, six overarching themes emerged from the content analysis of the 249 courses, consisting of (a) Foundations of Instructional Design, Design and Development, (b) Research and Evaluation, (c) Technology, (d) Teaching and Learning, (e) Management and Performance, and (f) Needs Assessment. These overarching domains in turn consisted of a number of more specific subdomains, as illustrated in Table 2.

Table 2

Program Requirements – Core Courses


Competency Domain: Foundations of ID, Design and Development (72 courses)
Sub-domains (topics, with frequency): Overview of instructional design field (21); Design Instructional Interventions (20); Introduction to instructional design & development process and skills (13); Current issues and trends (8); Advanced ID (5); Design Thinking (3); Curriculum Development (1); Text-based instruction (1)
Sample course titles: Introduction to ID; Trends and Issues in Instructional Design; Current Trends in Instructional Technology; Instructional Design; Design Thinking and Knowledge; Advanced Instructional Design Theory; Message Design and Display

Competency Domain: Research and Evaluation (67 courses)
Sub-domains (topics, with frequency): Evaluations (24); Educational Research (17); Research Methodology (14); Learner Assessments (4); Inquiry and Measurement (5); Learning Analytics (3)
Sample course titles: Measurement and Evaluation in Education; Inquiry and Measurement; Data-Driven Decision-Making for Instruction; Evaluation of Learning and Performance; Research Methods in Education; Research in Instructional Technology

Competency Domain: Technology (51 courses)
Sub-domains (topics, with frequency): Integration of technology (21); E-learning (13); Emerging Technologies (8); Social, ethical, legal issues in technology (4); Library and Database, Digital Information (3); Computers - hardware (1); Multimedia in K-12 (1)
Sample course titles: Emerging Technologies; Technology and Design; Introduction to E-learning; Learning Management Systems; Interactive Course Design; Library and Digital Information Management; Foundations of Distance Education; Selection and Integration of Multimedia for K12 Schools

Competency Domain: Teaching and Learning (30 courses)
Sub-domains (topics, with frequency): Learning Theories (23); Adult Learner (3); Educational Psychology (2); Roles of teachers (2)
Sample course titles: The Adult as Learner; Learning Theory

Competency Domain: Management and Performance (22 courses)
Sub-domains (topics, with frequency): Performance Improvement (6); Project Management (5); Human Performance Technology (4); Communication (2); Diversity (2); Organizational learning (1); Financial Impact of Training and Performance Improvement (1); Leadership (1)
Sample course titles: Foundations of Project Management; Leadership and Education; Social/Cultural Issues in Educational Technology; Intro to HPT; Foundations of Performance Improvement; Return on Investment in Training and Performance Improvement

Competency Domain: Needs Assessment (7 courses)
Sub-domains (topics, with frequency): Needs Assessment (5); Strategic Assessment (2)
Sample course titles: Instructional Needs Assessment and Analysis; Strategic Assessment in Education

The strongest theme to emerge from core courses was Foundations of Instructional Design, Design and Development (72 courses), followed by Research and Evaluation (67 courses) and Technology (51 courses), while Needs Assessment (seven courses) was at the bottom of the list. Among the sub-domains (i.e., specific course topics), evaluation was the most common, included as a core course by 24 programs, closely followed by learning theories (23 programs). Overview of ID and technology integration followed, with 21 programs each.
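As a quick arithmetic check on the findings, the counts above (taken directly from Table 2) can be converted into shares of the 249 core courses; the rounded percentages match those cited elsewhere in the article:

```python
# Core-course counts per competency domain, as reported in Table 2
domain_counts = {
    "Foundations of ID, Design and Development": 72,
    "Research and Evaluation": 67,
    "Technology": 51,
    "Teaching and Learning": 30,
    "Management and Performance": 22,
    "Needs Assessment": 7,
}

total = sum(domain_counts.values())  # 249 core courses across 40 programs
shares = {d: round(100 * n / total) for d, n in domain_counts.items()}
# shares per domain: 29%, 27%, 20%, 12%, 9%, and 3% respectively
```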

Foundations of Instructional Design, Design and Development

The Foundations of Instructional Design, Design and Development theme includes an overview of the field of Instructional Design and Technology emphasizing history and guiding principles; knowledge, skills, and abilities; and ethical foundations of the field, as well as current trends and future implications. This category also includes design and development courses offering an overview of the instructional design process along with topics such as design thinking, curriculum development, and learning strategies. Thirty-three programs required core courses covering at least one of the topics in this domain. Our sample contained 72 courses covering topics under this domain, with some programs covering more than one topic. The most commonly offered course titles were Introduction to Instructional Design, Trends and Issues in Instructional Design, Current Trends in Instructional Technology, Instructional Design, Advanced Instructional Design Theory, and Message Design and Display.

Research and Evaluation

Courses covering evaluation of instructional interventions and building skills for making data-based decisions related to learning and human performance, as well as courses related to research methodology and educational research, are included under the Research and Evaluation domain. It is a widely covered competency domain, with 33 university programs requiring at least one course from this domain. The term evaluation appeared in 24 required courses with titles such as Measurement and Evaluation in Education and Evaluation of Learning and Performance. Research courses like Research Methods in Education and Research in Instructional Technology were also included. Overall, 67 required courses fell under this domain.

Technology

Technology-focused courses had diverse titles. Integration of technology in instructional design, e-learning, emerging technologies and their potential impact on classrooms, technology applications as effective tools in teaching and learning, and learning management systems are some of the topics that make up the Technology domain. Technology was the focus of 51 required courses across twenty-eight programs and included course titles such as Emerging Technologies, Technology and Design, and Introduction to E-learning.

Teaching and Learning

This domain includes course topics related to learning theory and its application to the instructional design process, adult learners, the role of teachers, and an introduction to theories emerging from research on educational psychology. A total of 30 courses in our sample are captured in this domain, with learning theories being the most common course, required as a core course by 23 programs.

Management and Performance

This theme includes courses that address a variety of environmental variables that affect behavior and/or performance and covers courses like Project Management, Human Performance Technology (HPT), and Performance Improvement. A total of 22 required courses fell under this domain, with course titles such as Foundations of Project Management, Intro to HPT, and Foundations of Performance Improvement.

Needs Assessment

This domain consists of courses that focus on a variety of front-end or needs assessment processes, with some variation of strategic assessment, needs assessment models, validating programs, and assessing the continuing validity of ongoing programs based on needs. A total of seven required courses fell under this category, with course titles such as Instructional Needs Assessment and Analysis and Strategic Assessment in Education.

Discussion

A Primary Focus on Designing Instruction?

While our review of the research literature suggests that professional standards and employer demand continue to evolve, the extent to which the current core courses of instructional design programs are aligned to these expectations is variable. As suggested by Reiser and Dempsey (2018), the role of instructional design professionals includes the resolution of learning and performance problems in a variety of settings. Yet the majority of the program descriptions in our sample (75%) indicated their focus was to prepare students to develop effective instruction, with an additional five programs indicating their focus was on improving the learning process. Design was the strongest theme to emerge, with 72 (29%) core courses across the 40 programs in our sample. This is perhaps not surprising given our core identity as an instructional design field; moreover, it would be sensible to continue to find design as a core competency domain for instructional design professionals in the future. The question is whether we are preparing instructional design professionals with a complementary set of foundational skills to enable them to address learning and performance problems effectively and sustainably. Instruction may or may not address performance problems, and if we define our goal as developing instruction, we neglect to develop and apply other skills that are essential for effectively addressing performance problems.

In contrast, one-fourth (25%) of program descriptions alluded to performance improvement as part of their focus. Fox and Klein (2003) previously observed that, due to the traditional focus on training solutions, instructional design programs may be struggling with the extent to which they should incorporate performance improvement or human performance technology in their curricula. By studying the responses of instructional design faculty members and members of local chapters of ISPI and ATD to their survey, they noted that instructional design graduates should have a broad knowledge of performance and performance improvement processes. The authors went on to assert that most instructional design programs do not require performance improvement courses, indicating discrepancies between the curricula offered by the programs and the competencies that professionals working in the field consider important.

A Systemic Approach Is Vital to Assessing Needs and Improving Performance

Properly preparing students to address performance issues requires more than a program description page or introductory courses to performance improvement. If we are serious about addressing learning and performance problems, or leveraging opportunities, we must begin with a real diagnosis or needs assessment. A needs assessment has the greatest utility and impact when we take a systems approach because it provides us with a holistic view of reality, helps us distinguish assumptions from facts, validates evidence-based needs, and reduces the risk of wasting precious resources on designing solutions (particularly instruction) that will not address underlying issues or get us much closer to expected outcomes.

Identifying learning and performance issues requires us to understand the system, which is made up of interrelated factors and dynamics that create and sustain those recurring issues. A systems approach to needs assessment allows us to clearly define the outcomes that the system should deliver, the root causes or barriers that are getting in the way of achieving those outcomes, and the requirements that must be met by the solution(s). This, in turn, gives us a strong foundation with which to judge the appropriateness of proposed solutions (Kaufman & Guerra-López, 2013; Guerra-López, 2018; Guerra-López & Hicks, 2017). Based on this description of a systems approach to needs assessment, one might reasonably see the importance of this competency in enabling ID professionals to address learning and performance problems, as stated in Reiser and Dempsey's (2018) definition, and in adding the type of measurable value that employers expect.

Interestingly, needs assessment represents the weakest area of instructional design professionals' preparation, as indicated by seven (3%) core courses on some variation of needs assessment across 40 master's programs. Yet across many professional organizational competency frameworks and standards (i.e., ISPI, IBSTPI, CSTD, ATD) and employer demand research (Carliner et al., 2015; Klein & Kelly, 2018; Sugar et al., 2012), needs assessment has repeatedly ranked as a priority competency. Richey, Morrison, and Foxon (2007) used the Occupational Information Network (O*NET), an online database containing hundreds of occupational definitions and work demand trends in the U.S., and found that conducting needs assessments and strategic learning assessments showed up as one of the most important tasks for the instructional technologist. This seems to suggest that it would behoove ID programs to reflect on where and how in their curriculum they are preparing students to meet the demands of employers as it pertains to essential needs assessment approaches and skills.

Setting

Consistent with earlier definitions of the field, which placed boundaries around educational settings, 22 program descriptions indicated their students are prepared to work in academic environments, and another four programs in our sample indicated having a specific emphasis on K-12 environments. Yet the research literature suggests that instructional design professionals work primarily in business and industry (Byun, 2000; Klein & Kelly, 2018; Heideman, 1991; Larson & Lockee, 2008; Moallem, 1995; Richey et al., 2007). Our findings suggest that there may be some moderate alignment to this target setting, with 26 of 40 (65%) programs indicating they prepare students to work in corporate environments. Setting is important because it influences our view of the system and, in turn, how we define the conditions and objectives of interest. While instructional design professionals use principles, evidence, models, and tools that can be useful across a variety of settings, the solutions we design, the stakeholders we work with, the approach to implementation and management, and many other important details must also be contextualized to the realities and dynamics of specific settings.

Technology

Not surprisingly, technology also emerged as a strong theme, with 24 (60%) of the program descriptions indicating their focus included technology integration and 51 (20%) core courses in the sample fitting into a technology dimension. Berge, de Verneil, Berge, Davis, and Smith (2002) forecasted that instructional design professionals will continue to be expected to increase their competence with technology given the technological transformation that our society is undergoing. Digital disruption is already a reality. Klaus Schwab (2016), the Founder and Executive Chairman of the World Economic Forum, argues that we are undergoing a fourth industrial revolution driven by technology, and that it is evolving at an exponential rather than a linear pace, disrupting practically every industry. This has significant implications for the workplace and education. Jobs are changing, with some becoming obsolete, others changing in fundamental ways, and entirely new types of jobs emerging. This has significant implications for the evolution of workplace learning and performance as well as how we educate learners in educational settings at every level.

Technology has the potential to significantly strengthen the ability of educational institutions to deliver on their core educational mission with greater quality, efficiency, and effectiveness. However, digital tools have been used largely under traditional models of teaching and learning (e.g., drill-and-skill instruction, information transmission), and not surprisingly, we have not seen significant improvement of results (Fishman & Dede, 2016). Instructional design professionals can provide leadership in instructional strategies and approaches informed by advances in learning science and technology innovations, coupled with a systems approach and a clear focus on improving results. Thirty (12%) core courses in our sample fell under the teaching or learning theme, suggesting that perhaps there is an opportunity to better balance student preparation on technology innovations with advances in learning science. Certainly, without taking a closer look at the actual course syllabi, it is difficult to say with certainty that learning science is not already being incorporated into other courses such as instructional design and technology. But given the variety of disciplines from which instructional design programs draw students, our stated intent to improve learning, and the ongoing advances in learning sciences, it would seem logical to expect a stronger prevalence of courses in learning sciences.


Research and Evaluation

Evaluation emerged as the second strongest theme, with 67 (27%) of core courses offered in some variation of research and evaluation skills. This gives us a positive outlook on the contributions that instructional design professionals can make and says something about our strong tradition in the application of research and evidence in our field. As a continuous improvement tool, evaluation can help instructional design professionals and those they work with focus on well-defined needs and priorities so that learning and performance improvement solutions are clearly aligned to desired results at the individual, team, institution, and external impact levels (Guerra-López, 2008). It can help us learn how, when, why, and for whom learning and performance improvement solutions work as intended. If clearly aligned to key decision-making needs, evaluation can serve as a source of timely feedback, allowing us to promptly adapt learning and performance improvement solutions, ensuring alignment to targeted results (Guerra-López, 2018).

Based on the emphasis on research and evaluation in instructional design programs, it would seem that instructional design professionals tend to be well prepared in evaluation and research skills. As previously mentioned, the lens we use to frame our scope can significantly impact the value we add. Applying these research and evaluation skills in the context of a systems perspective can influence our understanding of desired outcomes and whether we are merely making short-term improvements without meaningful contribution to long-term results (Clark & Estes, 2008; Guerra, 2008).

Conclusion

While there seems to be some alignment between professional expectations and the aggregate instructional design master's curriculum, particularly around instructional design, research and evaluation, and technology, this study's findings suggest there may still be gaps to further validate and address when it comes to needs assessment and systemic approaches. This is particularly important given our stated role in addressing learning and performance problems. Without fully defining what problem should be addressed and understanding the system variables, recurring loops, and dynamics sustaining that problem, we run the risk of applying our design skills to inadequately defined problems. These problem definition issues at the front end provide a flawed frame for subsequent improvement efforts and ultimately lead to poorly focused evaluations that draw flawed conclusions about the effectiveness of solutions, generating a false sense of accomplishment and value. In an ever more interconnected world, the demand for systems thinking and systemic approaches that involve interdisciplinary and multi-stakeholder methods to solve complex societal problems will continue to grow. This represents an important opportunity to raise the profile and value of instructional design professionals.

References

Almaden, A., & Ku, H. (2017). Analyzing the curricula of Doctor of Philosophy in educational technology-related programs in the United States. Educational Media International, 54(2), 112-128. doi:10.1080/09523987.2017.1364210

Berge, Z., de Verneil, M., Berge, N., Davis, L., & Smith, D. (2002). The increasing scope of training and development competency. Benchmarking: An International Journal, 9(1), 43-61.

Boettger, R. K., & Palmer, L. A. (2010, December). Quantitative content analysis: Its use in technical communication. IEEE Transactions on Professional Communication, 53(4), 346-357. doi:10.1109/TPC.2010.2077450

Byun, H. (2000). Identifying job types and competencies for instructional technologists: A five year analysis [Doctoral dissertation, Indiana University]. Dissertation Abstracts International, 61, 4346. doi:10.17232/KSET.17.1.57

Carliner, S., Castonguay, C., Sheepy, E., Ribeiro, O., Sabri, H., Saylor, C., & Valle, A. (2015). The job of a performance consultant: A qualitative content analysis of job descriptions. European Journal of Training and Development, 39(6), 458-483. doi:10.1108/ejtd-01-2015-0006

Cassidy, M. F. (1982). Toward integration: Education, instructional technology, and semiotics. Educational Communications and Technology Journal, 20(2), 75-89.

Clark, R. E., & Estes, F. (2008). Turning research into results: A guide to selecting the right performance solutions. Information Age.

Ely, D. P. (Ed.). (1963). The changing role of the audiovisual process in education: A definition and a glossary of related terms. TCP Monograph No. 1. AV Communication Review, 11(1), Supplement No. 6.

Fishman, B., & Dede, C. (2016). Teaching and technology: New tools for new times. In D. H. Gitomer & C. A. Bell (Eds.), Handbook of research on teaching (5th ed., pp. 1269-1334). American Educational Research Association.


Furst-Bowe, J. A. (Ed.). (1996). Competencies needed to design and deliver training using instructional technology. In Proceedings of selected research and development presentations at the 1996 national convention of the Association for Educational Communications and Technology. (ERIC Document Reproduction Service No. ED397795). Retrieved April 24, 2021, from https://www.learntechlib.org/p/80106/

Fox, E. J., & Klein, J. D. (2003). What should instructional designers and technologists know about human performance technology? Performance Improvement Quarterly, 16(3), 87-98.

Gentry, C. G. (1995). Educational technology: A question of meaning. In G. Anglin (Ed.), Instructional technology: Past, present, and future. Libraries Unlimited.

Guerra, I. (2008). Outcome-based vocational rehabilitation: Measuring valuable results. Performance Improvement Quarterly, 18(3), 65-75. doi:10.1111/j.1937-8327.2005.tb00342.x

Guerra-López, I., & Hicks, K. (2017). Partner for performance: Strategically aligning learning and development. ATD Press.

Guerra-López, I. (2018). Ensuring measurable strategic alignment to external clients and society. Performance Improvement, 57(6), 33-40.

Gupta, K., Sleezer, C. M., & Russ-Eft, D. F. (2007). A practical guide to needs assessment (2nd ed.). Pfeiffer/John Wiley & Sons.

Heideman, J. G. (1991). A forecast of the competencies required for effective performance by instructional technology practitioners in the year 2000 (Order No. 9127229). Available from ProQuest Dissertations & Theses Global. (303934240). https://proxy.lib.wayne.edu/login?url=https://www-proquest-com.proxy.lib.wayne.edu/dissertations-theses/forecast-competencies-required-effective/docview/303934240/se-2?accountid=14925

IBSTPI. (2016). Instructional design competencies. Retrieved from https://ibstpi.org/instructional-design-competencies/

Kang, Y., & Ritzhaupt, A. D. (2015). A job announcement analysis of educational technology professional positions. Journal of Educational Technology Systems, 43(3), 231-256. doi:10.1177/0047239515570572

Kaufman, R. (1996). Visions, strategic planning, and quality—more than hype. Educational Technology, 36(5), 60-62. Retrieved April 24, 2021, from http://www.jstor.org/stable/44428366

Kaufman, R. (2019). Revisiting mega as the central focus for defining and delivering worthy results. Performance Improvement, 58(7), 20-23. doi:10.1002/pfi.21885

Kaufman, R., & Guerra-López, I. (2013). Needs assessment for organizational success. ASTD Press.

Klein, J. D., & Fox, E. J. (2004). Performance improvement competencies for instructional technologists. TechTrends, 48(2), 22-25. doi:10.1007/bf02762539

Klein, J. D., & Jun, S. (2014). Skills for instructional design professionals. Performance Improvement, 53, 41-46. https://doi-org.proxy.lib.wayne.edu/10.1002/pfi.21397

Klein, J. D., & Kelly, W. Q. (2018). Competencies for instructional designers: A view from employers. Performance Improvement Quarterly, 31(3), 225-247. doi:10.1002/piq.21257

Koszalka, T., Russ-Eft, D., & Reiser, R. (2013). Instructional design competencies: The standards (4th ed.). Information Age Publishing.

Larson, M. B., & Lockee, B. B. (2007). Preparing instructional designers for different career environments: A case study. Educational Technology Research and Development, 57(1), 1-24. doi:10.1007/s11423-006-9031-4

Larson, M. B., & Lockee, B. B. (2008). Instructional design practice: Career environments, job roles, and a climate of change. Performance Improvement Quarterly, 17(1), 22-40. doi:10.1111/j.1937-8327.2004.tb00300.x

Liu, M., Gibby, S., Quiros, O., & Demps, E. (2002). Challenges of being an instructional designer for new media development: A view from the practitioners. Journal of Educational Multimedia and Hypermedia, 11(3), 195-219. Retrieved April 24, 2021, from https://www.learntechlib.org/primary/p/9266/

McLagan, P. (1996, January). Great ideas revisited. Training & Development, 50(1), 60-66.

Miles, M. B., Huberman, A. M., & Saldaña, J. (2014). Qualitative data analysis: A methods sourcebook (3rd ed.). SAGE.

Moallem, M. (1995). Analysis of job announcements and the required competencies for instructional technology professionals [Paper presentation]. The Annual Meeting of the American Educational Research Association, San Francisco, CA, United States.

Reiser, R. A., & Dempsey, J. V. (2018). Trends and issues in instructional design and technology. Pearson Education.

Richey, R. C., Fields, D. C., & Foxon, M. (2001). Instructional design competencies: The standards (3rd ed.). ERIC Clearinghouse on Information Technology.

Richey, R. C., Morrison, G., & Foxon, M. (2007). Instructional design in business and industry. In R. A. Reiser & J. V. Dempsey (Eds.), Trends and issues in instructional design and technology (2nd ed., pp. 174-184). Prentice Hall.

Ritzhaupt, A. D., & Martin, F. (2013). Development and validation of the educational technologist multimedia competency survey. Educational Technology Research and Development, 62(1), 13-33. doi:10.1007/s11423-013-9325-2

Ritzhaupt, A., Martin, F., & Daniels, K. (2010). Multimedia competencies for an educational technologist: A survey of professionals and job announcement analysis. Journal of Educational Multimedia and Hypermedia, 19(4), 421-449.

Rummler, G. A. (2007). Serious performance consultingaccording to Rummler. John Wiley & Sons.

Silber, K. H. (1970). What field are we in, anyhow? Audiovisual Instruction, 15(5), 21-24.

Stolovitch, H. L., & Keeps, E. J. (1999). Handbook of human performance technology (2nd ed.). Jossey-Bass.

Sugar, W., Hoard, B., Brown, A., & Daniels, L. (2012). Identifying multimedia production competencies and skills of instructional design and technology professionals: An analysis of recent job postings. Journal of Educational Technology Systems, 40(3), 227-249. doi:10.2190/et.40.3.b


Ladders and Escalators: Examining Advancement Obstacles for Women in Instructional Design

Jeremy Bond, Kathryn Dirkin, Alexa Jean Tyler, & Stefanie Lassitter

Careful analysis of survey data from Bond and Dirkin (2018) indicates the possible presence in instructional design of a phenomenon known as the glass escalator, first put forth by Williams (1992; 2013). The glass escalator effect surfaced in female-majority professions, indicating that males experience advantages due in part to their tokenism and social standing. The degree to which these factors are present varies from study to study and is affected by the continuing evolution of the professions selected for investigation. The findings in this study note more significant leadership experiences and more frequent involvement in functions extending beyond a traditional instructional design scope among male instructional designers, despite their minority status in the field. Though some factors that could account for the disparity are present, in other cases conditions are contradictory or inconclusive. This analysis addresses an area of research thus far absent from instructional design but common in investigations of other, similarly female-dominated fields. As the importance of instructional design increases, so does the need to more fully understand the field and the conditions affecting its practice.

Introduction

Instructional design is recognized simply as "the systematic process of translating principles of learning and instruction into plans for instructional materials and activities" (Smith & Ragan, 2005, p. 2). The Cambridge Business English Dictionary (n.d.) defined instructional design as "the design of systems, computer programs, etc. to help people learn more effectively." Particularly in higher education, instructional designers have served a foundational role in the shift away from brick-and-mortar institutions to blended and online delivery. Despite these relatively simple definitions, practitioners – the instructional designers – are often called upon to serve in additional capacities, from media designers to faculty developers, faculty themselves, and very often project managers and team leads (Sharif & Cho, 2015). In fact, considerable literature from the field of instructional design indicates that practitioners should be prepared to serve as leaders in their organizations (Ashbaugh, 2013; Bean, 2014), perhaps even at the executive level in specific settings, such as community colleges (Boyle, 2011). In addition, pertinent studies also commonly find that instructional designers are called upon to function in a variety of other ways, in and beyond the scope of their traditional role (Bond & Dirkin, 2018; Gibby et al., 2002; Intentional Futures, 2016; Sharif & Cho, 2015). So prevalent is the diversification of the instructional designer's role that often only a small minority of professionals actually invest a majority of their time in actual instructional design work (Bond & Dirkin, 2018; Gibby et al., 2002; Sharif & Cho, 2015). This leads to instructional designers being known for working in a "meta" field, making it incredibly cumbersome to define and place their work in a particular box (Bodily et al., 2019; Wingfield, 2009). These themes of role diversification and leadership responsibility indicate potential opportunities within instructional design for career growth, variety, and advancement.

The demand for and diversification of instructional designers' roles within higher education institutions have steadily increased, with universities such as Utah State University (USU) completely rebranding their Instructional Technology Department to include the learning sciences, "a change so dramatic that it warranted an article explaining the rationale" (West et al., 2017, p. 870). USU is not the only university to undergo such a dramatic change to its instructional technology department; others have created new departments ultimately to provide space for instructional designers to work (West et al., 2017). The field at large is projected to grow at a rate of seven percent well into the next decade (U.S. Bureau of Labor Statistics, 2019). Presumably with these notions in mind, Bond and Dirkin (2018) conducted an in-depth investigation of the current state of instructional design practice. The research included inquiry related to leadership functions and role diversification occurring among instructional designers.

In January of 2018, Bond and Dirkin distributed a survey (see Appendix) soliciting responses from instructional designers. Recipients of the invitation to participate included a state-by-state reference of individuals serving in teaching and learning/e-learning/instructional design leadership roles, as well as subscribers to the email lists of the Michigan Blackboard Users Group (MiBUG), University Professional and Continuing Education Association (UPCEA), Arizona State University Blackboard Users Group, and the Professional and Organizational Development (POD) Network. Though Bond and Dirkin (2018) initially explored the prevalence of work beyond traditionally accepted definitions of instructional design, the gender of respondents was also solicited in the original survey. The investigation of gender's role relative to certain aspects of the subject pool's experience yielded potentially notable and certainly interesting results. What follows is a concise overview of instructional design's evolution into a female-dominated field, an introduction to a gender-related phenomenon known as the glass escalator (Williams, 2013), and a data analysis surfacing differences between males and females in the instructional design role and practice. Discussion and implications of the data analysis findings and recommendations for necessary future research are also included.

Literature Review

Instructional Design’s Female Predominance

Instructional design originated in the 1940s and had established itself as a field independent from those from which it emerged by the 1960s (Reiser, 2001). Though instructional design is predominantly female today, this was not always the case. When instructional design emerged from the field of psychology, female predominance was not a hallmark of instructional design or of many other fields of the time. While documented cases of women representing a minority stake in instructional design are severely limited, some vignettes are beginning to emerge in books and journals. In her chapter, "Mentoring and the Role of Women in Instructional Design and Technology" in Women's Voices in the Field of Educational Technology, Richey (2016) detailed her career as a woman within the instructional design field, starting in 1971, and her experiences as a minority among men. After graduation, Dr. Richey was the only female faculty member within her instructional technology program in the 1980s (Richey, 2016). Throughout her career, she searched for other female practitioners with limited success. While this is only one individual's experience, it helps paint a picture of how scarce women were in the field only a few decades ago.

Today, however, a clear female majority exists in the field. Depending upon the source of data one consults, instructional design is approximately 70 percent female and 30 percent male. Bond and Dirkin (2018) found that, among 254 subjects, 67 percent of practitioners were female and 30 percent male, with the remaining three percent indicating either non-binary gender or abstaining. These gender demographics fundamentally align with the findings of Intentional Futures (2016) and the U.S. Bureau of Labor Statistics (2019). The Intentional Futures study of higher education found that "instructional designers are 67% female and their average age is 45" (Intentional Futures, 2016, p. 6). In the U.S. Bureau of Labor Statistics data, the closest categories relate to the larger field of training and development specialists or managers, and they report only a minor discrepancy from these figures. Other predominantly female occupations similarly evolved, over varying arcs of time, to become female majority.

Before the U.S. Civil War, for example, men were more likely than women to be employed as teachers (Hodson & Sullivan, 1990). A number of historical events served as pivots impacting the gender balance of the workforce. The 1870 United States Census was the first to record female engagement in employment, finding at the time that women made up 15 percent of the total workforce. The first and second World Wars likewise created more opportunities for women in the workforce, further transitioning many females into occupations beyond the home (Green, 2000). Despite a relative balance of males and females in the employable populace, instructional design is among several fields known to be gender-segregated. Gender segregation in the workforce was identified decades ago as "one of the most perplexing and tenacious problems in our society" (Williams, 1992, p. 253) but has nevertheless persisted. Impacts arising from the gender composition within a field may be unexpected and concerning. The following section provides an overview of one such phenomenon related to gender, tokenism, and social status, and the collective potential of these factors to affect advancement and opportunity.

The Glass Escalator

In groundbreaking research, Williams (1992) concluded that males generally encounter structural advantages when employed in predominantly female professions. Borrowing terminology from an occurrence better known at the time, the glass ceiling, Williams (1992) referred to the advantages experienced by males working in female-majority fields as the glass escalator. Whereas the glass ceiling posits the existence of an invisible barrier that prevents women from reaching top positions in organizations (Hymowitz & Schellhardt, 1986), the glass escalator indicates there exist "subtle mechanisms [that] seem to enhance men's position in [female-majority] professions" (Williams, 1992, p. 263). The theory indicates that part of the advantage comes from tokenism, the relative numeric rarity of a particular group or representative of a group within a larger context (Kanter, 2008). The second component is the token group member's social status; even more so than numeric rarity, social status "determines whether the token encounters a 'glass ceiling' or a 'glass escalator'" (Williams, 1992, p. 263). In other words, the glass escalator idea suggests that the higher social status of males over females affords them an advantage even when they are a minority in a given profession: "While women climb the ladder in female-dominated professions, their male peers glide past them on an invisible escalator, shooting straight to the top" (Goudreau, 2012, para. 3).

Williams' (1992) work was qualitative and involved interviews with 99 subjects (76 men and 23 women) between 1985 and 1991. At the time of the study, the subjects were employed in one of four fields: nursing, elementary school teaching, librarianship, and social work. These professions have been described as "…pink-collar occupations because of their higher likelihood of [males] being promoted…" (Dill et al., 2016). These four professional arenas, the female semiprofessions – occupations that require advanced knowledge and skills but are not widely regarded as true professions (Hodson & Sullivan, 1990) – ranged in gender composition from just five and a half percent male in nursing to a high of 32 percent male in social work. The research revealed several thought-provoking findings. Among them, and at the core of the glass escalator phenomenon, were male subjects reporting the belief that being male had made a primarily positive difference in the opportunities they received.

Furthermore, male subjects were cognizant of their tokenism, often indicating their own awareness that there were relatively few men in their fields. Even in those cases when subjects reported what was initially perceived as internal discrimination against them, later events left them in increasingly advantageous positions, with more authority and increased status (Dill et al., 2016). One subject, when asked whether he had considered suing over being transferred out of a job because of how he had been received as a male, responded in part, "I've got a whole lot more authority here. I'm also in charge…and I've recently been promoted" (Williams, 1992, p. 263). Even in pre-service contexts, Williams found advantageous circumstances emerging for men as they prepared to enter female-majority fields. Subjects reported what was perceived as extra encouragement from professors and administrators because they were male and studying in female-dominated disciplines. The glass escalator is present in hiring practices as well. In neither pre-service experiences nor the hiring phase were advantages extended to males only by other males.

On the contrary, females were also noted as advancing males in female-dominated fields. Williams (1992) indicated that women were enthusiastic about men entering their fields. Subjects noted their career success having been facilitated in a multitude of ways by women. One subject reportedly recognized being "extremely marketable because I am a man" (Williams, 1992, p. 256); another indicated being told by a hiring manager, "it's nice to have a man because it's such a female-dominated field" (p. 256). Simultaneously, female subjects were noted showing some resentment for the perceived ease of advancement that men in higher positions experienced. Also of interest were other ways, in each of the fields explored, that men were sometimes treated differently. This was not in the way one might expect – experiencing a "poisoned work environment" (Williams, 1992, p. 260), as women sometimes do when entering male-dominated fields.

Indeed, male subjects in the study reported no instances of sexual harassment. The differences noted were more closely related to what was asked of them on the job. Specifically, male nurses were more likely to be asked to assist with male patients' procedures or to lift heavy patients. Similarly, male librarians believed they were called upon more often when boxes of books needed lifting. Within the teaching field, however, something beyond the scope of the job was noted, as one subject indicated "…teaching with all women, and that can be hard sometimes" and went on to share, "if somebody gets a flat tire, they come and get me… there are just a lot of stereotypes" (p. 260). Some subjects, in each profession, shared being bothered by the various forms of special treatment; others indicated that no distress was caused by it. A third group felt more valued by what they saw as appreciation and an opportunity to contribute to the profession in other ways with "special traits and abilities (such as strength)" (p. 261). Williams concluded that more work would be needed to integrate men and women into the labor force. From this early work, several other related studies have sprung, some of which brought the work up to date, built upon it, or both.

In a study that both expanded upon and updated Williams' (1992) research, Budig (2002) examined three differently composed populations – female-dominated, gender-balanced, and male-dominated – to determine whether male advantage is the same across settings. The research focused on wage levels and wage growth and found that males had a wage-growth advantage across all three groups. Though the wage-growth advantage was greatest in male-dominated fields, it was also present in female-dominated fields. In the latter case, the difference is interpreted as a systemic devaluation of a field due to its female majority (England, 1992; Kilbourne et al., 1994, as cited in Budig, 2002). Overall, males experienced wage growth three percent faster than females, though this research does not necessarily support the existence of the glass escalator in the way posited by Williams (1992). Instead, though an advantage was noted in female-majority fields, the most significant advantage for men was found not there but in male-majority fields.


Further, more recent quantitative tests of the glass escalator have produced varying results, with some supporting the glass escalator hypothesis, some introducing mitigating factors, and other research offering evidence to the contrary (Smith, 2012). Huffman (2004) illuminated an additional nuance, concluding that the effect of gender composition on wage inequality increases with job rank. In other words, as males move up in a female-majority field or organization, so does the magnitude of their advantage. In a related, earlier longitudinal study, Hultin (2003) investigated advancement opportunity and found that men who work in fields typically viewed as female occupations have much greater opportunities for internal promotion than their female counterparts. Hultin controlled for possible differences in gender-specific preferences and ambitions (e.g., premarket career preference, attitudes toward upward movement). In this way, the men and women Hultin included were even more equivalently compared. Ultimately, the results further indicate that the disadvantage to women is a gender-specific effect, thereby offering additional support for the glass escalator concept.

Smith (2012) introduced employer-sponsored benefits as an area for exploration and found additional support for the glass escalator, concluding it is both "gendered and racialized" (p. 168). Additionally, Smith (2012) found that "white men experience a double advantage based on the fact that they possess two socially valued statuses" (p. 168), being white and male, faring better in terms of career advancement when compared with female colleagues at similar levels. These findings further support Williams' (1992) claim that the social status of the token's group, not their rarity alone, creates the conditions for the glass escalator.

Snyder and Green (2008) conducted a qualitative investigation into glass escalator phenomena among registered nurses and found contrary evidence. Specifically, while gender segregation, or concentration in certain horizontal specializations, existed, males were not disproportionately represented in higher-level administrative positions. However, it may be worth noting that male representation in the nursing field also grew considerably over the period encompassing the work of Williams (1992) and Snyder and Green (2008): the proportion of male registered nurses more than tripled, from just 2.7 percent in 1970 to 9.6 percent in 2011 (Landivar, 2013). This growth may signal factors that served to level the playing field in the profession. Nevertheless, and despite some mixed findings, the literature overall tends to support the existence of a glass escalator and collectively supports the notion of male advantage in various settings, particularly those that are predominantly female (Alegria, 2019; Snyder & Green, 2008; Williams, 1992).

Alegria (2019) built upon the original concept of the glass escalator but looked more closely at how women, particularly white women, are given slightly more advantages in technology-based industries than women of color. During the 1990s, the role of women within the technology field reached a peak and has since declined as the field returned to a primarily male-dominated occupation (Alegria, 2019). More specifically, "women of color remain numerical minorities," which makes the study of tokenism within technology-based occupations fruitful for those looking to better understand the concept of the glass escalator (Alegria, 2019, p. 2). In her study and literature review, Alegria (2019) found that women tend to move into technical roles, and of those who do, white women more regularly move into managerial-level positions, similar to their male counterparts. Is it possible that the tokenism experienced by males within instructional design, and other fields, is similar to the tokenism that white women may experience in relation to their non-white female counterparts? Alegria's research draws attention to the notion that there are many levels of the glass escalator that move beyond gender and include race. King et al. (2017) focused heavily on this intersectionality within the glass escalator through literature review and analysis. The effect of this intersectionality varies by industry, but race appears to be the most prominent factor following gender. While the work conducted by Bond and Dirkin (2018) does not focus on intersectionality within the glass escalator, it is essential to be cognizant that many layers related to identity groups can be affected and should be further researched.

Instructional Designers and the Glass Escalator

As noted earlier, multiple sources align instructional design with other gender-segregated fields. Specifically, its characteristic female majority of about 70 percent (Bond & Dirkin, 2018; U.S. Bureau of Labor Statistics, 2019; Intentional Futures, 2016) makes it similar in that regard to nursing, social work, librarianship, and elementary and special education. Among the fields addressed thus far, instructional design aligns most closely with social work (Williams, 1992), wherein males account for approximately 30 percent of individuals in the field. While considerable research exists investigating the varying impact of instructional design products among learners of different genders, very limited study of gender's role in instructional design practice has occurred. Though questions have been asked in the literature with regard to instructional design over the past few decades (Gray et al., 2015), a lack of understanding persists (Smith & Boling, 2009). Particularly concerning gender's impact, the analysis offered by Bond and Dirkin (2018) addressed an arena ripe for additional research and exploration.


In her autoethnography, Campbell (2015) shared her personal experiences as an individual who identifies as a woman and an instructional designer, intertwining those experiences with literature review and academic research through a feminist lens. She stated that instructional design, currently categorized as a "science," causes unintended gender-based stereotyping, a theme common in STEM-related fields (Campbell, 2015). Campbell's feminist approach holds that instructional design is "process-based, relational, and transformative," similar to feminist approaches in other design-based fields such as architecture (Campbell, 2015, p. 233). Campbell claimed that one of the barriers to female instructional designers is that the field itself has "long been masculinized by language, by discourse, by metaphor," and by "the tools [we] have chosen to use" (Campbell, 2015, p. 233). To illustrate this point, Campbell highlighted the frequency with which instructional design workshops utilize language that focuses more on the act of "doing" than on the act of "thinking," which can be interpreted as masculine (Campbell, 2015). This use of masculine language could be a clue to why, and possibly how, the glass escalator exists within the field of instructional design. While Campbell shed light on language and feminism within instructional design, her work does not focus this lens specifically on higher education but rather on the field as a whole. Bond and Dirkin (2018) used Campbell's experiences and study to be more cognizant of the feminist lens while focusing on instructional design within higher education and its potential relationship to the glass escalator.

Similar to Campbell (2015), Romero-Hall et al. (2018) used a critical autoethnographic approach to examine the female experience within the field of instructional design. Many of the stories within the article featured women who, in their roles as instructional designers, faced discrimination related to the demands of being a mother (Romero-Hall et al., 2018). Another issue identified in Romero-Hall et al.'s (2018) research was that several of the women interviewed experienced isolation and depression due to an identity crisis caused by perceived sexism. These feelings could also affect how instructional designers who identify as women view themselves and their own success in the field, influencing whether the glass escalator is present. While the number of women in instructional design has grown significantly, feelings of being alienated or dismissed appear still to be present in the field, even within traditionally liberal environments such as universities (Romero-Hall et al., 2018).

Educational institutions focus on growth in online course and program offerings, student retention, and effective teaching and learning practices. These factors increase instructional designers' importance, as they are charged with preserving and improving the integrity and quality of instruction (Ross & Morrison, 2012). Recent research consistently situates instructional designers' contributions as critical factors in the success of higher education (Tate, 2019; Ross & Morrison, 2012; Campbell et al., 2009). The consideration Bond and Dirkin (2018) gave to the question of gender in instructional designers' role is unique. Exploring gender with respect to designers' potential areas of specialization, perceptions of design-process ownership, autonomy, and other aspects of instructional design practice is likewise novel.

Methods

In January of 2018, Bond and Dirkin distributed a national survey (see Appendix) soliciting responses from instructional designers regarding perceptions of their roles. This paper examines the same dataset obtained by way of the original survey. The web-based survey, created with and hosted on Qualtrics®, was adapted with permission from surveys conducted previously by Intentional Futures (2016) and Sharif and Cho (2015). Themes were also adapted from the previous work of Miller (2007) and Gibby et al. (2002). The authors examined the data for general trends in role diversification and leadership among instructional designers and for connections between these and gender.

Participants and Procedures

In January of 2018, Bond and Dirkin distributed a survey (see Appendix) soliciting responses from instructional designers. Recipients of the invitation to participate served in various capacities related to teaching and learning/e-learning/instructional design leadership roles; however, all participants were asked whether they worked in an instructional designer capacity before beginning the survey. Participants were recruited through the listservs of professional organizations, including the Michigan Blackboard Users Group (MiBUG), University Professional and Continuing Education Association (UPCEA), Arizona State University Blackboard Users Group, and the Professional and Organizational Development (POD) Network; these listservs cater to instructional designers in public and non-public institutions. Prior to distribution to the target population, the survey was piloted among the instructional design management and staff of a Midwestern university's teaching and learning center and a cohort of doctoral students and faculty. After a series of suggested edits was implemented, the link was distributed to subscribers of the email lists noted above, along with another list comprising a state-by-state reference of teaching and learning/e-learning/instructional design leaders. The survey instrument consisted of four question blocks and used conditional branching to ensure that individuals who met specific criteria were exposed to a particular set of questions.

Data Analysis: Gender and Instructional Design Practice

For data analysis, to effectively address the role of gender relative to various aspects of instructional design practice, Bond and Dirkin (2018) limited gender to binary inputs of male or female only (n = 248). Data gathered via the aforementioned survey (Appendix A) were used to generate a new variable, leadership score, calculated as shown in Table 1.

Table 1

Leadership Score Calculation

Item 10: Do you manage others? (+1 for yes, informally; +2 for yes, formally; +3 for yes, informally and formally; +0 for no)

Item 11: How many employees do you manage? (+1 for 1-2; +2 for 3-4; +3 for 5-6; +4 for more than 6)

Item 14: Functions served in addition to instructional design (+1 for committee work; +1 for personnel management; +1 for project management)

Item 17: Design model ownership/autonomy (+1 for creating the model(s) in use; +1 for authority to change the model)

Maximum score: 12; minimum score: 0
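As a concrete illustration, the Table 1 rubric can be expressed as a small scoring function. This is a hypothetical sketch; the parameter names and response codings below are illustrative assumptions, not the authors' actual Qualtrics variables.

```python
# Hypothetical sketch of the Table 1 leadership-score rubric.
# Parameter names and response codings are illustrative assumptions.

def leadership_score(manages, n_managed, functions, creates_model, can_change_model):
    """Compute a 0-12 leadership score from survey items 10, 11, 14, and 17."""
    score = 0
    # Item 10: do you manage others? (informally, formally, both, or no)
    score += {"no": 0, "informally": 1, "formally": 2, "both": 3}[manages]
    # Item 11: +1 for 1-2 managed, +2 for 3-4, +3 for 5-6, +4 for more than 6
    if n_managed >= 1:
        score += min((n_managed + 1) // 2, 3) if n_managed <= 6 else 4
    # Item 14: one point each for committee work, personnel management,
    # and project management served beyond instructional design
    score += sum(f in functions for f in
                 ("committee work", "personnel management", "project management"))
    # Item 17: design-model ownership and autonomy
    score += int(creates_model) + int(can_change_model)
    return score
```

For example, a respondent who manages both formally and informally, supervises seven people, also does committee and project work, and both created and can change the design model would score 3 + 4 + 2 + 2 = 11.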

Once a leadership score was calculated, t-tests and descriptive statistics were used to determine the significance of the various dimensions of leadership related to respondents' positions. Specifically, descriptive statistics were used to determine the number of males and females involved in committee work, personnel management, and project management. Additionally, descriptive statistics such as cross-tabulations were used to identify, relative to gender, the size of the teams respondents managed and their areas of specialization. Independent-samples t-tests were conducted using leadership scores, team-size supervision, education level, and years of experience to determine whether differences between the two groups were statistically significant.
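The independent-samples t-tests used throughout can be sketched in a few lines. A pooled-variance form is shown, which yields degrees of freedom of n1 + n2 - 2; this is an illustrative implementation, not the authors' analysis code.

```python
from math import sqrt

def pooled_t(sample_a, sample_b):
    """Pooled-variance independent-samples t statistic, df = n_a + n_b - 2."""
    na, nb = len(sample_a), len(sample_b)
    ma = sum(sample_a) / na
    mb = sum(sample_b) / nb
    # Pooled variance combines both sums of squared deviations over shared df.
    ssa = sum((x - ma) ** 2 for x in sample_a)
    ssb = sum((x - mb) ** 2 for x in sample_b)
    sp2 = (ssa + ssb) / (na + nb - 2)
    t = (ma - mb) / sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2
```

The resulting t value is then compared against the critical value for the computed degrees of freedom at the chosen alpha level (here, .05).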

Considering further one's involvement in additional functions as a potential area of significant difference between genders, Bond and Dirkin (2018) created another variable from subject responses to survey item 14 (functions other than instructional design). The new variable, diversification score, represents a total based upon one point for each of the ten additional functions a respondent selected in item 14; a minimum value of 0 and a maximum of 10 were possible. While the initial purpose of the diversification score was to assist in quantifying the typical additional duties of an instructional designer, grouping scores served as a final look into gender's relationship with other aspects of instructional design practice. An independent-samples t-test was used to determine whether there was a significant difference between genders in diversification score.
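The diversification score itself is a simple tally, which might be sketched as follows; the function name and input shape are illustrative assumptions rather than the survey's actual coding.

```python
# Hypothetical sketch of the diversification score: one point for each
# of the ten item-14 functions a respondent selected, so scores run 0-10.
def diversification_score(selected_functions):
    """Count distinct additional functions selected, capped at ten."""
    return min(len(set(selected_functions)), 10)
```

For example, a respondent who also does committee work and project management would score 2.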

Results

The data analysis looked at multiple areas of leadership within the role of an instructional designer, including leadership functions such as committee work, personnel management, and project management. In addition, the researchers examined other leadership responsibilities such as team management and job diversification. T-tests were conducted to determine whether a significant difference existed between the groups and to understand mediating factors such as education and experience level.

Education and Experience

An independent-samples t-test was calculated to compare the mean level of education between males and females. Values were assigned to each level of the ordinal variable, level of completed education. This test found no significant difference (t(245) = -1.889, p > .05); though females, on average, possessed slightly more education than males, education level across the two groups was comparable. The mean education level of females (M = 4.38, sd = .644) was not significantly different from that of males (M = 4.21, sd = .732). Another test compared mean years of experience between males and females (t(245) = 1.998, p < .05) to assess whether time in the field may have played a role. A statistically significant difference was found: males' mean experience level (M = 2.83, sd = 1.271) was higher than females' (M = 2.44, sd = 1.483). It should be noted that these means were calculated based on the selection of an option associated with a range (e.g., less than five years, six to ten years). Consequently, the difference between the means (0.39) equates to approximately 1.95 years of additional experience, on average, among men.
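The conversion from ordinal units to years can be made explicit. Assuming each response option spans roughly five years (an interpretive assumption based on the ranges quoted above), the 0.39-unit gap between the coded means translates to about 1.95 years:

```python
# Experience was reported in roughly five-year ranges (e.g., "less than
# five years", "six to ten years"), so one coded ordinal unit is taken
# here to represent about five years. The five-year width is an
# interpretive assumption, not stated explicitly by the authors.
RANGE_WIDTH_YEARS = 5
mean_male, mean_female = 2.83, 2.44

diff_units = mean_male - mean_female          # 0.39 ordinal units
diff_years = diff_units * RANGE_WIDTH_YEARS   # about 1.95 years
```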

Leadership Functions

Male subjects indicated involvement more frequently than female counterparts in committee work (68 percent vs. 58 percent), personnel management (39 percent vs. 27 percent), and project management (78 percent vs. 68 percent). Additionally, males were nearly twice as likely to indicate involvement in all three leadership functions (Figure 1).

Figure 1

Involvement in Committee Work, Personnel Management,and Project Management

A Pie Chart Comparing Males' and Females' Involvement in Committee Work, Personnel Management, and Project Management

An independent-samples t-test was calculated comparing the mean Leadership Score of males and females to investigate the overall role of gender in leadership. A significant difference was found between the means of the two groups (t(246) = 2.361, p < .05). The mean among males was significantly higher (M = 6.2208, sd = 2.0623) than that of females (M = 5.5731, sd = 1.9701). To unpack this finding, additional testing was done on a key component of the Leadership Score, the size of the respondent's team.

Team Management

Subject responses to survey item 11, the approximate number of employees managed (n = 248), were grouped by gender. A cross-tabulation of this data appears in Table 2.

Table 2

Number of Employees Managed by Gender

Approximately how many other employees do you manage, formally and informally?

                0       1-2     3-4     5-6     More than 6   Total
Male    Count   29      14      17      2       15            77
        %       37.7%   18.2%   22.1%   2.6%    19.5%         100.0%
Female  Count   73      39      34      9       16            171
        %       42.7%   22.8%   19.9%   5.3%    9.4%          100.0%
Total   Count   102     53      51      11      31            248
        %       41.1%   21.4%   20.6%   4.4%    12.5%         100.0%

Figure 2 displays team size by gender. Each of these representations demonstrates that male subjects are considerably more likely to be managing large teams of six or more persons.

Figure 2

How Many Managed Versus Team Size

A Bar Graph Comparing How Many Managed Versus Team Size

As the size of one's team indicates a progression with a true zero point and a meaningful order between levels, team size was treated as an ordinal variable (Cronk, 2017), with numeric values assigned to each level for analysis comparing the genders. An independent-samples t-test found no significant difference (t(246) = 1.731, p > .05). While the mean value for males on item 11 (M = 1.48) indicated the supervision of larger teams among males, the difference from females (M = 1.16) was not statistically significant.

Specialization of Practice

As growth in demand for online learning design and expanding technology toolsets are often connected to the expansion of instructional design (Allen & Seaman, 2016; Kim et al., 2007), gender was also explored relative to specialization. A cross-tabulation was generated based upon indicated area of practice (item 15) and gender (Table 3).

Table 3

Crosstabulation of Areas of Practice


Which of the following best describes your area of specialization in your current instructional design role?

                Online    Classroom  Blended   Classroom,   Other,
                learning  learning   learning  online, and  please
                design    design     design    blended      specify   Total
Male    Count   42        1          6         23           5         77
        %       54.5%     1.3%       7.8%      29.9%        6.5%      100.0%
Female  Count   97        3          5         51           5         161
        %       60.2%     1.9%       3.1%      31.7%        3.1%      100.0%
Total   Count   139       4          11        74           10        238
        %       58.4%     1.7%       4.6%      31.1%        4.2%      100.0%

Further analysis (e.g., chi-square) was not possible, as more than 20 percent of cells had expected counts less than five. Despite this limitation, one can observe in the cross-tabulation relative equity in areas of specialization between genders. The cross-tabulation in Table 3 and Figure 3 demonstrate that a gender difference in area of specialization is prevalent only among those in blended learning design. Additionally, the largest area of specialization is, not surprisingly, online learning design. Six respondents did not respond to the specialization question; four others indicated other/please specify but did not elaborate further or offered an unusable reply, and were therefore excluded from the data reflected in Table 3.
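The applicability rule invoked here (chi-square is inadvisable when more than 20 percent of cells have expected counts below five) can be checked directly against the Table 3 counts; the expected count for each cell is row total times column total divided by the grand total. A short sketch:

```python
# Check the chi-square applicability rule for the Table 3 cross-tabulation:
# expected count per cell = (row total * column total) / grand total, and the
# test is inadvisable if more than 20% of cells have expected counts below 5.

observed = [
    [42, 1, 6, 23, 5],   # Male row, counts per specialization
    [97, 3, 5, 51, 5],   # Female row
]

row_totals = [sum(row) for row in observed]          # [77, 161]
col_totals = [sum(col) for col in zip(*observed)]    # [139, 4, 11, 74, 10]
grand = sum(row_totals)                              # 238

expected = [[r * c / grand for c in col_totals] for r in row_totals]
small = sum(1 for row in expected for e in row if e < 5)
share = small / (len(observed) * len(observed[0]))
# Four of the ten cells have expected counts below five (40% > 20%), which is
# consistent with the paper's decision not to run the chi-square test.
```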

Figure 3

Areas of Specialization

A Bar Graph Comparing Males' and Females' Areas of Specialization

An independent-samples t-test was calculated to compare the mean diversification score between males and females. The test found a significant difference (t(245) = 3.391, p < .005), with the mean among males (M = 5.2632, sd = 2.119) significantly higher than that among females (M = 4.2749, sd = 2.112).

Discussion and Implications

It is relatively clear that instructional design bears similarities to other predominantly female professions (Milner et al., 2018; Shen-Miller & Smiler, 2015; Ridgeway & Kricheli-Katz, 2013; Williams, 1992; 2013). For example, the female majority emerged over time out of an era wherein practice was generally dominated by males (Yellen, 2020). As a female-majority field and, therefore, a gender-segregated field, instructional design is a logical context for gender-specific research (Alegria, 2019). However, as noted earlier, limited research on gender and professional practice exists outside of a minimal collection of articles. Instead, research involving gender tends to focus on the results of the instructional design process as applied to learners of different genders. In other words, while the study of learning differences between the genders, relative to different instructional approaches, is available in the literature, investigation regarding gender's role within the higher education instructional design profession is lacking. This gap in research serves to create a lack of understanding within instructional design practice. For example, knowledge of whether instructional design is subject to phenomena such as the glass ceiling or escalator, and more specifically, what is causing it and how such a situation might be improved upon, is difficult to ascertain (Alegria, 2019; Williams, 2013). Therefore, despite the clarity with which instructional design's gender composition aligns with other fields, related questions are not so quickly answered. If more research were done and knowledge gained by answering these questions, the results might lead to action steps that could be taken to ensure more equal sharing of opportunity among all instructional designers, regardless of their identified gender (Dewan & Gebeloff, 2012).

The data analysis certainly suggests that male instructional designers are positioned differently than female counterparts and, perhaps, that the way males are situated is advantageous relative to function. However, one cannot conclude from this alone that the structural advantages noted by Williams (1992) in some fields, such as elementary school teaching, nursing, and social work, also exist in instructional design. Nevertheless, the results of this analysis do point to elements of the glass escalator phenomenon, as males consistently fared better in some of the areas investigated (Alegria, 2019; Friedman, 2015). Specifically, significant differences between male and female instructional designers were identified by analyzing aggregate data surfaced by the leadership and diversification scores and years of experience. Had the former two variables not been created from the data, these significant and potentially essential differences would not have been identified. Despite what the collected data show, it remains best practice when conducting these types of studies to remember that correlation does not necessarily equate to causation, and the data from Bond and Dirkin (2018) are no exception. More statistical analysis may still be necessary, however, as the sample sizes were unequal, which increases the likelihood of Type I errors.

Considering management functions collectively, based on calculated leadership scores, male designers were significantly more involved in oversight activities ranging from the supervision of others to committee membership, project management, and autonomy over process. As specialization was also investigated, it was found that involvement in practice more broadly encompassing a spectrum of learning design, rather than specific areas such as online or classroom learning design only, did not appear to occur along gender lines. A more comprehensive look at diversification of roles did, however, yield significant results. Based on the diversification score analysis, male subjects engage in instructional design practice that is significantly more diverse than that of their female peers. What may be causing this imbalance between male and female instructional designers is still unclear, and further investigation will be needed.

The conditions under which all of the advantages noted above occur are worth noting here. Comparisons on demographic characteristics, specifically those one might associate with expected differences in practitioners' experience and opportunities (e.g., level of education), actually yielded counter-intuitive results. The level of education between males and females was not found to be significantly different, and female subjects had, on average, somewhat more education than males. Male respondents were found to have more experience in the field than females, an interesting finding, and one that did prove to be statistically significant. Further quantitative and qualitative research would need to be done to better illuminate the possible causes for these differences related to on-the-job experience and education levels.

Males exercising supervision among these survey respondents were outnumbered by females more than two to one. This finding hints further at the presence of the glass escalator, as it is often the case, even in those fields where a female majority is present, that in management the token male experiences certain advantages. Nevertheless, the absence of additional information in this data presents challenges to aligning outcomes with other research. As wage and rank were not included, nor were pre-employment preferences and attitudes regarding advancement assessed, the findings herein neither confirm nor refute those of earlier studies, including Budig (2002), Huffman (2004), and Hultin (2003). In contrast, certain fundamental elements present in this data, and in instructional design at large (a significant female majority and the resulting possible tokenism of males in the field), do indicate some consistency with elements of Williams (1992) and Kanter (2008). In summary, the data presented in Bond and Dirkin (2018) posited an observably advantageous positioning for male instructional designers, despite male subjects not necessarily being better qualified. Overall, male designers indicated exercising more supervisory authority and reported more diverse roles than female counterparts.

However, no strong correlation pointing to causation was revealed during this study, which emphasizes the need for future investigation into this topic as the field continues to expand and educational technology continues to rise in higher education.

Recommendations for Future Research

As alluded to earlier, further research is necessary to confirm or refute whether instructional design may be among those fields subject to a glass escalator effect. These additional studies would be well-advised to collect more data than earlier investigations of instructional design, which, while focusing on role and aspects of practice, did not address gender (Bond & Dirkin, 2018; Intentional Futures, 2016; Sharif & Cho, 2015). For future quantitative approaches, soliciting information regarding income, rank, and perspectives and attitudes on advancement opportunities would be tremendously valuable. Insights gained from these data points would not only enable additional analysis but also position the research to be compared more effectively with prior quantitative glass escalator research, including Huffman (2004) and Smith (2012). Moreover, it may also be advisable to repeat this research with a mixed-methods approach or an exclusively qualitative model, as this could illuminate factors that would facilitate comparison with Williams's (1992) original study. The higher average leadership and diversification scores among males, which essentially translate into their possibly being asked to engage more frequently in management and to be involved in more widely varying functions, could be a function of more experience in the field. More research into this aspect is also needed to make such a determination.

Interviews with male and female instructional designers alike could be the only approach that yields insight into whether males are the recipients of subtle mechanisms advancing their careers, not on merit, but rather potentially on privileges associated with maleness. Here too, one might discover, as Williams (1992) did, that men were cognizant of their advantage. Additional study in this direction could also explore the existence of industry-specific stereotypes discovered in other fields. For example, are there instructional design practice equivalents to the male nurse being asked to assist with lifting a heavy patient or a male elementary school teacher being called upon when a tire needs changing? Conversely, the discovery of counter-evidence could occur. Research may find that something else, besides tokenism, social standing, and structural mechanisms, accounts for male advantage.

Tightening the scope on why males may experience privileges associated with their gender within the field of instructional design may require additional research on the language used both in on-the-job situations and in the recruitment and hiring processes. As mentioned earlier, some academics argue that much of the language associated with this field is masculine in nature, which can have unintentional or intentional effects on the perceptions and performance of female instructional designers (Campbell, 2015). Can the language used to describe expectations, and thus define success within the role, create barriers for women to have equitable opportunities for success in relation to their male-identifying counterparts? This type of study would require a broad literature review on the nature of language and gender, in addition to research on a large sample of both hiring literature and performance evaluations with regard to instructional design at various higher education institutions. A study of this nature could prove valuable in better understanding how language can be intertwined with the glass escalator within instructional design and how it can be utilized to create equitable opportunities for all designers regardless of self-identified gender.

To further deepen the understanding of how gender plays a role in instructional design and whether or not the glass escalator exists therein, it may be necessary to also look at how race may cause different responses among women. Intersectionality appears to play a role in workplace advantages and disadvantages, as shown in various research (King et al., 2017). Such layers of identity could assist in further understanding the effects of tokenism as it relates to those who identify as female, in addition to the male tokenism explored in this study. Is it possible that a white instructional designer who identifies as a woman experiences a sort of tokenism compared to women of color who are in similar roles? Do those women recognize this possible tokenism, and if so, how do they feel about it? To seek out this type of data, a much larger participant pool will be needed to decrease the occurrence of data collection-related errors and ensure equal representation. The data collected in a study of this nature could be used to find gaps in support for those who identify with groups that may be experiencing the effects of a glass escalator and allow for possible solutions and increased advocacy to come forward.

Regardless of the outcome, the increasing importance of instructional design, as indicated by Ashbaugh (2013), Bean (2014), Boyle (2011), and many others, is impetus enough for continued investigation. Even as gender equality and equity remain at the forefront of workplace issues, instructional design's own shroud of obscurity (Sharif & Cho, 2015) may contribute to a lack of investigation in this area. While the focus has understandably been placed on research pointed at online learning growth and technology, learning science, and organizational impacts, to understand instructional design practice fully, it should be investigated in a more holistic manner, consistent with that found in other fields. Whether doing so surfaces the presence or lack of challenges similar to those found in other professional arenas, valuable insight will likely be gained.

References

Alegria, S. (2019). Escalator or step stool? Gendered labor and token processes in tech work. Gender & Society, 33(5), 722-745. https://doi.org/10.1177/0891243219835737

Allen, I. E., & Seaman, J. (2016). Online report card: Tracking online education in the United States. https://onlinelearningsurvey.com/reports/onlinereportcard.pdf

Ashbaugh, M. (2013). Expert instructional designer voices: Leadership competencies critical to global practice and quality online learning designs. Quarterly Review of Distance Education, 14(2), 97-118.

Bean, C. (2014). The accidental instructional designer: Learning design for the digital age. American Society for Training and Development (ASTD).

Bodily, R., Leary, H., & West, R. E. (2019). Research trends in instructional design and technology journals. British Journal of Educational Technology, 50(1), 64-79. https://doi.org/10.1111/bjet.12712

Bond, J., & Dirkin, K. (2018). Instructional design: Study of a widening scope of practice. Online Journal of Distance Learning Administration, 21(4).

Boyle, K. M. (2011). Instructional designers as potential leaders in community colleges. The Community College Enterprise, 17(2), 7-18. https://doi.org/10.1002/cc.119

Budig, M. J. (2002). Male advantage and the gender composition of jobs: Who rides the glass escalator? Social Problems, 49(2), 258-277. https://doi.org/10.1525/sp.2002.49.2.258

Cambridge Business English Dictionary. (n.d.). Instructional design. https://dictionary.cambridge.org/us/dictionary/english/instructional-design

Campbell, K. (2015). The feminist instructional designer: An autoethnography. In B. Hokanson & M. W. Tracey (Eds.), The design of learning experience (pp. 231-249). https://doi.org/10.1007/978-3-319-16504-2


Campbell, K., Schwier, R. A., & Kenny, R. F. (2009). The critical, relational practice of instructional design in higher education: An emerging model of change agency. Educational Technology Research and Development, 57(5), 645-663. https://doi.org/10.1007/s11423-007-9061-6

Cronk, B. C. (2017). How to use SPSS: A step-by-step guide to analysis and interpretation. Routledge.

Dewan, S., & Gebeloff, R. (2012). More men enter fields dominated by women. CNBC. https://www.cnbc.com/id/47500189

Dill, J. S., Price-Glynn, K., & Rakovski, C. (2016). Does the "glass escalator" compensate for the devaluation of care work occupations? The careers of men in low- and middle-skill health care jobs. Gender & Society, 30(2), 334-360. https://doi.org/10.1177/0891243215624656

England, P. (1992). Comparable worth: Theories and evidence. Aldine de Gruyter.

Friedman, S. (2015). Still a 'stalled revolution'? Work/family experiences, hegemonic masculinity, and moving toward gender equality. Sociology Compass, 9(2), 140-155. https://doi.org/10.1111/soc4.12238

Gibby, S., Quiros, O., Demps, E., & Liu, M. (2002). Challenges of being an instructional designer for new media development: A view from the practitioners. Journal of Educational Multimedia and Hypermedia, 11(3), 195-219.

Goudreau, J. (2012, May). A new obstacle for professional women: The glass escalator. Forbes. https://www.forbes.com/sites/jennagoudreau/2012/05/21/a-new-obstacle-for-professional-women-the-glass-escalator/?sh=5441f5d9159d

Gray, C. M., Dagli, C., Demiral-Uzan, M., Ergulec, F., Tan, V., Altuwaijri, A. A., Gyabak, K., Hilligoss, M., Kizilboga, R., Tomita, K., & Boling, E. (2015). Judgment and instructional design: How ID practitioners work in practice. Performance Improvement Quarterly, 28(3), 25-49. https://doi.org/10.1002/piq.21198

Green, H. (2000). The uncertainty of everyday life, 1915-1945. University of Arkansas Press.

Hodson, R., & Sullivan, T. A. (1990). The social organization of work. Wadsworth Publishing.

Huffman, M. L. (2004). Gender inequality across local wage hierarchies. Work and Occupations, 31(3), 323-344. https://doi.org/10.1177/0730888404266384

Hultin, M. (2003). Some take the glass escalator, some hit the glass ceiling? Career consequences of occupational sex segregation. Work and Occupations, 30(1), 30-61. https://doi.org/10.1177/0730888402239326

Hymowitz, C., & Schellhardt, T. D. (1986). The glass ceiling: Why women can't seem to break the invisible barrier that blocks them from the top jobs. The Wall Street Journal, 24(1), 1573-1592.

Intentional Futures. (2016). Instructional design in higher education: A report on the role, workflow, and experience of instructional designers. https://intentionalfutures.com/static/instructional-design-in-higher-education-report-5129d9d1e6c988c254567f91f3ab0d2c.pdf

Kanter, R. M. (2008). Men and women of the corporation: New edition. Basic Books.

Kim, M. C., Hannafin, M. J., & Bryan, L. A. (2007). Technology-enhanced inquiry tools in science education: An emerging pedagogical framework for classroom practice. Science Education, 91(6), 1010-1030. https://doi.org/10.1002/sce.20219

King, J., Maniam, B., & Leavell, H. (2017). Shattered glass and creaky steps: Remodeling the glass ceiling and glass escalator theories using an intersectional toolbox. Southern Journal of Business and Ethics, 9, 194-203.

Landivar, L. C. (2013). Men in nursing occupations: American Community Survey highlight report. US Census Bureau. https://www.census.gov/library/working-papers/2013/acs/2013_Landivar_02.html

Miller, J. L. (2007). The new education professionals: The emerging specialties of instructional designer and learning manager. International Journal of Public Administration, 30(5), 483-498. https://doi.org/10.1080/01900690701205970

Milner, A., King, T., LaMontagne, A. D., Bentley, R., & Kavanagh, A. (2018). Men's work, women's work, and mental health: A longitudinal investigation of the relationship between the gender composition of occupations and mental health. Social Science & Medicine, 204, 16-22. https://doi.org/10.1016/j.socscimed.2018.03.020

Reiser, R. A. (2001). A history of instructional design and technology: Part II: A history of instructional design. Educational Technology Research and Development, 49(2), 57-67. https://doi.org/10.1007/BF02504928

Richey, R. C. (2016). Mentoring and the role of women in instructional design and technology. In J. Donaldson (Ed.), Women's voices in the field of educational technology (pp. 193-197). Springer, Cham. https://doi.org/10.1007/978-3-319-33452-3_27

Ridgeway, C. L., & Kricheli-Katz, T. (2013). Intersecting cultural beliefs in social relations: Gender, race, and class binds and freedoms. Gender & Society, 27(3), 294-318. https://doi.org/10.1177/0891243213479445

Romero-Hall, E., Aldemir, T., Colorado-Resa, J., Dickson-Deane, C., Watson, G. S., & Sadaf, A. (2018). Undisclosed stories of instructional design female scholars in academia. Women's Studies International Forum, 71, 19-28. https://doi.org/10.1016/j.wsif.2018.09.004

Ross, S. M., & Morrison, G. R. (2012). Constructing a deconstructed campus: Instructional design as vital bricks and mortar. Journal of Computing in Higher Education, 24(2), 119-131. https://doi.org/10.1007/s12528-012-9056-0

Sharif, A., & Cho, S. (2015). 21st-century instructional designers: Bridging the perceptual gaps between identity, practice, impact and professional development. International Journal of Educational Technology in Higher Education, 12(3), 72-85. http://dx.doi.org/10.7238/rusc.v12i3.2176

Shen-Miller, D., & Smiler, A. P. (2015). Men in female-dominated vocations: A rationale for academic study and introduction to the special issue. Sex Roles, 72, 269-276. https://doi.org/10.1007/s11199-015-0471-3

Smith, K. M., & Boling, E. (2009). What do we make of design? Design as a concept in educational technology. Educational Technology, 49(4). https://www.jstor.org/stable/44429817

Smith, P. L., & Ragan, T. J. (2005). Instructional design (3rd ed.). John Wiley & Sons.

Smith, R. A. (2012). Money, benefits, and power: A test of the glass ceiling and glass escalator hypotheses. The Annals of the American Academy of Political and Social Science, 639(1), 149-172. https://doi.org/10.1177/0002716211422038

Snyder, K. A., & Green, I. A. (2008). Revisiting the glass escalator: The case of gender segregation in a female-dominated occupation. Social Problems, 55(2), 271-299. https://doi.org/10.1525/sp.2008.55.2.271

Tate, E. (2019). Instructing instructional designers. Inside Higher Ed. https://www.insidehighered.com/digital-learning/article/2017/04/12/training-evolving-instructional-designers

U.S. Bureau of Labor Statistics. (2019). Instructional coordinators. In Occupational outlook handbook. https://www.bls.gov/ooh/education-training-and-library/instructional-coordinators.htm

West, R. E., Thomas, R. A., Bodily, R., Wright, C., & Borup, J. (2017). An analysis of instructional design and technology departments. Educational Technology Research and Development, 65(4), 869-888.

Williams, C. L. (1992). The glass escalator: Hidden advantages for men in the 'female' professions. Social Problems, 39(3), 253-267. https://doi.org/10.2307/3096961

Williams, C. L. (2013). The glass escalator, revisited: Gender inequality in neoliberal times, SWS feminist lecturer. Gender & Society, 27(5), 609-629. https://doi.org/10.1177/0891243213490232

Wingfield, A. H. (2009). Racializing the glass escalator: Reconsidering men's experiences with women's work. Gender & Society, 23(1), 5-26. https://doi.org/10.1177/0891243208323054

Yellen, J. L. (2020). The history of women's work and wages and how it has created success for us all. Brookings. https://www.brookings.edu/essay/the-history-of-womens-work-and-wages-and-how-it-has-created-success-for-us-all/

Appendix: Instructional Designer Survey

Q1 By continuing, you grant consent for your responses to be included in reporting and data analysis. Any identifiable information provided will be removed prior to compiling results. Do you wish to continue?

o Yes (1)
o No (2)

Skip To: Q2 If By continuing, you grant consent for your responses to be included in reporting and data analysis... = Yes

Skip To: End of Survey If By continuing, you grant consent for your responses to be included in reporting and data analysis... = No


Q2 Are you currently working in an instructional design role (including management of instructional design staff)?

o Yes (1)
o No (2)

Skip To: End of Survey If Are you currently working in an instructional design role (including management of instructional... = No

Skip To: Q3 If Are you currently working in an instructional design role (including management of instructional... = Yes

Q3 Please indicate your current level of employment

o Full-time (40 hours/week, 10 months or more per year) (1)
o Three-quarter time (30 hours/week, 10 months or more per year) (2)
o Half-time (20 hours/week, 10 months or more per year) (3)
o Less than half-time ( (4)
o Other, please specify: (5) ________________________________________________

Q4 Please indicate your gender:

o Male (1)
o Female (2)
o Non-binary/third gender (3)
o Prefer not to say (4)
o Prefer to self-describe: (5) ________________________________________________

Q5 Do you have formal instructional design education (e.g., a degree in instructional design or a closely related field)?

o Yes (1)
o No (2)
o Other, please specify: (3) ________________________________________________

Skip To: Q7 If Do you have formal instructional design education (e.g., a degree in instructional design or a clo... = Yes

Skip To: Q6 If Do you have formal instructional design education (e.g., a degree in instructional design or a clo... = No

Q6 My formal education prepared me for work in the field of instructional design in:

o All aspects
o Most aspects
o Some aspects
o Only a few aspects

o Other, please specify:________________________________________________

Q7 Approximately how long ago did you complete your formal education in instructional design?

o <5 years (1)
o 5-10 years (2)
o 11-15 years (3)
o 16-20 years (4)
o 21-25 years (5)
o >25 years (6)

Q8 Please indicate your highest level of completed education:

o High School
o Associate's Degree
o Bachelor's Degree
o Master's Degree
o Doctoral Degree
o Other, please specify: ________________________________________________

Q9 Please select the option which best indicates your years of experience in instructional design:

o <5 years (1)
o 5-10 years (2)
o 11-15 years (3)
o 16-20 years (4)
o 21-25 years (5)
o >25 years (6)

Q10 Do you manage other employees?

o Yes, formally. (1)
o Yes, informally (the other employee(s) do not report to me, but I assign work to them) (2)
o No (3)

Skip To: Q11 If Do you manage other employees? = Yes,formally.

Skip To: Q11 If Do you manage other employees? = Yes, informally (the other employee(s) do not report to me, but I assign work to them)

Skip To: Q13 If Do you manage other employees? = No

Q11 Approximately how many other employees do you manage?

o 1-2 (1)
o 3-4 (2)
o 5-6 (3)
o more than 6 (4)


Q12 Which of the following best describes the function(s) of the employees you manage (select all that apply)?

▢ Instructional Design

▢ Audio/Video/Graphic Production

▢ Coding/Programming

▢ Technical Support

▢ Administrative/Clerical

▢ Project Management

▢ Other, please specify:

Q13 About how much of your time at work is invested in instructional design activities, not including management of other instructional designers?

o (1)
o 21 percent-40 percent (2)
o 41 percent-60 percent (3)
o 61 percent-80 percent (4)

o >80 percent (5)

Q14 In addition to instructional design work, which of the following functions do you also perform (select all that apply)?

▢ Audio/Video authoring/editing or Graphic design (1)

▢ Coding/Programming (including HTML) (2)

▢ Committee work (e.g., assessment/accreditation councils, oversight groups, etc.) (3)

▢ Faculty development (e.g., designing and/or conducting workshops/training) (4)

▢ Instructor (e.g., teaching one or more courses on a regular basis) (5)

▢ Personnel management (e.g., hiring, performance review, etc.) (6)

▢ Scholarly activity (e.g., research, publishing) (7)

▢ Server administration (e.g., LMS, database, web server) (8)

▢ Technical Support (9)

▢ Other, please explain: (10)________________________________________________

Q15 Which of the following best describes your area of specialization in your current instructional design role?

o online learning design (1)
o classroom learning design (2)
o blended learning design (3)
o general learning design, including classroom, online, and blended (4)
o other, please specify: (5) ____________________________________

Q16 Which of the following best describes the design model in use in your current setting?

o The same design model is applied to each project (i.e., a template is used) (1)
o The design model varies slightly, project by project, based on needs (2)
o The design model varies greatly, project by project, based on needs (3)
o No formal design model is used (4)
o Other, please specify: (5) _______________________________________________

Q17 Which of the following describes ownership of the design model in your current setting (select all that apply)?

▢ I created the model/models my team and I use (1)

▢ I was given the model/models I/my team use(s) (2)

▢ I do not have authority to change the design model(s) (3)

▢ I have authority to make changes to the design model(s) (4)

▢ I and others have authority to make changes to the design model(s) (5)

▢ Other, please specify: (6) ________________________________________________

Q18 Please indicate which theoretical framework(s) or model(s) from the literature underpin your instructional design practice:

______________________________________________________________

Q19 Would you be interested in being interviewed to further discuss your answers to this survey?

o Yes, my email address is: (1) ________________________________________________
o No (2)

The Journal of Applied Instructional Design, 10(2) 68

"I Can Do Things Because I Feel Valuable": Authentic Project Experiences and How They Matter to Instructional Design Students

Jason K. McDonald & Amy A. Rogers

This paper examines how authentic project experiences matter to instructional design students. We explored this through a single case study of an instructional design student (referred to as Abby) who participated as a member of an educational simulation design team at a university in the western United States. Our data consisted of interviews with Abby that we analyzed to understand how she depicted her participation in this authentic project. In general, Abby found her project involvement to open up both possibilities and constraints. Early in her involvement, when she encountered limitations she did not expect, those constraints showed up as most significant and she saw the project as a place of disenfranchisement that highlighted her inadequacies. Later, in conjunction with changes in the project structure and help from a supportive mentor, she reoriented to the possibilities her participation made available, all of which disrupted the cycle of disenfranchisement in which she seemed to be caught. Abby saw more clearly opportunities that had previously been obscured, and she became one of the project’s valued leaders. We conclude by discussing implications of these findings for understanding how authentic project experiences can fit into instructional design education.

Introduction

Our purpose in this paper is to explore authentic project experiences in instructional design education. As Lowell and Moore (2020) summarized, such experiences are meant to help students “hit the ground running” (p. 581), preparing them for the rigors of professional practice upon completion of their academic training. Prior research has pointed towards a number of benefits they can have to accomplish this purpose. Studies indicate authentic projects help bridge the gap between classroom and workplace as they provide natural interactions between students and professional colleagues (Kramer-Simpson et al., 2015), expose students to the constraints and challenges of work settings (Herrington et al., 2003), and present opportunities to practice design in potentially demanding circumstances (Miller & Grooms, 2018).

Our interest in authentic project experiences centers on how they matter to instructional design students as part of their education. But whereas prior studies—both within instructional design and in other fields—have researched student perspectives to develop insights into what they think about authentic projects (Dabbagh & Williams Blijd, 2010; Hynie et al., 2011; Miller & Grooms, 2018; Vo et al., 2018), our concern was somewhat different. We studied the issue from a practice-oriented point-of-view (Nicolini, 2012), attending to different modes of engagement that are opened up to students through authentic project participation, including how students fit into project environments and what can be learned about how projects matter by depicting this fit qualitatively. To explore this in richness and depth, we carried out a single case study of a student involved in an authentic project at the culmination of her Master’s program in instructional design. Our inquiry focused on three questions: How did the student’s authentic project participation matter to her? How did her project involvement fit into her education? And what can be learned about student involvement in authentic instructional design projects by studying this fit?

Literature Review

The expectations that clients, team members, and other stakeholders have about what instructional designers do can lead to challenges for novices in the field. Instructional design is a complex profession, requiring designers to cope with uncertainty (Ertmer et al., 2008), make frequent judgments (Gray et al., 2015), and adapt formal models or theories into practical action, with little time for reflection (Ertmer et al., 2009; Yanchar et al., 2010). All of these can be difficult for new practitioners to manage, leading to work-related stress (Fortney & Yamagata-Lynch, 2013), and requiring employers to invest in on-the-job assistance (Stefaniak, 2017). The role of an instructional designer can also be very ambiguous, leading to additional stress if designers’ expectations of their role are misaligned with those with whom they work (Drysdale, 2019; Radhakrishnan, 2018). In addition, instructional designers are often expected to be proficient in a wide range of skills that go beyond the actual design of instruction, including project management, building professional relationships, responding to shifting priorities, and promoting or defending their role to colleagues (Schwier & Wilson, 2010).

These needs have led to calls for more authentic experiences to be integrated into instructional design education, as a means for preparing students for the rigors of professional practice (Bannan-Ritland, 2001; Larson & Lockee, 2009; Lowell & Moore, 2020). Long a part of learning in many fields, authentic project experiences can vary in scope, ranging from class assignments based on true-to-life scenarios (Herrington et al., 2003), to working on client projects as part of coursework (Lowell & Moore, 2020), to internships where students work for an extended period of time and with at least some degree of autonomy (Johari & Bradshaw, 2008). They can be primarily teacher-directed, student-directed, or exhibit a mix of oversight methods (Aadland & Aaboen, 2020).

Regardless of scale or the name by which they go, however, authentic project experiences share at least some commitment to a learn-by-doing philosophy, as described in theories of experiential learning (Kolb, 1984). Their benefit is often framed in the opportunities they give students to practice design in real circumstances (Miller & Grooms, 2018), or at least circumstances that closely model real situations (Herrington et al., 2003). They allow students to collaborate with clients and disciplinary specialists (Kramer-Simpson et al., 2015; Lei & Yin, 2019), often exposing them to constraints they might face in on-the-job settings (Herrington et al., 2003). Projects can help students develop specific skills they will need upon entering the workforce, such as leadership and communication (Hynie et al., 2011). In many ways, the value of authentic experiences is the balance they provide between offering students a “dose of reality” about professional practice (Hartt & Rossett, 2000, p. 41), while at the same time being a reasonably safe environment where they can reflect on, and learn from, failures they might experience (Kramer-Simpson et al., 2015).

Research indicates there can be challenges with authentic project experiences, however. Especially in their more unstructured forms, they likely require effective mentorship on the part of instructors or other experts to help students translate the experience into productive growth (Heinrich & Green, 2020; Johari & Bradshaw, 2008). Also, if the project is significantly beyond students’ skills, they might not provide a sufficient return on investment to the person or organization providing the experience (Hartt & Rossett, 2000). The value of authentic projects can also be limited if students are not willing to fully immerse themselves in the learning task, especially those that might be structured around more simulated scenarios (Herrington et al., 2003). And students might have expectations about the experience that are unmet—such as the nature of the work they will be doing, their role on the team, or how effective the experience will be—leading to frustration or disillusionment (Dabbagh & Williams Blijd, 2010).

To address these possible shortcomings, scholars have studied authentic project experiences in instructional design education from a variety of perspectives. Some research has been more conceptual, such as Bannan-Ritland’s (2001) review of what she called the principles of “action learning” (p. 37), which she illustrated by describing examples of how authentic project experiences can align with those principles. This type of research also includes Miller and Grooms’s (2018) articulation of a framework for integrating authentic projects into instructional design curricula. Other researchers have focused on the varying perceptions of those participating in authentic projects. Dabbagh and Williams Blijd (2010) found that students generally viewed authentic projects as a positive contribution towards their education, in spite of moments of “anxiety and confusion” that often accompanied their immersion in the project environment (p. 6). From another angle, Hartt and Rossett (2000) focused on the perspective of those providing authentic project experiences. They studied to what extent students’ work provided a return on their organizational investment, and found that in many cases students provided meaningful value and the overall experience was positive for the organization. Finally, other researchers have focused on guidelines for designing particular types of authentic projects, such as Stefaniak’s (2015) focus on service-learning experiences, Johari and Bradshaw’s (2008) study of project-based learning in internship programs, and Lowell and Moore’s (2020) exploration of authentic projects in online environments.

Our study aims to contribute towards this body of literature, focusing on authentic project experiences as a rich phenomenon that can reveal unique insights when examined from the perspective of the “concernful involvement” of students participating in projects (Yanchar, 2015, p. 110). We did not solely focus on what authentic projects accomplish from an external point-of-view, such as the educational outcomes instructors might want them to provide. Nor did we focus only on the subjective perspectives that students might have about authentic projects. Instead, we studied how students were involved in, and engaged with, project work from a practice-oriented perspective (Nicolini, 2012), to more fully understand how authentic projects matter to students as seen through their responses to project experiences. This can generate knowledge about the nature of student involvement in authentic projects as well as how authentic projects fit into instructional design education more generally (Yanchar & Slife, 2017).

Method

To address our research questions we chose a case study methodology. Our case is that of an instructional design student involved with a team-based project, designing simulations to teach cybersecurity at both the high school and college level. Throughout our report we will refer to her as Abby. We chose a case study because it would allow us to explore Abby’s practical involvement with this authentic project in detail, providing insight into her participation by taking the world seriously as she experienced it (Packer, 2018). Our purpose was not to test a hypothesis about authentic projects, nor to generate universal laws or principles about how they fit into instructional design education. We also did not attempt to evaluate the effectiveness of the team with which Abby participated. Rather, we aimed to understand authentic projects in a new, and perhaps unfamiliar, way as we became attuned to the details of Abby’s experience over the course of about a year. We were also interested in the discriminations she made in response to project-related events, including her affective responses to both positive and negative situations. This type of research allows readers to become “affectively reoriented to the world,” meaning “that we think differently about the world, . . . that we feel it differently, [and] see it differently” (Wrathall, 2011, p. 170). Throughout our research we assumed a view of people and their practical involvement as found in the writings of Dreyfus (1991), Packer (2018), and Yanchar and Slife (2017), based in the philosophy of thinkers such as Heidegger (1962) and Merleau-Ponty (1964). In this perspective, “humans are fully embodied, engaged agents . . . situated in a lived world of significance,” which allows for theorizing into human activity that does not “invoke a more fundamental reality of causal forces assumed to control . . . human participation” (Yanchar & Slife, 2017, pp. 147–148).

The context of Abby’s involvement with this instructional design team was grounded in her pursuit of a Master’s degree in instructional design from an R2 university in the western United States. This university enrolled about 34,000 students (31,000 undergraduates and 3,000 graduate students), and employed over 1,000 full-time, tenure-track faculty. The team included members from all of these groups – professors (including this paper’s first author), undergraduates, and graduate students – from the fields of instructional design, information technology, and creative writing. The professors were supported by grants they had received to study simulations in cybersecurity education, including a large NSF grant. All of the students were part-time employees. Abby, who had been a member of the team for about 12 months, was involved for at least three additional reasons: the project fulfilled an internship requirement for her Master’s degree in instructional design; she was using the project as the site of her thesis research; and the project gave her opportunities to complete various assignments for classes in which she was enrolled. According to Aadland and Aaboen’s (2020) taxonomy, Abby’s involvement would be characterized as student-directed. She was primarily responsible for ensuring her participation met her educational goals, and her work was not specifically designed to serve her needs. While Abby did receive oversight from professors associated with the project, they provided it in their capacity as project supervisors and not as her teachers.

Our data were drawn from our multi-year, in-depth study of the team with which Abby was involved. Our full corpus of data consisted of interviews with team members, transcripts of team meetings, field notes generated by researchers, and artifacts the team produced during the course of their work. From these data we segmented out observations and interviews in which Abby participated over the course of approximately one year, along with related field notes produced by the researchers during the same period. The researchers observed Abby in team meetings held every 1–2 weeks, and the first author conducted discussions with her every 2–3 weeks. Some conversations lasted a few minutes while others were an hour or more. The specific quotes we use in our report to illustrate Abby’s involvement with the project were drawn from two formal interviews the first author conducted with her towards the end of the study, each lasting approximately 45 minutes. These interviews were audio recorded, then transcribed for analysis.

Our analysis method was drawn from Packer (2018). Packer’s approach relies on careful analysis of the words and other linguistic conventions research participants use to relate their experiences. The goal is not to summarize people’s experiences into a set of codes or otherwise abstract expressions that can be generalized across situations. In contrast, his method is meant to generate an empirically based interpretation of the local, practical work in which people engage to account for themselves and their situation. The results of such an analysis are typically ethnographic in character, although they are not full ethnographies since they are centered around participants’ self-reports rather than including observations or artifact analysis. These are reports that Packer called “a way of seeing the world that follows from [interview participants’] way of being in the world” (p. 472). Further, it is often the case that the usefulness of these studies is at least partially found in their uniqueness. Rather than being valuable because they are universal, such research is meant to provide a distinctive vantage point from which to view a phenomenon—a view that can reveal fresh insights about common things.


To achieve this outcome we conducted a hermeneutic analysis based on close readings of our data. This analysis centered around the effects Abby’s interviews had on our understanding of her project experience (Packer, 2018). We started by articulating our initial understanding of each transcript (done individually by each author and then in discussion together). We then engaged in the following steps recommended by Packer, focusing not on any inherent meaning in the words of the transcript but attempting to articulate the effects they had on our understanding. In each transcript we identified: (a) the context of the interview – its background, purpose, and facts it contained about Abby or her participation in the project; (b) gaps in Abby’s report, where she seemed to be making assumptions or taking for granted certain conclusions; (c) the tropes and structures through which Abby communicated details of her situation as well as her affective responses to her circumstances; (d) the chronology of Abby’s experience—especially breakdowns in her experience—and how she talked about herself as an agent in these events; and (e) any explicit knowledge Abby identified as important to understand her story. At each stage we recorded evidence that supported our interpretation of Abby’s claims, any disconfirming evidence or examples, the effects our readings were having on our understanding, and additional questions raised by that phase of analysis. Through hermeneutic comparison of each of these parts with the whole transcripts, as well as the whole with the individual parts (Fleming et al., 2003), we crafted an account that provided “a new way of seeing” (Packer, 2018, p. 149) the research issues of our study, while remaining true to the details of Abby’s experience.

While this method allowed for a detailed examination of Abby’s mode of engaging with the project—including her own complicity in creating that mode of engagement (Packer, 2018)—we acknowledge that it does come with some limitations. Abby’s reports undoubtedly reflected her own biases, and the project itself also afforded certain ways of participating better than others. So we recognize that other instructional design students may see and experience their authentic project experiences differently than did Abby, as well as respond to events in a different manner than she did. So our findings do not generalize to every situation educators might encounter. Nevertheless, there is still value in understanding the experiences of one student to the depth we provide here. Even single cases can uncover new possibilities or reveal uncommon or unfamiliar aspects of the world – possibilities and aspects that might remain hidden when using research methods that summarize the detail of large numbers of students (Stake, 1995). They can also suggest certain things that must be taken into account if one were to develop broader, more generalizable theories or frameworks, recognizing that if events happen even in one case they are legitimately part of the world, regardless of their frequency (Flyvbjerg, 2001). It is these types of findings that we aimed to generate through our study.

Findings

As Abby described her involvement with the simulation project, she depicted it as a place of both possibility and constraint. As she initially explored the project space she encountered considerable freedom, and she believed these opportunities would allow her to meaningfully contribute towards ensuring the simulations would achieve their intended outcomes. But then Abby encountered limitations to her participation that she did not expect. The significance of these constraints started to eclipse the opportunities she had seen, and the project started to show up to her as a place of disenfranchisement that highlighted her inadequacies. Later, in conjunction with changes in the project structure and help from a supportive mentor, Abby reoriented to the possibilities available and disrupted the cycle of disenfranchisement in which she seemed to be caught. She saw more clearly opportunities that had previously been obscured, and she became one of the project’s valued leaders. These stages are summarized in Table 1, and are further developed in the sections that follow.

Table 1

Summary of Abby’s Involvement in an Authentic Project Experience

Abby’s involvement: Abby encountered initial freedom, with few firm expectations and many opportunities to pursue what she thought was important.
How Abby’s involvement was significant: Abby believed she developed a unique point-of-view on the project that would help her make a meaningful contribution.

Abby’s involvement: Abby encountered limitations; she did not have the skills to implement her ideas for improving the simulations, and teammates often told her that her suggestions were not the team’s priorities.
How Abby’s involvement was significant: Abby felt like she had been boxed in and disenfranchised. She felt inadequate and started to pull away from full participation.

Abby’s involvement: Abby received help from a supportive mentor, and was given new opportunities to lead out in aspects of the project’s development.
How Abby’s involvement was significant: Abby reoriented towards the possibilities the project offered her; as she reengaged she became one of the project’s valued leaders, seeing even more ways she could be meaningfully involved.


Abby’s Initial Involvement – Few Firm Expectations and a Unique Point-Of-View

Abby’s initial engagement with the team looked as if it would serve mutually beneficial purposes. From Abby’s point of view, joining the project gave her an opportunity to pursue a research interest that would ultimately become her thesis – how to better attract high school girls to STEM careers. On the team’s part, they wanted Abby to oversee what she called the simulations’ “education-oriented” components. Her first assignment was to develop learning outcomes for each simulation. Abby was also tasked to develop teacher support materials to accompany the simulations; while students were meant to complete each one on their own (as a unit within a larger class on cybersecurity-related topics), the team wanted to provide teachers with enough support to feel confident they could answer any student questions that might arise. And finally, because Abby had some training in instructional video production, the team anticipated that she would oversee the production staff who would develop each simulation’s video elements (however, this was not scheduled to begin until a few months after Abby was hired, and so it did not influence her initial participation).

As Abby’s involvement with the project deepened, she became aware that the nature of her work differed from that of other students. While others were required to provide tangible evidence of their progress on a regular basis, Abby’s responsibilities did not come with the same amount of oversight. She generally followed her own schedule, and was rarely asked to report the status of her work in the same way as others. If something was not completed on time (such as the learning outcomes for a simulation phase), the rest of the team was told to move ahead, adjusting their work when Abby was finished. Relatedly, Abby also noticed that her deliverables differed from those of other students. Their work products were almost exclusively concrete – written narrative elements, files for UX elements, or code to run the simulation. Abby, in contrast, while producing a few tangible artifacts (e.g., worksheets for teachers), found most of her work to be conceptual, such as writing learning outcomes that might influence the form the narrative or user interface took, but that did not show up in the simulations directly.

Together, these conditions created an environment where Abby initially felt free to pursue whatever work she thought best. She said that she felt “less tethered to one particular expertise,” and although she was assigned certain tasks she did not feel bound to any certain process for completing them, nor did she limit her involvement to only those areas to which she was formally assigned. For example, she took it upon herself to complete one of the simulations on her own, from start to finish without the answer key – something no one else on the team had done. She told us this was because “I’m more responsible for what the student experience is like,” and “I feel like it’s my job to make sure that the students have the scaffolding that they need, that they’re accomplishing the tasks, [and] that the tasks are meaningful,” even though no one told her so explicitly. Additionally, Abby assumed responsibility for evaluating the simulations’ usability. She told us that watching students actually using them helped her generate insights for improving the team’s work. From her observations, Abby “could tell… if they thought [a simulation] was strange, or it rubbed them the wrong way.” She also observed what she called students’ “emotional reactions” to their experience, “if [this student] liked it or [another] didn’t,” that further informed her view of the project. Helping professors with their research into the simulations helped her develop additional ideas for improving them, as well.

Abby told us she initially believed that because these assigned and assumed responsibilities were unique compared to what her teammates were doing, she developed a “different perspective” regarding how to design the simulations so they would achieve their intended outcomes. She saw a “vision” of the project that was not “necessarily easy for everyone to see.” She told us that, “because I’ve been involved in the research . . . and, like, going through it in classes, and trying to really understand the students’ experience, I think I’m more connected with that aspect.” She identified this as a distinct opportunity she had to contribute to the project team, “conveying that vision,” as she called it, and sharing her unique outlook with others – one that they were not in a position to see on their own.

Abby Encounters Limitations to Her Involvement

As Abby became more involved with the team, however, she told us that her working environment began to show up as more and more limiting, and that the project started to feel like a place of constraint. She slipped into a pattern of yielding to others to shape the simulations’ direction, and eventually saw fewer opportunities to act on her own. As we undertake to describe this, we recognize the potential irony – one might think the environment Abby initially described, where she was largely able to decide when and how she would engage, and where she was bringing unique insights back to the team, would be a space of accomplishment. But in actuality she began to depict her participation as characterized by constraints and limits. As we will show later, Abby was eventually able to reorient and reengage with the project in a more freeing manner, but at least for a time nearly the opposite occurred, and she talked about herself as if she had been boxed in by obstacles that had been placed around her.


Yet this was not merely her private interpretation of the situation that she was able to overcome only by adopting a better attitude towards what seemed to be constraining forces. Rather, the project itself had real features that afforded themselves towards courses of action that were more limiting than freeing. As Abby pursued these she did so as if she were taking a path of least resistance – a path that, although it was the easiest, was nevertheless one that she moved into (although she avoided admitting that to herself at the time). Correspondingly, when we later describe the positive changes in Abby’s participation, we will show that while it was true that it did include a change in how she approached her circumstances, it also reflected a change in the project structure so that it afforded itself towards more liberating possibilities on Abby’s part. So we are careful not to portray Abby as either choosing on her own to see the project as a confined space, or as being forced into a constrained role by deterministic, environmental forces outside of her control. Abby’s interviews invited us to see how the way she fit within the project’s structure made it easy for limitations to show up as relevant, while at the same time recognizing that the concrete ways those limitations mattered to her, and how she chose to cope with them, were equally important in defining her experience.

Being Boxed In

Abby told us about two interrelated factors that together showed the project as a space where she was boxed in, with limited options to meaningfully participate. First, as noted earlier, there was a contrast between the nature of Abby’s work and that of her teammates. Abby told us that others offered what she called “tangible” contributions towards the simulation’s final form – the form students would actually experience. This included the simulations’ code, the graphic design that gave them visual representation, and the creative writing that brought each simulation’s story to life. Abby, on the other hand, defined her contribution as “helping people do what they need to do.” She seemed to draw a distinction between the work others did—creating the concrete and visible building blocks that one could point to in the final simulation—and the work she did, which was conceptual, in the background, and useful to the extent that it helped the rest of the team do their jobs better.

While in the abstract Abby talked about such contributions as having “value,” actual examples she shared reflected a more conflicted tone, because most of her ideas required someone else to actually give them a perceptible form. For instance, she told us that she accepted responsibility for whether students were successful in learning from the simulations: “if people are experiencing [poor learning outcomes], then I would maybe feel, like, maybe that’s on me.” But she also said she had not created anything that students would encounter directly to help them achieve those outcomes, nor did she have the ability to do so. “People aren’t going to be, like, ‘oh, Abby built this or did this.’ . . . I’m not doing anything right now that’s going to be a tangible thing.” The nature of Abby’s involvement meant that without help from her colleagues, what she designed would not be used by students. And it seemed this began to overshadow the importance of any concrete materials she was producing, such as her teacher support materials. After initially describing that she was working on them, and while we know from our observations that she completed the assignment, she did not bring them up again and did not mention deriving any satisfaction or sense of significance from their completion.

Alone this may not have meant much to Abby, other than occasional hints she offered about how she would have enjoyed the recognition that accompanied the simulation’s concrete development. But Abby also found that her teammates could be reluctant to accept or implement any suggestions she provided. Through her research, usability testing, and personal experience completing them, she generated a number of ideas for how the simulations could be improved. And at least for a time she would bring her ideas back to the rest of the team. But often their response was that her suggestions were either too difficult or were not their current priority:

I’m, like, “hey, I really think we should change this.” And I feel, like, sometimes people are, like, “that’s kind of hard and we don’t necessarily want to do it.” So then that value doesn’t necessarily come to fruition.

Abby offered multiple examples. A particularly illustrative one concerned the team’s focus on building women’s self-efficacy to pursue a cybersecurity career:

I really feel like putting students’ names in [the simulation] would be really helpful. We’ve used Junior because that’s just an easy way to program it. And that rubbed me the wrong way when I got on, especially thinking if we’re trying to target girls. Like, so, here’s me putting my researcher hat on. I know we want to help girls feel more, like, identify with this better. And I’m thinking, no girl has ever been called Junior as a nickname. . . I tried it out with my sister, and my sister’s, like, “Junior, what, is that me?” So, I can hear this from the students. I’m thinking from my research mind, “this is not good.” I talked to [the lead professor], he’s, like, “oh, yeah, students identify better if their name is there.” Then when I take that to the team, they’re, like, “oh, that’s going to be a lot of work.” So, how much do I push it?

The result of dismissals such as these was a growing sense on Abby’s part that what she wanted to contribute was not as needed as what her team members offered. Not only did it appear that they valued different outcomes than she did, but she also concluded that she did not have the ability to influence the direction the simulations would take, “I’ve kind of let the developers do their thing . . . I didn’t see myself to be in a position to tell them anything.” She often described the simulations’ development as occurring around her, where she was aware of what was happening, but it was not something she was directly helping with. Over time, she saw fewer opportunities to engage in ways that would change the project’s trajectory, including changes aligned with what she learned through her research into the simulations’ educational effectiveness.

Growing Disenfranchisement

Given that Abby needed cooperation from her teammates to implement her designs, their dismissals hurt her deeply, “why be on a team if you’re not doing anything? So, it kind of made me—if I’m not really doing much, then I just kind of feel pointless. Well, maybe I shouldn’t be here.” We use the term hurt intentionally. Similar to how a physical injury can become inflamed and sensitive, and the afflicted area becomes too tender to tolerate an otherwise benign touch, or bear what would otherwise be one’s ordinary weight, Abby’s growing sensitivity to her limitations led her to pull away from other team members to avoid difficult interactions. She became particularly attuned to, and even defensive about, potential offenses on the part of her teammates (whether intended or not).

One example occurred when new writers were hired to complete the simulation narrative. Abby told us that as they were beginning their work she tried to show them a set of scripts she had consulted on with the previous writers:

I was trying to point out, “hey, look, we did a lot of work on this last spring. We might want to look in this folder because somebody already wrote a bunch of scripts. We don’t need to reinvent the wheel.” And [one of the writers] told me, “well, yeah, but we’re master’s students, and so we probably can do a better job.”

Abby continued, “that response just felt like it was dismissing what I was trying to say. So, instead of listening and validating. . . like, ‘tell me more,’ it was just dismissing.” Abby told us that by this she meant that she thought the writers were both dismissive of the work that had been done as well as of her attempts to have a conversation about it. Additionally, she was particularly bothered that at least one writer did not seem to understand that she was also a graduate student, “[the writer said], ‘well, I’m a master’s student.’ Okay. So am I, but I won’t mention that.” Abby found the experience quite disheartening, telling us, “I was so frustrated,” and describing how afterwards she started to withdraw from fully participating. At one point she told us that her response was, “all right, I’ll step aside.” At another time she described it as, “okay, I’ll back out of your way.” Both phrases seemed to suggest Abby’s sense of resignation and defeat.

In talking about incidents like these, Abby seemed to describe the project as being a place of disenfranchisement, depriving her of opportunities to offer meaningful contributions, and where she had been judged as inadequate to contribute anything of substance. The positive aspects of her participation, which earlier had seemed so fulfilling, receded into the background. She started to primarily focus on her limitations, even going so far as to tell us, “I didn’t really feel like I had anything that I was doing. . . [For a semester] I was hardly assigned anything. Yeah, I was like a bump on a log.”

As we analyzed other events Abby talked about, however, we saw that while it was true that her contributions could be discounted, at the same time she started to pull away from the project as well. This also reduced the extent to which she was actively involved. In the face of rejection it seemed that Abby generally stopped putting herself in the position of being rejected again. At one point she even seemed to openly admit this, saying, “[I] was, like, not super engaged in what was going on.” She described one instance, during the time she was “frustrated that no one was valuing what had been done last spring” (meaning when the new writers had abandoned the existing scripts). One of the professors asked Abby to work with the same writer who had been particularly dismissive to update some of the material students would initially encounter when using a simulation. Abby described this as another case of work she had previously completed being dismissed without actually examining what had been done, “I was like, ‘it’s all there, we did this, look at this.’” In response to the request, Abby told us that, “I refused to help. And so instead of being involved, I just, like, checked out.” Out of these difficult interactions a vicious circle seemed to emerge. Abby thought her contributions were being rebuffed, and she responded by pulling away. But this meant she had fewer opportunities for meaningful involvement, which further darkened her mood. As she became more discouraged, the actions of her teammates tended to show up as if they were intentionally slighting her work. Whether they actually were or not, the result was the same; Abby became sensitive (or perhaps overly sensitive) to saliences that appeared slighting, which, in turn, fueled a further sense on her part that she was not needed.

Interestingly, even though Abby told us that for a semester she “was hardly assigned to anything,” based on team meetings we observed during that period this appears to have not actually been the case. We watched Abby participating in project decisions, taking assignments, and being treated by others as a full contributor to the project. Yet we do not interpret Abby’s insistence that she had nothing to do as her trying to mislead us, or that her memory was flawed (although we acknowledge both of these as possibilities). Rather, since when she was not talking about her disenfranchisement she occasionally brought up other ways she was involved during this same period, it seems more likely that when she talked about not being assigned anything she was trying to communicate the affective quality of her experience instead of the literal facts of the situation. Saying that she was “a bump on a log,” or that “[I] didn’t really feel like I had anything I was doing,” were her attempts to point out what was significant about her circumstances. What seemed to matter most was that she saw herself as not being a contributor, and that she did not see the simulations being improved because of her work. Yet, as we have emphasized, this sense was not solely created by either the events around her, or by her beliefs and attitudes about those events. It seems to be better characterized as a way of engagement that was jointly produced both by the situation Abby found herself in as well as how she attempted to cope with what she experienced.

Abby Moves from Disenfranchisement to Valued Project Leader

Despite her growing discouragement, Abby did not completely abandon her membership on the team. When we asked why, she identified at least three aspects that continued to draw her in. First, notwithstanding the difficult interactions Abby had with some teammates, others had become her friends, and she described a “connection with certain people I was working with” that she wanted to maintain. She also seemed to fall into something of a sunk cost fallacy, telling us, “I was involved when it started. . . I guess I felt some level of investment and commitment.” Finally, she would reminisce about the sense of belonging and being a contributor she once experienced, and hoped that she could recapture it in some form, “we were excited about this idea that we [came] up with. . . . So I guess I cared about being on the team and I wanted to be productive and useful.” These largely emotional factors—all mattering to Abby in different ways and providing her different motives for wanting to participate—were significant enough to tether her to the project even as so many other aspects continued to push her away.

Alone, however, these commitments did not actually change anything in Abby’s situation. While they inclined her towards at least some association with the team, she still remained mostly disengaged until three somewhat intertwined features of the project structure also changed, which together seemed to open up possibilities that Abby found less constraining. The first was that a certain professor who was sensitive to helping students have good experiences began to assume a more prominent role as the team began working on a simulation for which he was the subject matter expert (we will refer to him as Eric). Abby told us that Eric “makes [her] feel valued,” and, “he just totally built me up.” The second factor was that Abby enrolled in a project management class that required her to be a “scrum master” for a product team (a project management role found in agile approaches to product development). Abby asked Eric if he would allow her to complete her assignment for the simulation he was overseeing, “I need this experience, so I emailed Eric, like, ‘hey, do you think I could be scrum master on our team?’” Eric’s response was, to Abby, very enthusiastic, “immediately he started referring to me as the scrum master.” She further commented, “he’d, like, let me lead in meetings,” and, “the way Eric is, like, promoting me and what I can do, I think I [now] have more of a leadership role.” Finally, development reached the point that video production began, and Abby said she also felt valued because, “[team leaders] put me in charge of the videos and actually said, ‘Abby’s responsible for this,’ and, ‘go to Abby.’”

As Abby pursued the new assignments and opportunities these structural changes opened up, the character of her participation changed as well, reorienting from a sense of disengagement to one of more complete involvement. She became more attuned to possibilities in her situation, as suggested by her comment that, “I can do things because I feel valuable.” To illustrate, she provided a number of examples of not only the new work she was doing but also the change she experienced in the character and quality of her participation.

One change was that even though the work Abby did during this period continued to be intangible and largely in the service of teammates doing concrete production, she began to describe it as adding value, as opposed to her previous sense that her work was not needed. For instance, even though Abby did not produce the simulation videos herself, she did take the initiative to recruit, hire, and support the videographer with little oversight or direction from those supervising her. Of this she said:

I think we're all excited about the videos right now because we have [our videographer], who's, like, our – he's going to make it cool. He's going to make it cool. We have actors that we're excited about . . . [The videographer] interviewed them and sent me the videos and all these people are going to be so fun. . . So, I think I'm excited about the production, and we're shooting on Saturday, so it's like the big thing right now.

The difference in Abby’s tone as she described her support of the videos was striking. Whereas her comments about previous events could reflect a sense of despondency, when she described her leadership over the video production—even though she was not directly shooting the videos herself—she spoke with a sense of enthusiasm that suggested she was more confident about her place on the team than she felt before.

A related change was that difficult interactions with teammates that had previously bothered her so much seemed to recede into the background of her experience. She told us, “now I feel a lot more respected and capable and less impacted by those types of situations. So, I’m not as worried about that now.” Even though she told us there were still hard conversations or challenging problems to address, her sensitivity to them diminished, and she talked about them more dispassionately than she had before.

And finally, as Abby began acting as the scrum master she started to see things about the project she had not noticed earlier. In particular, her experience of being disenfranchised no longer appeared to be so unique. She started to get a sense that the overall project had been “stuck.” She told us, “there hasn’t been a whole lot of organization in getting stuff done,” and seemed to indicate that from the perspective of her new role she could see that she had not been the only person frustrated because they felt like they were not contributing, or that what they were doing did not matter. But realizing this did not lead her to slip back into discouragement. Rather, she seemed more attuned to situational possibilities for how she could lead out and help the team make better progress, like enforcing daily status updates, planning agendas for project meetings, or contributing new design ideas that could create additional project momentum.

By the time of our final discussions with Abby she appeared to have largely overcome any sense that the project was boxing her in. Neither was she as discouraged as she had been earlier. But she did not just perceive different things about the simulations, her teammates, or her own work. She was involved in the project in a completely different way, more as a valued leader than as an occasional contributor. This does not mean the project had become trouble-free. As mentioned, after being placed in a leadership role Abby could see project shortcomings she had not seen before, and even while we were interviewing her she had questions about whether the simulations were as effective as they could be at achieving their outcomes. But Abby seemed to approach these challenges from a position of self-possession, rather than disenfranchisement or doubt. She became a leader not only because she had skills to help her lead, but because she started to respond to circumstances like leaders respond, as suggested by her comment:

I’m involved in lots of aspects of lots of things. . . When things are brought up [I think], “oh, yes, I have something that I want to bring up for the team to think about.” . . . I have more to contribute because I’m more involved.

As we have emphasized, this seemed to be due to opportunities Abby was given as well as her own willingness to accept those opportunities and make something of them. Whereas before she experienced a vicious circle of further and further disengagement, she now seemed caught up in a virtuous circle. Others’ willingness to believe in her and give her new ways to contribute opened a space for her to act. Accepting what they offered reignited her enthusiasm, and her improved mood showed her even more opportunities for involvement. Abby herself seemed to recognize the change, telling us that “there’s just been a huge contrast” between the times she was so hurt by the actions of her teammates that she was willing to step away from active participation, and the time of our interviews when she was being told by her colleagues, “Abby’s so important on this team, Abby’s involved, Abby does everything, Abby does more than the professors.” When we shared that this was also reflected in our own interviews with other team members, and that they were equally telling us how much she was contributing, her response was, “wow, that’s, wow. That makes me feel like I want to do even more!”

Discussion

Our interest in studying Abby’s case was to explore how her authentic project involvement mattered in her instructional design education. Analyzing her interviews provided us “a fresh way of seeing” (Packer, 2018, p. 148) what it could entail to be a student involved in this form of learning, which we summarize as three insights. First, Abby’s account contributes towards the literature recognizing that even though authentic project experiences can have clear advantages, they also may not always be unambiguous goods in students’ education. Second, we suggest that a reason for this is that the outcomes of authentic project experiences do not solely lie in any intrinsic properties of the opportunities themselves, nor in students’ personal attempts to make meaning out of those opportunities. Avoiding a dichotomous distinction between situation and student provides a clearer view of how authentic projects become a learning space when students engage in the practical work of fitting themselves to the affordances such experiences offer. Finally, we learn from Abby’s case that challenges accompanying authentic project experiences can be mitigated, but doing so will likely involve cooperation from those with the ability to adjust the form and structure of an experience, as well as the participating students themselves.

Authentic Project Experiences May Not Always Be a Pedagogical Good

For Abby, participating in the simulation project allowed her to apply a variety of skills in authentic settings and offered her unanticipated leadership opportunities, but also challenged her self-confidence to the extent that she nearly abandoned her involvement. This duality suggests there can be tensions in authentic project experiences as a pedagogical strategy, and they may not always be unambiguous goods in students’ education. This aligns with findings from prior research. While researchers have described a number of benefits these experiences can provide (Johari & Bradshaw, 2008; Miller & Grooms, 2018), the literature also recognizes that the very authenticity of these experiences can create complexities with which students may have a difficult time coping (Dabbagh & Williams Blijd, 2010; Hartt & Rossett, 2000). They may find themselves tangled up in binds they do not yet have the ability to unravel on their own.

Our study extends this literature, not only by drawing attention to the forms potential complexities could take, but also by showing at least some ways that students might affectively respond if complications arise. Highlighting both potentialities seems important to help educators address challenges they might face when implementing authentic project strategies themselves. For instance, one reason project involvement was not an unambiguous good for Abby was that when her teammates were reluctant to implement her ideas, their dismissals showed up to her as obstructing her ability to meaningfully contribute. But while her views were certainly understandable, they were also not unavoidable. We can imagine how it may not have mattered as much to other students if they were challenged as Abby was, or how they might even have been energized by the need to find ways to better persuade their colleagues. So in her case, for educators to understand how to help Abby have a better experience they would have to pay attention to the situational affordances as well as the relevance of those affordances to her. Yet we are aware that Abby’s experience only highlights some of the difficulties that might create strains for students involved in authentic projects. So we encourage continued research into other possibilities authentic project experiences might open up, especially research that explores challenges that can accompany the approach.

Authentic Projects Become Learning Experiences Through a Reciprocal Relationship Between Student and the Project World

As just mentioned, and as we have described throughout our report, Abby’s experience was born out of real situational affordances, as well as how she negotiated and navigated those affordances. This seemed typified by how she described how her mode of engagement changed after Eric appointed her scrum master, “I can do things because I feel valuable.” In Abby’s world, she not only felt more or less valued based on what she was able to do, but she also felt more or less capable of acting depending on how valuable she felt. Her experience seemed characterized by reciprocality. She had to respond to features of the environment outside of her control, but her responses altered the project context and changed what type of involvement was available to her moving forward. Focusing only on one side or the other—opportunity or Abby’s attitude—seems insufficient to understand either Abby or the project itself. What transpired cannot be reduced either to the influence of environmental forces acting upon her, or to her private processes of constructing meaning out of her experience (see Wrathall, 2004). It seems more accurate to attempt to unify what was provided from both Abby and from the project space, “not [as] sharply distinct, self-sufficient states or separately existing ingredients, but [as] essentially interwoven aspects of a single, unified phenomenon. . . More like two sides of a coin or two dimensions of a figure” (Carman, 2020, p. 77).

Recognizing this provides a more comprehensive way of understanding authentic projects as learning experiences. Abby’s account indicates that neither a view of learning that locates it primarily in environmental influences nor one locating it primarily in individual processes of meaning-making is sufficient. For instance, while she clearly had to respond to environmental factors in her journey towards becoming a project leader, Abby cannot be portrayed as someone who learned leadership only because her actions came into alignment with a set of standards or norms provided by her environment – a view implied by theories that define learning as the result of processes of socialization and enculturation (cf. Matthews, 2016). And while she clearly had to interpret her situation and decide what events meant to her, she also cannot be portrayed as having learned leadership only because of personal, internal changes to her knowledge, attitudes, beliefs, or skills. Equally important were the changes to what Yanchar et al. (2013) called her “embodied familiarization” (p. 219) with the project, meaning how she was able to practically comport herself to fit into the space provided by the real, situational demands of her work. Abby learned from her project experience as she became more capable of “meaningful engagement” with what had previously been foreign. She became more “accustomed” to, and “familiar” with, how to navigate the very practical concerns her situation required (p. 220).

This is a view that transcends reductive attempts to locate learning primarily in one type of cause or another, either cognitive or cultural. It shows learning as a process of developing a practical stance towards the world – in Abby’s case a stance taken by instructional designers. Certainly this stance includes learning new skills or developing a new identity, but is not defined by these features alone. It also includes how the world feels as one inhabits it, such as how the project felt to Abby when she was disengaged, or as she re-engaged (cf. Dreyfus, 1991). It entails how one anticipates, and becomes sensitized to, saliences in the world, such as how Abby as a project leader could see the team was not as organized as she once thought, and how this drew her attention towards opportunities that might have otherwise remained unnoticed (cf. Wrathall, 2004). It encompasses how one becomes resolved to act in response to opportunities the world offers, such as how Abby accepted the responsibility to plan project meetings so they would be a better experience for everyone involved (cf. Dreyfus, 2017). In this view, authentic projects fit into instructional design education not because they provide a single cause of learning, or even a group of causes, but because they contribute towards “shifts in how the world shows up, how learners fit into the cultural contexts of life, how they engage in practices, and the stands they take on matters of significance” (McDonald & Yanchar, 2020, p. 643).

Educators and Students Jointly Improve Authentic Project Experiences

These views suggest a new way of understanding events that might arise during students’ participation in authentic project experiences. Individual project events will not necessarily be good or bad because of any intrinsic properties they possess, because their value is at least partly found in how students respond to them. While it is true that project experiences can be well- or poorly designed, their design itself is only a starting point for the evolution of the experience that will occur as actual students get involved. But neither is it correct to say that any given event is neutral—with its learning value created by students themselves—since individual events will open up certain possibilities while at the same time closing down others. So it is still incumbent on those planning authentic experiences to “offer compelling beginnings” in projects that students “may be persuaded to pick up” as they engage in the project space (McDonald, in press). If authentic projects are not effective because of their inherent properties, instructional design educators and students can at least work together to make them effective by attempting to improve how students fit into them. This implies that educators may be able to help students break out of negative cycles of participation as they alter conditions in the environment and as they point students’ attention towards new possibilities that might be opened up by the improved conditions.

Prior research suggests practical ideas that educators can consider for accomplishing this, including: cultivating meaningful relationships between students and mentors so that students come to trust the guidance they provide (Michela & McDonald, 2020); ensuring the designs of project environments do not inadvertently discourage or punish students for expressing their independence (Johari & Bradshaw, 2008); providing students frequent opportunities to reflect on their experiences and whether those experiences are leading to desirable ends (Bannan-Ritland, 2001); and ensuring regular evaluation is part of authentic project environments so necessary adjustments to structures or relationships can be made (Larson & Lockee, 2009). We recommend additional research be conducted to develop other design guidelines that are consistent with our findings.

But as our study emphasizes, when challenges arise during authentic projects it is likely not the sole responsibility of any party alone to mitigate the problems – neither the educators planning the project nor the students learning from it. This is not because either side can be relieved of responsibility, but because both sides are likely contributing something towards the unfolding situation (for good or bad). Challenges may have as much to do with what stands out to students as important about their involvement as they do with any objective factors within the context itself, although situational factors would certainly contribute towards what students could see. So neither side’s efforts alone will be sufficient to alter the circumstances. On the side of the educators, while they can set up any number of conditions, they cannot set up how students respond to the conditions they provide. On the side of the students, no matter what attitude they bring into a situation, they may still find conditions that stifle their contributions or otherwise impede their capacity to act in alignment with the practical stance the authentic project is meant to make available. So cooperation from all sides will be needed to address authentic project challenges – those with the ability to adjust the form and structure of an experience, as well as the participating students themselves. Improving the student experience will jointly be a matter of changing what opportunities the environment provides, and of students becoming reenergized as they anticipate anew the potential futures such opportunities could unfold. But educators cannot pick up the possibilities on behalf of students directly. Ultimately, as it was for Abby, students have to accept the changes they are offered, and make the project personally relevant in a manner that improves the quality and character of their participation.

Conclusion

Our purpose in this study was to explore how authentic project experiences matter to instructional design students. Through a case study of how an instructional design student, Abby, depicted her experiences as a member of a design team, we came to understand (a) how authentic projects may not always be unambiguous goods in instructional design education; (b) how this is so because authentic projects become learning experiences through a reciprocal relationship between students and the project; and (c) how, because of this, educators and students must jointly cooperate in improving authentic project experiences. Of course, more research is needed to more fully understand how authentic projects matter to instructional design students. But our initial exploration here at least illuminates how part of their significance lies in the range of practical and affective responses students might have to them. We hope that further research will continue to focus on these relationships between students and the project experiences in which they participate, seeing them as important not only because of what they do to students, but also because of what students are able to meaningfully contribute towards the experiences themselves.

References

Aadland, T., & Aaboen, L. (2020). An entrepreneurship education taxonomy based on authenticity. European Journal of Engineering Education, 45(5), 711–728. https://doi.org/10.1080/03043797.2020.1732305

Bannan-Ritland, B. (2001). Teaching instructional design: An action learning approach. Performance Improvement Quarterly, 14(2), 37–52.

Carman, T. (2020). Merleau-Ponty (2nd ed.). Routledge.

Dabbagh, N., & Williams Blijd, C. (2010). Students’ perceptions of their learning experiences in an authentic instructional design context. Interdisciplinary Journal of Problem-Based Learning, 4(1), 2–13. https://doi.org/10.7771/1541-5015.1092

Dreyfus, H. L. (1991). Being-in-the-world: A commentary on Heidegger’s Being and Time, Division I. The MIT Press.

Dreyfus, H. L. (2017). Background practices: Essays on the understanding of being (M. A. Wrathall, Ed.). Oxford University Press.

Drysdale, J. T. (2019). The collaborative mapping model: Relationship-centered instructional design for higher education. Online Learning Journal, 23(3), 56–71. https://doi.org/10.24059/olj.v23i3.2058

Ertmer, P. A., Stepich, D. A., York, C. S., Stickman, A., Wu, X., Zurek, S., & Goktas, Y. (2008). How instructional design experts use knowledge and experience to solve ill-structured problems. Performance Improvement Quarterly, 21(1), 17–42. https://doi.org/10.1002/piq.20013

Ertmer, P. A., York, C. S., & Gedik, N. (2009). Learning from the pros: How experienced designers translate instructional design models into practice. Educational Technology, 49(1), 19–27.

Fleming, V., Gaidys, U., & Robb, Y. (2003). Hermeneuticresearch in nursing: Developing a Gadamerian-basedresearch method. Nursing Inquiry, 10(2), 113–120.https://doi.org/10.1046/j.1440-1800.2003.00163.x

Flyvbjerg, B. (2001). Making social science matter: Whysocial inquiry fails and how it can succeed again.Cambridge University Press.

Fortney, K. S., & Yamagata-Lynch, L. C. (2013). Howinstructional designers solve workplace problems.Performance Improvement Quarterly, 25(4), 91–109.https://doi.org/10.1002/piq.21130

Gray, C. M., Dagli, C., Demiral-Uzan, M., Ergulec, F., Tan, V., Altuwaijri, A. A., Gyabak, K., Hilligoss, M., Kizilboga, R., Kei, T., & Boling, E. (2015). Judgment and instructional design: How ID practitioners work in practice. Performance Improvement Quarterly, 28(3), 25–49. https://doi.org/10.1002/piq.21198

Hartt, D. C., & Rossett, A. (2000). When instructional design students consult with the real world. Performance Improvement, 39(7), 36–43. https://doi.org/10.1002/pfi.4140390712

Heidegger, M. (1962). Being and time (J. Macquarrie & E. Robinson, Trans.). Blackwell Publishers Ltd.

Heinrich, W. F., & Green, P. M. (2020). Remixing approaches to experiential learning, design, and assessment. Journal of Experiential Education, 43(2), 205–223. https://doi.org/10.1177/1053825920915608

Herrington, J., Oliver, R., & Reeves, T. C. (2003). Patterns of engagement in authentic online learning environments. Australasian Journal of Educational Technology, 19(1), 59–71. https://doi.org/10.14742/ajet.1701

The Journal of Applied Instructional Design, 10(2) 80

Hynie, M., Jensen, K., Johnny, M., Wedlock, J., & Phipps, D. (2011). Student internships bridge research to real world problems. Education and Training, 53(1), 45–56. https://doi.org/10.1108/00400911111102351

Johari, A., & Bradshaw, A. C. (2008). Project-based learning in an internship program: A qualitative study of related roles and their motivational attributes. Educational Technology Research and Development, 56(3), 329–359. https://doi.org/10.1007/s11423-006-9009-2

Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development (Vol. 1). Prentice-Hall.

Kramer-Simpson, E., Newmark, J., & Dyke Ford, J. (2015). Learning beyond the classroom and textbook: Client projects’ role in helping students transition from school to work. IEEE Transactions on Professional Communication, 58(1), 106–122. https://doi.org/10.1109/TPC.2015.2423352

Larson, M. B., & Lockee, B. B. (2009). Preparing instructional designers for different career environments: A case study. Educational Technology Research and Development, 57(1), 1–24. https://doi.org/10.1007/s11423-006-9031-4

Lei, S. A., & Yin, D. (2019). Evaluating benefits and drawbacks of internships: Perspectives of college students. College Student Journal, 53(2), 181–189.

Lowell, V. L., & Moore, R. L. (2020). Developing practical knowledge and skills of online instructional design students through authentic learning and real-world activities. TechTrends, 64(4), 581–590. https://doi.org/10.1007/s11528-020-00518-z

Matthews, M. T. (2016). Learner agency and responsibility in educational technology (Unpublished doctoral dissertation). Brigham Young University.

McDonald, J. K. (in press). Instructional design as a way of acting in relationship with learners. In B. Hokanson, M. E. Exter, A. Grincewicz, M. Schmidt, & A. A. Tawfik (Eds.), Learning: Design, engagement, and definition. Springer.

McDonald, J. K., & Yanchar, S. C. (2020). Towards a view of originary theory in instructional design. Educational Technology Research and Development, 68(2), 633–651. https://doi.org/10.1007/s11423-019-09734-8

Merleau-Ponty, M. (1964). Phenomenology of perception (C. Smith, Trans.). Routledge.

Michela, E., & McDonald, J. K. (2020). Relationships, feedback, and student growth in the design studio: A case study. In B. Hokanson, G. Clinton, A. A. Tawfik, A. Grincewicz, & M. Schmidt (Eds.), Educational technology beyond content: A new focus for learning (pp. 183–192). Springer Nature Switzerland AG. https://doi.org/10.1007/978-3-030-37254-5_16

Miller, C. L., & Grooms, J. (2018). Adapting the Kolb model for authentic instructional design projects: The 4-C Framework. Scholarship of Teaching and Learning, Innovative Pedagogy, 1(1), 36–49.

Nicolini, D. (2012). Practice theory, work, & organization: An introduction. Oxford University Press.

Packer, M. (2018). The science of qualitative research (2nd ed.). Cambridge University Press.

Radhakrishnan, V. (2018). A role analysis exercise to minimize role ambiguity and promote role clarity in instructional design teams. Johns Hopkins University.

Schwier, R. A., & Wilson, J. R. (2010). Unconventional roles and activities identified by instructional designers. Contemporary Educational Technology, 1(2), 134–147. https://doi.org/10.30935/cedtech/5970

Stake, R. E. (1995). The art of case study research. Sage Publications.

Stefaniak, J. E. (2015). The implementation of service-learning in graduate instructional design coursework. Journal of Computing in Higher Education, 27(1), 2–9. https://doi.org/10.1007/s12528-015-9092-7

Stefaniak, J. E. (2017). The role of coaching within the context of instructional design. TechTrends, 61(1), 26–31. https://doi.org/10.1007/s11528-016-0128-2

Vo, N., Brodsky, A., Wilks, M., Goodner, J., & Christopher, K. (2018). Infusing authentic learning into online courses: A case of online introduction to sociology. Journal of Educational Multimedia and Hypermedia, 27(3), 391–409.

Wrathall, M. A. (2004). Motives, reasons, and causes. In T. Carman & M. B. N. Hansen (Eds.), The Cambridge companion to Merleau-Ponty (pp. 111–128). Cambridge University Press.

Wrathall, M. A. (2011). Heidegger and unconcealment: Truth, language, and history. Cambridge University Press.

Yanchar, S. C. (2015). Truth and disclosure in qualitative research: Implications of hermeneutic realism. Qualitative Research in Psychology, 12(2), 107–124. https://doi.org/10.1080/14780887.2014.933460


Yanchar, S. C., & Slife, B. D. (2017). Theorizing inquiry in the moral space of practice. Qualitative Research in Psychology, 14(2), 146–170. https://doi.org/10.1080/14780887.2016.1264517

Yanchar, S. C., South, J. B., Williams, D. D., Allen, S., & Wilson, B. G. (2010). Struggling with theory? A qualitative investigation of conceptual tool use in instructional design. Educational Technology Research and Development, 58(1), 39–60. https://doi.org/10.1007/s11423-009-9129-6

Yanchar, S. C., Spackman, J. S., & Faulconer, J. E. (2013). Learning as embodied familiarization. Journal of Theoretical and Philosophical Psychology, 33(4), 216–232. https://doi.org/10.1037/a0031012


Exploring Faculty Perceptions of Professional Development Support for Transitioning to Emergency Remote Teaching

Ana Redstone & Tian Luo

Professional development (PD) for instructors at higher education institutions offering online courses is important for assuring the quality of online programs. However, PD opportunities for faculty members have often been piecemeal and inadequate. In light of the COVID-19 pandemic that forced instructors around the world to teach online, PD has become even more critical to the success of the instructors, students, and institutions themselves. This paper describes research conducted at a large university in the United States that used a survey developed to operationalize Baran and Correia’s (2014) holistic Professional Development Framework for Online Teaching (PDFOT). The survey identified strengths and weaknesses in PD support that could be targeted for growth and improvement. Key findings include a need to bolster support at each of the teaching, community, and organizational levels. Recommendations for addressing improvements are discussed.

Introduction

With the spread of the COVID-19 pandemic in early 2020, colleges and universities around the world, along with K-12 institutions, were forced to rapidly transition from face-to-face classes to what has been coined “emergency remote teaching” (ERT; Bozkurt & Sharma, 2020). ERT uses some of the same pedagogical and technological tools as online courses, but ERT and online courses are quite different. The former involves a quick and temporary transition from synchronous face-to-face teaching to virtual, technology-enabled teaching, while the latter involves thoughtfully designed and developed interactions among the learners, the instructional materials, and the instructor (Hodges et al., 2020).

In March 2020, institutions of higher education everywhere found themselves racing to provide technological and pedagogical support for instructors who may never have anticipated teaching remotely. Despite the tremendous increase in online courses prior to the shift to ERT, many instructors were not prepared to use web-based technologies in effective ways (Alexiou-Ray & Bentley, 2015; Baran et al., 2011; Lackey, 2011). Prior to the pandemic, if a college or university’s culture had not widely promoted or supported online learning, the institution may not have been able to provide adequate opportunities for faculty professional development (PD) in online teaching. Recent research indicates that faculty members who took part in more opportunities for PD for online teaching had higher self-efficacy and were better prepared to teach (Frass et al., 2017; Richter & Idleman, 2017). Those higher education institutions (HEIs) with robust PD programs for online teaching likely equipped their faculty members for a smoother transition to ERT.

HEIs’ transition to successful online and blended environments is critical for universities and their students to survive and thrive long-term. Therefore, these institutions need to identify PD support structures that will continue to support transitioning their faculty to online teaching. The purpose of this study is to describe the organizational, community, and teaching support in a university that assisted instructors to transition to online teaching quickly during the COVID-19 crisis, and faculty’s perceptions of the institution’s support structures.

An Overview of Baran and Correia’s Professional Development Framework for Online Teaching

While much research has been conducted about how to develop and maintain PD for online faculty, it has mainly focused on a particular component of PD support, such as instructional design assistance or technology integration (Baran & Correia, 2014; Lane, 2013). Additionally, the existence of a variety of online PD programs among and within institutions leads to haphazard approaches (Frass et al., 2017; Lane, 2013). Baran and Correia’s (2014) Professional Development Framework for Online Teaching (PDFOT) proposes to unite and align an institution’s PD efforts to foster high quality online teaching practices and better learning outcomes for students. With faculty members moving their face-to-face and blended classes to emergency remote teaching environments during the COVID-19 pandemic, ongoing PD support was critical to the success of students, faculty, and HEIs. The sudden shift to technology-enabled instruction offers institutions the opportunity to rethink their support systems for online teaching.

Building on research about faculty members’ motivation, participation, and acceptance of online teaching, the PDFOT uses a systems approach to PD. The framework describes successful online teaching as the result of interactions among teaching, community, and organizational support in HEIs (Baran & Correia, 2014). It consists of a nested integration of the three supports: organizational support on the outer level, community support in the middle, and teaching support at the inner level (see Figure 1). The purpose of this framework is to provide key administrative staff with a guide for designing, developing, and maintaining PD programs, which may involve a culture shift within the institution (Baran & Correia, 2014). Looking beyond PD support at the big picture of an HEI as a whole, an examination of the factors that lead to institutional success in online development identified seven facets: advocacy and leadership within the university, entrepreneurial initiatives, faculty support, student support, digital technology, external advocacy, and professionalism (UPCEA, 2020). Faculty support in this context should “consider every touch point” (UPCEA, 2020, p. 14), including the teaching, community, and organizational support outlined in the PDFOT that contribute to high quality online instruction and student learning outcomes.

Figure 1

Professional development framework for online teaching (Baran & Correia, 2014)


Note: From "A Professional Development Framework for Online Teaching" by E. Baran and A. Correia, 2014, TechTrends, 58(5), p. 97. Copyright 2014 by Springer. Used with permission.

Teaching Support

Teaching support in the PDFOT refers to the formal and informal technological, pedagogical, and design and development support provided by an institution (Baran & Correia, 2014). Online teaching is vastly different from face-to-face teaching; therefore, this type of support is crucial for instructors who may find themselves uncomfortable and challenged in a new environment (Baran & Correia, 2014). Faculty do not typically have a background in pedagogy. They are subject-matter experts who often mimic the teaching style of instructors they had while they were students in the classroom (Grover et al., 2016; Mohr & Shelton, 2017). For many instructors new to online teaching, teaching support provided by department or campus-wide administrators, such as instructional designers, offers an opportunity to reexamine and improve teaching strategies (Abdous, 2011; Baran & Correia, 2014).

Online teaching support in HEIs varies widely depending on the size of the institution and the resources available to it, as well as the community and organizational support in place (Baran & Correia, 2014). Support ranges from simple technical assistance to full administrative centers for learning and teaching at the institutional level that provide formalized workshops and training as well as online course development. HEIs that have a dedicated center for teaching and learning more often have successful online courses and programs (UPCEA, 2020). Depending on the levels of service offered to faculty, centers may consist of a few instructional designers/technologists or an entire team including video production staff, graphic designers, and multimedia designers. For online course development, this team approach offers comprehensive and effective support for faculty (Abdous, 2020; UPCEA, 2020). Centers work hand-in-hand with, or as a part of, campus information technology services, providing technological tools such as web-conferencing for faculty.

Types of Teaching Support

HEIs often provide teaching support for faculty in different modalities, which may include face-to-face and online self-paced workshops, a collection of online resources, and one-on-one support provided by the center’s staff (Baran & Correia, 2014). HEIs also turn to open-source certification Massive Open Online Courses (MOOCs) for training online faculty (Lane, 2013). Workshops, seminars, or other formal training may be required or optional for instructors (McGee et al., 2017). Effective, quality PD opportunities reflect a supportive organizational culture (McGee et al., 2017).

For a faculty member transitioning to an online format, technology support is important and should include equipment, training, and access to support staff (UPCEA, 2020). Technology support also includes the Learning Management System (LMS) and other tools such as collaborative, polling, and web-conferencing tools. However, technology use cannot be meaningfully separated from pedagogical support, and teaching support should recognize the importance of integrating both (Baran & Correia, 2014; González-Sanmamed et al., 2014).

With regard to how HEIs provide teaching support, a large body of research shows that faculty members highly value individual support and feedback (Philipsen et al., 2019). Results of a survey indicated that faculty strongly preferred one-on-one support with instructional designers and online resources over other types of teaching support; the least preferred was a formal course (Grover et al., 2016). Similarly, Lackey (2012) indicates that instructors prefer one-on-one training. Baran and Correia (2014) point out that though workshops help build confidence in instructors, workshops alone may not be enough to address individual needs, and one-on-one assistance may be needed.

To be successful, online faculty need ongoing, sustained teaching support because, as they receive more training, faculty may become more aware of personal gaps in skills (González-Sanmamed et al., 2014). The feedback provided by individual support is important for helping instructors identify their PD needs (Philipsen et al., 2019). Using an embedded mixed-methods design, Brinkley-Etzkorn (2019) found that instructors were very optimistic about their ability to teach online before and after a three-week-long intensive hybrid workshop. However, they became less optimistic about and satisfied with their training after teaching their first online course, mainly due to a lack of other teaching support, which made instructors feel abandoned. These results align with findings that experienced online instructors were more likely than novice instructors to disagree with the statement that online classes are easier to teach, indicating that novice instructors are more likely to overestimate their capabilities to teach online (Rhode et al., 2017). To address this issue, McGee et al. (2017) recommended offering PD opportunities that incorporate strategies such as case studies to allow instructors to develop expertise over time by putting their learning in context.

The amount and variety of teaching support available to faculty appear to influence instructors’ readiness to teach online. Results of a causal-comparative study showed that online instructors who participated in more support activities had higher self-efficacy (Richter & Idleman, 2017). An examination of the teaching support provided by four HEIs revealed that faculty who were required to participate in PD for online teaching felt better prepared to teach than those participating in voluntary PD (Frass et al., 2017). A phenomenological study of six novice and experienced online instructors found that their institutions offered no formalized training, leaving instructors feeling ill-prepared to teach online. Instructors had to seek out resources on their own or one-on-one assistance from an instructional designer or colleague (Lackey, 2012). These findings clearly show the interconnectedness of organizational, community, and teaching support.

Community Support

Community support refers to collaborative support provided by in-depth interactions between faculty members and their colleagues. These types of support can be formal or informal and include collegial learning groups (CLGs), communities of practice (CoPs), and peer support (Baran & Correia, 2014). Scarpena et al. (2018) argued that community support is integral to all levels of the PDFOT and proposed an expanded framework to include additional opportunities for online faculty to connect. The PDFOT describes community support as critical in combating instructors’ feelings of social isolation that can occur due to the lack of face-to-face interaction with colleagues. In a phenomenological study, Lackey (2012) found online instructors’ feelings of isolation were the driving factor behind their preference for opportunities to collaborate with colleagues over any other type of support. In light of the social distancing and quarantine policies enacted to prevent the spread of COVID-19, formal and informal online community support emerged as even more important for the psychological and emotional wellbeing of instructors (Golden, 2016).

Social Networks and Communities ofPractice

CLGs, sometimes referred to as professional learning communities (PLCs) or professional learning networks (PLNs), and CoPs are informal and formal networks formed by instructors or by an organized group where ideas and diverse perspectives can be exchanged (Baran & Correia, 2014; Trust et al., 2017). These groups are important for HEIs to promote, as the exchange of dialogue and reflection on teaching practices can improve pedagogical practices (Luo et al., 2020; UPCEA, 2020). Research indicates that social construction of knowledge can lead to transformative teaching practices (Baran & Correia, 2014). However, social networks can offer faculty richer experiences that include an emotional or affective component that is not present as part of a CoP (Lantz-Andersson et al., 2018; Trust et al., 2017).

Social networks offer faculty an informal, open support system for sharing ideas and resources. They also provide instructors with support in their peripheral roles, which may not be included in formal PD at their HEIs (González-Sanmamed et al., 2014). Regardless of whether an HEI offers formal opportunities for shared dialogue, faculty often seek out support from their peers (McGee et al., 2017). Social networks are frequently formed in social media spaces such as Facebook, LinkedIn, and Twitter (Ferguson & Wheat, 2015; Lantz-Andersson et al., 2018; Luo et al., 2020). Social media tools offer online educators ways to engage in PD opportunities that impact their growth as professionals (Trust et al., 2017).

CoPs offer instructors more formal opportunities for collegial interactions and can be provided by institutions, departments, or professional associations. Main characteristics of CoPs include participants learning from each other through interactions, shared interest and competence in a domain, and a shared practice (Reilly et al., 2012). Though online faculty members may interact frequently with staff who provide teaching support, they may find few opportunities or little additional time to participate in formal CoPs to exchange knowledge and ideas with their peers. Online teaching is a subjective practice, and PD opportunities should allow instructors to share personal experiences of teaching rather than just sharing technology tools (Glass, 2016). Instructors participating in both formal and informal networks, such as CLGs and CoPs, also transition better to teaching online (Baran et al., 2011; Samarawickrema & Stacey, 2007).

Peer Support and Mentoring

Like other types of support, the peer support available to online faculty is often dependent on the organizational support in place. It may include peer observation, peer feedback, and peer mentoring (Baran & Correia, 2014), which are important for promoting confidence and motivation (Philipsen et al., 2019). A descriptive survey study of 47 faculty at a large public university showed that a university-wide mentoring program for online faculty facilitated high quality online programs (Buckenmeyer et al., 2011). These results support the University Professional and Continuing Education Association’s (UPCEA’s) 2020 recommendation for establishing a mentor program for new online faculty. Embedding faculty mentoring in a formalized online professional development workshop, where the novice instructor shadows and is then observed by the mentor, has been used as a personalized and contextualized method of incorporating peer support (Gregory & Salmon, 2013; Frass et al., 2017).

Organizational Support

Organizational support refers to the rewards and organizational culture of an HEI that support online teaching and learning. At the overarching level in the PDFOT, organizational support and strategic plans are crucial to supporting student success through faculty support systems that include teaching and community support (Baran & Correia, 2014; Lackey, 2011). HEIs seeking success and effectiveness in online education need to align their institutional priorities by providing these kinds of organizational support systems (Herman, 2013; Velez, 2015).

Rewards

Online faculty can encounter a variety of barriers, such as lack of technical skill and increased workload. Thus, organizational rewards are critical for motivating online faculty and maintaining continued engagement and interest (Baran & Correia, 2014). Rewards include stipends, technology, internal and external recognition and respect, credit toward promotion, job security, and release time for professional development opportunities and online course development (Baran & Correia, 2014; McGee et al., 2017). Part-time faculty teach online courses more frequently than full-time faculty, so rewards such as stipends and release time may be more important for them (Herman, 2013). Lackey (2012) points to the need for institutions to find ways to gain buy-in from faculty; these types of rewards can provide incentives toward this aim. UPCEA (2020) recommends that compensation for faculty in terms of money and time should be standardized by institutional policy and communicated clearly to all stakeholders. However, a large-scale study involving 821 HEIs in the United States found that the majority of faculty perceived the incentives for developing and delivering online classes to be inadequate (Herman, 2013). This study also showed that about half of the HEIs did not provide online course design and delivery support or PD support in tenure and promotion.

Organizational Culture

Organizational culture includes an institution’s technology infrastructure to support online education and a positive attitude toward online teaching. Strategic leadership and advocacy at the institution level is critical for developing a strong organizational culture (King & Boyatt, 2014; UPCEA, 2020). For institutions seeking to establish the infrastructure and culture necessary to adopt or grow their online offerings, a strategic plan serves as the foundation for all faculty development support (King & Boyatt, 2014). Leaders who create opportunities for faculty to feel encouraged, respected, supported, included, valued, and rewarded by their institutions will see increased motivation in their faculty to teach online (Baran & Correia, 2014). One way to sustain faculty support and motivation is for an institution to share governance with faculty, alumni, administrators, and students – a hallmark of excellence in online leadership (UPCEA, 2020).

Purpose Statement and Research Question

PD for faculty developing and teaching online is strongly correlated with the quality of online programs in higher education (Baran & Correia, 2014). Past research has examined the types of HEI support systems in place to foster success in faculty members transitioning from face-to-face to other environments, but there is a lack of research about the extent to which HEIs use a holistic approach to PD. The purpose of the current study is to investigate the impact of the types of organizational, community, and teaching support structures on faculty success in transitioning from face-to-face to other environments. The following research question guided this study: To what extent did a university’s organizational, community, and teaching support structures impact faculty members’ perceptions of their ability to transition from conventional environments to ERT?

Methods

Participants and Setting

Participants were recruited from the target population of faculty members at a public university on the east coast of the United States. This study used a convenience sampling approach to solicit participants. At the time this study was conducted, there were around 1,200 faculty members who may have transitioned their face-to-face classes to ERT due to the COVID-19 pandemic in the spring and summer of 2020. A total of 88 responses to the survey were collected. Volunteers who did not teach one or more courses with a face-to-face component on campus during the transition to ERT in the spring or summer semesters of 2020 were excluded from the study. Additionally, faculty members who taught online classes exclusively were excluded, leaving the number of participants at 55.

Instruments

The authors created an online survey in Qualtrics to collect data. The items in the survey were based on the teaching, community, and organizational support described in the PDFOT (Baran & Correia, 2014). Teaching support consisted of three subscales of participant satisfaction with the technological, pedagogical, and design and development support created or expanded during the shift to ERT. Participants indicated satisfaction with community support and agreement with organizational support for technology-enhanced learning already in place prior to the transition. The authors conducted a check of the internal consistency of the survey’s scale items (see Table 1).

Table 1

Reliability Statistics

Scale                                       Cronbach's Alpha   N of items
Teaching supports: Technology                     .92              8
Teaching supports: Pedagogy                       .98              6
Teaching supports: Design and Development         .93              6
Community supports                                .79              6
Organizational supports                           .80              8
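The internal-consistency coefficient reported in Table 1 is Cronbach's alpha, computed as α = k/(k−1) × (1 − Σσ²_item / σ²_total). As a minimal sketch of that calculation (the function and the six hypothetical 5-point Likert responses below are illustrative, not the study's data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) array of scale items."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 6 respondents x 4 five-point Likert items
responses = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
    [3, 4, 3, 3],
])
print(round(cronbach_alpha(responses), 2))  # → 0.91
```

Values above roughly .70 are conventionally treated as acceptable reliability, which is why the subscale alphas of .79–.98 in Table 1 support using the scales as reported.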

Data Collection

The survey consisted of two demographic items regarding the participant’s college or role (e.g., Arts and Letters or Administrative office) and status (e.g., adjunct instructor, assistant professor, etc.), and one inclusion question to determine whether the participant taught on campus prior to the transition to emergency remote teaching in the spring and summer of 2020. In addition, participants were asked whether they had previously taught an online or blended course to gauge prior familiarity with the university’s support systems for online teaching. The remaining items included five-point Likert scale items ranging from "not at all satisfied" to "extremely satisfied," five-point Likert scale items ranging from “strongly disagree” to “strongly agree,” and four open-ended items. Participants agreed to informed consent prior to beginning the study, and no identifying information was collected.

Data Analysis

The data were analyzed in SPSS using descriptive statistics. The open-ended questions on the questionnaire were analyzed using open-coding procedures and further refined by secondary and axial-coding techniques. The purpose of these procedures was to triangulate emerging themes within the data (Creswell, 2019).

Results

Descriptive Statistics

Descriptive analysis, shown in Table 2, revealed participants were most satisfied with community support (M = 4.17), followed by teaching support (M = 3.92). The higher mean scores for community support may be attributed to the smaller number of respondents who indicated that they used this type of support. With regard to teaching support, participants were most satisfied with pedagogical support (M = 3.95) and least satisfied with design and development support (M = 3.83). Participants perceived organizational support as the weakest, as indicated by their agreement scores (M = 3.41). Scores deviated the least for community support (SD = .77) and the most for pedagogical support (SD = .93).

Table 2

Satisfaction scores for teaching, community, and organizational support

                                               Total responses (N)    M      SD
Teaching supports satisfaction
  Technology supports satisfaction                     30            3.94   0.80
  Pedagogical supports satisfaction                    18            3.95   0.93
  Design and development support satisfaction          12            3.83   0.84
Community supports satisfaction                        22            4.17   0.77
Organizational agreement score                         50            3.41   0.82

Teaching Support

The survey items (see Table 3) asked participants if they sought technology, pedagogical, or design and development support during the transition to ERT. Technology support was used the most (56.4%) while design and development support was used the least (23.1%).

Table 3

Teaching Support: Total Responses

During the transition to ERT did you seek…    Yes (N)   Yes (%)   No (N)   No (%)
technology support?                              31       56.4%     24      43.6%
pedagogical support?                             19       34.5%     36      65.5%
design and development support?                  12       23.1%     40      76.9%

Technology Support

As described in the PDFOT, technology support refers to help with the online environment, infrastructure, and technical issues (Baran & Correia, 2014). Participants were prompted with specific technology support examples customized to the particular support provided by the university (e.g., Blackboard, Zoom, etc.). Technology support was provided by three departments within the university: Information Technology Services (ITS), the Center for Learning and Teaching (CLT), and the Center for Faculty Development (CFD). Support included websites, workshops, and one-on-one staff assistance. It should be noted that although each department had maintained websites prior to the transition, CLT developed an additional site called "Keep Teaching (KT)" to organize and consolidate resources to assist instructors during the transition. Table 4 shows that among the various technology supports, participants were most satisfied with workshops provided by the CFD (M = 4.12) and CLT (M = 3.96), and least satisfied with CLT and CFD staff assistance (M = 3.67; M = 3.78).

Table 4

Technology support: Measure of central tendency and spread of Likert-type items

| | Ext Diss (n) | Somewhat Diss (n) | Neither (n) | Somewhat Sat (n) | Ext Sat (n) | Total (N) | M | SD |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ITS website | 1 | 1 | 5 | 12 | 8 | 27 | 3.92 | 1.00 |
| ITS Help Desk | 1 | 4 | 2 | 6 | 11 | 24 | 3.92 | 1.28 |
| CLT's KT website | 0 | 2 | 5 | 13 | 6 | 26 | 3.88 | 0.86 |
| CLT workshops | 0 | 3 | 2 | 14 | 7 | 26 | 3.96 | 0.92 |
| CLT staff assistance | 1 | 2 | 4 | 2 | 6 | 15 | 3.67 | 1.35 |
| CFD website | 0 | 1 | 5 | 13 | 4 | 23 | 3.87 | 0.76 |
| CFD workshops | 0 | 2 | 2 | 5 | 8 | 17 | 4.12 | 1.05 |
| CFD staff assistance | 0 | 1 | 3 | 2 | 3 | 9 | 3.78 | 1.09 |
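Because the table reports the full category counts per item, the M and SD columns can be recomputed directly from those counts. A minimal sketch using the ITS website row's counts (reproducing the reported values up to rounding):

```python
import math

# Frequency counts for the ITS website row, from "Ext Diss" (score 1)
# through "Ext Sat" (score 5), as reported in Table 4.
counts = [1, 1, 5, 12, 8]
scores = [1, 2, 3, 4, 5]

n = sum(counts)                                        # total responses
mean = sum(s * c for s, c in zip(scores, counts)) / n  # frequency-weighted mean
var = sum(c * (s - mean) ** 2 for s, c in zip(scores, counts)) / (n - 1)
sd = math.sqrt(var)                                    # sample SD

print(f"N = {n}, M = {mean:.2f}, SD = {sd:.2f}")  # N = 27, M = 3.93, SD = 1.00
```

The computed mean (3.926) differs from the table's 3.92 only in the final rounded digit, which is consistent with the reported values being independently rounded.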

Pedagogical Support

This type of support refers to providing information about selecting appropriate technological tools and teaching strategies to help learners meet course objectives (Baran & Correia, 2014). Two organizations provided pedagogical support during the transition to ERT: the CLT and the CFD. Participants were most satisfied with CLT and CFD staff assistance (M = 4.00; M = 4.20) and least satisfied with CLT's KT website (M = 3.69) and CFD workshops (M = 3.75) (see Table 5).

Table 5


Pedagogical supports: Measure of central tendency and spread of Likert-type items

| | Ext Diss (n) | Somewhat Diss (n) | Neither (n) | Somewhat Sat (n) | Ext Sat (n) | Total (N) | M | SD |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| CLT's KT website | 0 | 2 | 2 | 7 | 2 | 13 | 3.69 | 0.95 |
| CLT workshops | 0 | 2 | 3 | 4 | 6 | 15 | 3.93 | 1.10 |
| CLT staff assistance | 0 | 1 | 0 | 2 | 2 | 5 | 4.00 | 1.22 |
| CFD website | 0 | 0 | 4 | 4 | 3 | 11 | 3.91 | 0.83 |
| CFD workshops | 0 | 1 | 3 | 6 | 2 | 12 | 3.75 | 0.87 |
| CFD staff assistance | 0 | 0 | 1 | 2 | 2 | 5 | 4.20 | 0.84 |

Design and Development Support

Design and development support refers to assistance with editing media, designing online content, and evaluating courses (see Table 6). The CLT's KT website (developed during the transition to ERT) and the CFD website provided the most satisfying experiences for participants (M = 3.83; M = 3.80), while the CLT and CFD staff provided the least satisfying experiences (M = 3.60; M = 3.67). This result could be attributed to the emphasis instructors and support staff placed on technology and pedagogical support during the transition to ERT. Few instructors requested design and development support during this time. The low satisfaction scores and low number of responses to the staff assistance items may also reflect a lack of awareness of the availability of staff support for design and development by respondents who did not teach in technology-enabled environments prior to the pandemic.

Table 6

Design and development support: Measure of central tendency and spread of Likert-type items

| | Ext Diss (n) | Somewhat Diss (n) | Neither (n) | Somewhat Sat (n) | Ext Sat (n) | Total (N) | M | SD |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| CLT's KT website | 1 | 0 | 2 | 6 | 3 | 12 | 3.83 | 1.11 |
| CLT workshops | 0 | 1 | 4 | 4 | 3 | 12 | 3.75 | 0.97 |
| CLT staff assistance | 0 | 1 | 1 | 2 | 1 | 5 | 3.60 | 1.14 |
| CFD website | 0 | 0 | 4 | 4 | 2 | 10 | 3.80 | 0.79 |
| CFD workshops | 0 | 0 | 4 | 3 | 2 | 9 | 3.78 | 0.83 |
| CFD staff assistance | 0 | 0 | 1 | 2 | 0 | 3 | 3.67 | 0.58 |

Thematic Analysis of Open-Ended Items

Teaching Support

Analysis of the open-ended items provided by the participants (n = 33) revealed three main themes: frustration, satisfaction with workshops and one-on-one assistance, and lack of need for teaching support. Though quantitative analysis revealed that participants relied mainly on technology support (56.4%), many of the participants' comments focused on pedagogical and design and development support. These findings may be an indication of the overlap among supports.

Though the overall perception of teaching support was positive, some participants indicated a sense of frustration. Several participants noted that the one-week transition to ERT made it difficult for instructors to find and use what they needed. One participant noted:

I didn't really have the opportunity to use all of the resources at the time of the transition in the spring, as I was trying to balance my full-time responsibilities, modifying my course content and learning new technology on the fly.

Others mentioned frustrations with the LMS and having to purchase their own screen-capture and video editing software.

Those respondents who weighed in about workshops and one-on-one assistance had mixed feelings. Some indicated they found the workshops very helpful, while others found the workshops did not always provide them with what they needed, so they sought one-on-one assistance from CLT and CFD staff. One participant stated, "some of the workshops moved too quickly and assumed there was a base level of knowledge that was not there. Other workshops were too basic." This sentiment was echoed by another participant, who said, "some workshops were more informative than others. I found the workshops that provided specific examples to be most helpful to identify if the approach would be appropriate to adopt."

Another common theme was the lack of need for teaching support. Several participants described the transition to ERT as "seamless" and "simple" due to having taught online or having participated previously in a variety of PD opportunities. One participant noted, "I felt very comfortable knowing that the support was there if I needed it, and there were multiple PD opportunities to support faculty available."

Community Support

Community support refers to opportunities provided within or outside of the university for faculty members to connect with each other in formal and informal ways. For example, this type of support could mean reaching out to a peer for social, emotional, or teaching support. Almost half of participants (42.6%) indicated they used community support. Participants were most satisfied with peer collaboration and group support (M = 4.35) and least satisfied with university support networks (M = 3.80) (see Table 7).

Table 7

Community support: Measure of central tendency and spread of Likert-type items

| | Ext Diss (n) | Somewhat Diss (n) | Neither (n) | Somewhat Sat (n) | Ext Sat (n) | Total (N) | M | SD |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Formal support networks developed by the university or dept. | 0 | 3 | 2 | 5 | 5 | 15 | 3.80 | 1.146 |
| Formal support networks developed by external professional organizations | 0 | 2 | 2 | 5 | 4 | 13 | 3.85 | 1.068 |
| Informal, self-directed peer mentoring or collaboration support | 0 | 1 | 2 | 6 | 11 | 20 | 4.35 | 0.875 |
| Informal, self-directed one-on-one peer support | 0 | 1 | 3 | 6 | 11 | 21 | 4.29 | 0.902 |
| Informal, self-directed group peer support | 0 | 1 | 3 | 2 | 11 | 17 | 4.35 | 0.996 |
| Social media networks | 0 | 1 | 3 | 4 | 6 | 14 | 4.07 | 0.997 |

One main theme emerged from the responses: support from colleagues. The responses overwhelmingly aligned with the quantitative results, indicating that individual and peer-group support provided by colleagues was the most used and valued community support. Almost all respondents (n = 16) mentioned that the assistance provided by their peers was very helpful. They communicated with peers in a variety of ways: web-based meetings, phone calls, emails, and messaging apps such as Slack. One participant wrote, "this is the resource I've used the most, mostly informal collaborations and peer support. I get the most benefit from hearing other's ideas of how they have taught online or re-vamped assignments or experiential activities, and then work through how to apply those to my courses." Another wrote, "the community support was key! It involved mostly reaching out to friends/colleagues and saying, 'this is what I'm thinking... what do you think?' or 'What are you doing about X?' I think what made these conversations so valuable was they were based on my schedule and quick."

Organizational Support

A system of rewards for faculty developing and teaching online courses, as well as overall organizational culture, comprises organizational support. This type of support system is different from teaching and community support in the current study because the university had already developed robust and long-standing online teaching and learning efforts. These efforts were largely unchanged during the transition, with the exception of expanded training opportunities and technology tools. With regard to the rewards in place at the time of the transition to ERT, participants indicated that the university did a better job of providing training opportunities (M = 3.19) and equipment and technology tools (M = 3.49) than financial stipends (M = 2.55) and release time (M = 2.26) for developing online courses (see Table 8). With regard to the organizational culture in place, participants agreed that online teaching and learning were supported at the highest administrative levels (M = 3.81). Participants also strongly agreed that they valued the university's online teaching and learning initiatives (M = 4.21) but, conversely, felt that their peers did not value them as much (M = 3.36).

Table 8

Organizational supports: Measure of central tendency and spread of Likert-type items

| | Strongly disagree (n) | Somewhat disagree (n) | Neither (n) | Somewhat agree (n) | Strongly agree (n) | Total (N) | M | SD |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Instructors receive adequate…for developing and teaching online courses | | | | | | | | |
| financial stipends | 11 | 11 | 5 | 6 | 5 | 38 | 2.55 | 1.41 |
| release time | 14 | 12 | 4 | 4 | 4 | 38 | 2.26 | 1.35 |
| training opportunities | 4 | 7 | 13 | 15 | 4 | 43 | 3.19 | 1.12 |
| equipment and tech tools | 3 | 9 | 9 | 11 | 13 | 45 | 3.49 | 1.29 |
| Online teaching and learning are… | | | | | | | | |
| well-respected by university faculty and staff | 2 | 5 | 12 | 12 | 16 | 47 | 3.74 | 1.17 |
| supported at the highest admin levels | 2 | 5 | 13 | 7 | 20 | 47 | 3.81 | 1.23 |
| …value university online teaching and learning initiatives | | | | | | | | |
| My peers | 4 | 8 | 13 | 11 | 11 | 47 | 3.36 | 1.26 |
| I | 1 | 3 | 7 | 11 | 26 | 48 | 4.21 | 1.05 |

Two themes emerged as a result of the analysis: lack of awareness of available organizational support and lack of centralized information. Twenty-seven participants responded to this prompt. A few respondents (n = 9) indicated they felt fully supported at the organizational level, but some (n = 4) indicated that the only organizational support they were aware of or had used was the training opportunities provided by the CLT or CFD. Notably, several (n = 8) respondents pointed out that though they had developed online classes, they never received rewards such as financial stipends, release time, or equipment and technology. At the university, such rewards are typically decentralized at the department level and vary widely. Only faculty who agree to work with the CLT to develop online courses are provided a stipend at the organizational level; therefore, other faculty may be unaware that such rewards exist. One respondent summed this up well: "I do not believe the university does a very good job of telling faculty about the support available."

The other theme that developed was frustration with the lack of centralized dispersion of information. Three respondents indicated that during the transition to ERT, they received too much information from too many sources. One noted, "reading through a flurry of emails, some with very helpful and relevant information, but some repeatedly advertising non-essential meetings, was a bit confusing." Another wrote:

Often, I'd get 3 separate emails from 3 different places with the same information. I became increasingly frustrated with this 'throw everything at the faculty' approach in the hopes that something might be of use. A central repository should have been set up in an organized way, rather than feeling the need to send an email to faculty each time someone had a thought or idea.

Overall Support

At the end of the survey, 35 participants responded to the open-ended prompt: How can the university support your teaching better in technology-enabled environments (e.g., remote, online, hybrid/blended, hybrid/classroom, etc.)? Analysis revealed that most respondents (n = 33) requested additional teaching and organizational support (see Table 9). This is consistent with the quantitative data showing lower satisfaction scores for teaching and organizational supports than for community support. The low number of respondents requesting community support may reflect their perceptions of community support mainly as a self-directed, peer-to-peer social activity rather than a university-supported activity.

Table 9

Recommendations for supports

| Teaching supports | N of responses |
| --- | --- |
| Continue providing good supports | 7 |
| Improve technology tools, equipment, and integration | 4 |
| Provide additional asynchronous resources | 4 |
| Provide hardware, software, Internet for ERT at home | 4 |
| Provide research regarding effectiveness of online teaching | 1 |
| Provide more resources to reduce cheating on exams | 1 |
| Add more CLT support staff | 1 |

| Community supports | N of responses |
| --- | --- |
| Provide more opportunities for peer sharing | 2 |

| Organizational supports | N of responses |
| --- | --- |
| Improve communication (e.g., centralized dissemination of information, tech outages, info for TAs) | 4 |
| Provide more rewards | 3 |
| Reduce class size | 2 |

Discussion

The present study investigates instructors' satisfaction with their university's PD support in assisting with the transition to ERT. Overall, participants indicated they were mostly satisfied with the PD support received. This university has over 30 years of experience in providing quality distance learning and online programs; therefore, it is not surprising that many reported feeling well-supported during the transition to ERT. However, it is evident that PD at this university can be improved, and recommendations for improvement are discussed as implications for practice.

Personalize PD Opportunities to Meet Individual Needs

Participant responses indicated a wide range of teaching-support needs, suggesting the need for personalization. Several participants noted some workshops were too basic while other workshops were too detailed. This may be because many workshops were quickly developed and put in place within a week. Consequently, both the CLT and the CFD were building resources for faculty while delivering the resources. In addition, staff members initially involved in delivering workshops and one-on-one assistance were limited to those accustomed to working closely with faculty. Later, as it became clear that demand greatly outweighed supply, other staff members were brought in to assist.

The disparity in participants' experiences with workshops and other teaching support may also be attributed to a clearer understanding of personal needs. As instructors sought more professional development, they likely became more aware of personal gaps in skills (González-Sanmamed et al., 2014). In other words, additional training may have revealed to instructors that they overestimated certain skills and were not as proficient in some areas as they thought (Brinkley-Etzkorn, 2020; Rhode et al., 2017). By placing priority on instructors' individual needs and focusing on those types of PD offerings, support staff can maximize relevance, increase motivation, and encourage effective transfer of skills (Adnan, 2018; Baran & Correia, 2014; Baran, 2015; Philipsen et al., 2019). One way to address this need is for support staff to administer an initial self-assessment that determines an instructor's individual needs (Rhode et al., 2017).

Provide a Variety of PD Opportunities

Among those who sought PD assistance, a three-tiered process emerged in participant responses. Some (n = 3) described starting with web resources, moving to synchronous and asynchronous workshops, and then seeking one-on-one staff assistance to resolve outstanding questions and needs. One participant summed up this process well: "general materials like the webpages were useful, but only after attending seminars or personal help to navigate them. Even then, I had to turn for personal help more than once to understand what I saw there." HEIs should provide a wide variety of PD opportunities to meet different needs and encourage participation (Elliott et al., 2015; McGee et al., 2017). Opportunities should include one-on-one support with feedback, online resources, and workshops offered in different modalities and feasible durations (Grover et al., 2016; Lackey, 2011; Philipsen et al., 2019).

A key lesson learned for this university was that a more balanced approach to offering PD opportunities was needed. Prior to the pandemic, online resources were minimal, and instructors relied heavily on one-on-one consultations and workshops. The insufficiency of online resources likely contributed to the overwhelming amount of work for the CLT and CFD and the frustrations of instructors unable to get the information they needed in a timely manner.

Contextualize PD

Quantitative and open-ended responses in this survey showed a strong preference for one-on-one support, consistent with findings in other studies (Grover et al., 2016; Philipsen et al., 2019). One-on-one support offers instructors contextualization of information within personal teaching environments. Workshops should offer similar contextualization by engaging instructors in an online environment in an active, hands-on way using case studies (McGee et al., 2017) or creating a useful product (Alexiou-Ray & Bentley, 2015; Gregory & Salmon, 2013; Philipsen et al., 2019). Contextualization also includes creating opportunities for instructors to critically reflect on their roles as educators and their educational philosophies (Baran et al., 2011; Lane, 2013; Philipsen et al., 2019). Self-assessment and reflection on online teaching practices can reshape faculty members' perceptions of the value of online learning (Glass, 2017) and empower them (Baran et al., 2011; Lane, 2013).

Provide Opportunities for Peer Support

During the transition to ERT, participants were more satisfied with community support than with any other type of support. Open-ended item responses revealed instructors valued interactions and shared experiences with their colleagues more than any other type of support, which is consistent with results from other studies (Grover et al., 2016; Lackey, 2011). These findings demonstrate the human need for social and emotional support from peers during a time in which all aspects of life were disrupted by the COVID-19 pandemic. Feeling isolated from others, due to the requirement of maintaining physical distance, is likely the driving factor behind instructors' impulses to collaborate with colleagues (Golden, 2016; Lackey, 2011). A strong desire to informally connect was found in participant responses, which also highlights the suggestion to incorporate community support at all levels of the PDFOT (Scarpena et al., 2018). Peer supports are critical, and evidently this university could build community support by strengthening or developing additional ways for instructors to connect with each other. Opportunities for social engagement should be facilitated by teaching and learning centers that provide opportunities for peer collaboration, sharing of experiences, and mentorship (Alexiou-Ray & Bentley, 2015; Baran, 2015; Glass, 2017; Philipsen et al., 2019; Trust et al., 2017).

Establish Clear Communication at the Organizational Level

Support for PD at the institutional and leadership levels enhances acceptance of online teaching by instructors (Philipsen et al., 2019). Therefore, communication of this support is important. The data in this study highlight the need for improved communication at the organizational level. Participants pointed to the need for improved communication regarding rewards and dissemination of information.


Rewards are critical to faculty buy-in and continued support for online courses (Herman, 2013; Lackey, 2011), but several (n = 6) respondents indicated unawareness of them. As part of a faculty-driven institution, colleges and departments determine which, if any, rewards will be offered to instructors to develop and teach online courses. These responses could also be due to the spending freeze put in place by the university at the beginning of the pandemic in response to uncertainty about future revenue sources. The freeze meant that no course releases were granted, adjunct hiring was suspended, and other rewards were made unavailable to online developers and instructors. Nevertheless, participants' responses highlight the university's need for improved coordination and communication about rewards.

In addition, respondents to the organizational and overall open-ended items (n = 5) described frustration with the lack of a centralized communication strategy for disseminating information about the availability of PD opportunities during the transition to ERT. Another lesson learned by this institution was that the CLT and the CFD should coordinate communications about PD to faculty to prevent confusion and redundancy. The success of PD initiatives is based on creating and clearly articulating an institutional strategy addressing PD standards, resources, and guidance for implementation (King & Boyatt, 2015; Philipsen et al., 2019; UPCEA, 2020). An internal needs assessment could help this university determine PD needs and align support efforts with online faculty competencies (Frass et al., 2017; McGee et al., 2017).

Limitations and Recommendations for Future Research

The PDFOT survey shows adequate to robust internal consistency, and its use in this study elicited information useful to the university for increasing PD support for web-enabled environments. The survey was customized to the support provided by a particular university; therefore, the results are limited in scope and not generalizable. Another limitation of this study is the small number of participants, who, understandably, were exceptionally busy transitioning to ERT during the time the data were collected.
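Internal consistency of multi-item Likert scales is conventionally estimated with Cronbach's alpha; the specific coefficient reported for the PDFOT is not restated here, so the following is only an illustrative sketch with hypothetical respondent-by-item data, not the study's:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of respondent rows (one score per item)."""
    k = len(items[0])  # number of items in the scale

    def var(xs):       # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[i] for row in items]) for i in range(k)]
    total_var = var([sum(row) for row in items])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical responses: 5 respondents x 4 Likert items (NOT the study's data).
data = [
    [4, 4, 5, 4],
    [3, 3, 3, 4],
    [5, 4, 5, 5],
    [2, 3, 2, 2],
    [4, 5, 4, 4],
]
print(f"alpha = {cronbach_alpha(data):.2f}")
```

Alpha rises toward 1 as item-level variance shrinks relative to the variance of respondents' total scores; values around .70 and above are commonly read as "adequate" consistency.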

The institution in this study learned valuable lessons about the types of PD support needed by its faculty during the rapid transition to ERT. Many changes and improvements to PD were made by this university while this study was conducted; thus, a follow-up study using the same PDFOT survey to assess faculty satisfaction with new teaching, community, and organizational support initiatives could provide valuable insights for additional improvements.

For future research, it may be possible to standardize the PDFOT for studies across multiple universities. The PDFOT survey could also be customized for individual HEIs looking to identify areas for improvement in PD offerings. In the current study, open-ended items on the survey provided valuable context to the quantitative data. Future qualitative or mixed-methods studies using interviews could provide richer data for HEIs seeking to improve their PD for those teaching and developing technology-enabled courses.

References

Abdous, M. (2011). A process-oriented framework for acquiring online teaching competencies. Journal of Computing in Higher Education, 23(1), 60-77. doi:10.1007/s12528-010-9040-5

Abdous, M. (2020). Designing online courses as a team: A team-based approach model. International Journal of Online Pedagogy and Course Design, 10(1), 61-73. doi:10.4018/IJOPCD.2020010105

Adnan, M. (2018). Professional development in the transition to online teaching: The voice of entrant online instructors. ReCALL, 30(1), 88-111. doi:10.1017/S0958344017000106

Alexiou-Ray, J., & Bentley, C. C. (2015). Faculty professional development for quality online teaching. Online Journal of Distance Learning Administration, 18(4). Retrieved from https://edtechbooks.org/-PAC

Baran, E. (2015). Investigating faculty technology mentoring as a university-wide professional development model. Journal of Computing in Higher Education, 28(1), 45-71. doi:10.1007/s12528-015-9104-7

Baran, E., & Correia, A. P. (2014). A professional development framework for online teaching. TechTrends, 58(5), 95-101. doi:10.1007/s11528-014-0791-0

Baran, E., Correia, A., & Thompson, A. (2011). Transforming online teaching practice: Critical analysis of the literature on the roles and competencies of online teachers. Distance Education, 32(3), 421-439. https://edtechbooks.org/-MDs

Bozkurt, A., & Sharma, R. C. (2020). Emergency remote teaching in a time of global crisis due to CoronaVirus pandemic. Asian Journal of Distance Education, 15(1), i-vi. https://edtechbooks.org/-PNd

Brinkley-Etzkorn, K. E. (2020). The effects of training on instructor beliefs about and attitudes toward online teaching. American Journal of Distance Education, 34(1), 19-35. doi:10.1080/08923647.2020.1692553

Buckenmeyer, J., Hixon, E., Barczyk, C., & Feldman, L. (2011). Invest in the success of online programs at the university? Mentor professors. Contemporary Issues in Education Research, 4(6), 1-6. doi:10.19030/cier.v4i6.4380

Elliott, M., Rhoades, N., Jackson, C., & Mandernach, J. (2015). Professional development: Designing initiatives to meet the needs of online faculty. The Journal of Educators Online, 12(1), 160-188. doi:10.9743/JEO.2015.1.2

Ferguson, H., & Wheat, K. (2015). Early career academic mentoring using Twitter: The case of #ECRchat. Journal of Higher Education Policy and Management, 37(1), 3-13. doi:10.1080/1360080X.2014.991533

Frass, L., Rucker, R., & Washington, G. (2017). An overview of how four institutions prepare faculty to teach online. Journal of Online Higher Education, 1(1), 1-7. Retrieved from https://edtechbooks.org/-HjVy

Glass, C. (2017). Self-expression, social roles, and faculty members' attitudes towards online teaching. Innovative Higher Education, 42(3), 239-252. doi:10.1007/s10755-016-9379-2

González-Sanmamed, M., Muñoz-Carril, P.-C., & Sangrà, A. (2014). Level of proficiency and professional development needs in peripheral online teaching roles. International Review of Research in Open and Distance Learning, 15(6), 162-187. doi:10.19173/irrodl.v15i6.1771

Golden, J. (2016). Supporting online faculty through communities of practice: Finding the faculty voice. Innovations in Education and Teaching International, 53(1), 84-93. https://edtechbooks.org/-rBGe

Gregory, J., & Salmon, G. (2013). Professional development for online university teaching. Distance Education, 34(3), 256-270. doi:10.1080/01587919.2013.835771

Grover, K. S., Walters, S., & Turner, R. C. (2016). Exploring faculty preferences for mode of delivery for professional development initiatives. Online Journal of Distance Learning Administration, 19(1), 1-12. Retrieved from https://edtechbooks.org/-Ayb

Herman, J. (2013). Faculty incentives for online course design, delivery, and professional development. Innovative Higher Education, 38(5), 397-410. doi:10.1007/s10755-012-9248-6

Hodges, C., Moore, S., Lockee, B., Trust, T., & Bond, A. (2020, March 27). The difference between emergency remote teaching and online learning. Educause Review. https://edtechbooks.org/-QQca

King, E., & Boyatt, R. (2015). Exploring factors that influence adoption of e-learning within higher education. British Journal of Educational Technology, 46(6), 1272-1280. doi:10.1111/bjet.12195

Lackey, K. (2011). Faculty development: An analysis of current and effective training strategies for preparing faculty to teach online. Online Journal of Distance Learning Administration, 14(4), 1-27. Retrieved from https://edtechbooks.org/-JdN

Lane, L. M. (2013). An open, online class to prepare faculty to teach online. Journal of Educators Online, 10(1), 1-32. doi:10.9743/jeo.2013.1.1

Lantz-Andersson, A., Lundin, M., & Selwyn, N. (2018). Twenty years of online teacher communities: A systematic review of formally-organized and informally-developed professional learning groups. Teaching and Teacher Education, 75, 302-315. https://edtechbooks.org/-iLmI

Luo, T., Freeman, C., & Stefaniak, J. (2020). "Like, comment, and share"—professional development through social media in higher education: A systematic review. Educational Technology Research and Development, 68(4), 1659-1683. https://edtechbooks.org/-tiTi

McGee, P., Windes, D., & Torres, M. (2017). Experienced online instructors: Beliefs and preferred support regarding online teaching. Journal of Computing in Higher Education, 29(2), 331-352. doi:10.1007/s12528-017-9140-6

Mohr, S. C., & Shelton, K. (2017). Best practices framework for online faculty professional development: A Delphi study. Online Learning, 21(4). doi:10.24059/olj.v21i4.1273

Philipsen, B., Tondeur, J., Pareja Roblin, N., et al. (2019). Improving teacher professional development for online and blended learning: A systematic meta-aggregative review. Educational Technology Research and Development, 67(5), 1145-1174. https://edtechbooks.org/-eGRG

Reilly, J., Vandenhouten, C., Gallagher-Lepak, S., & Ralston-Berg, P. (2012). Faculty development for e-learning: A multi-campus community of practice (COP) approach. Journal of Asynchronous Learning Networks, 16(2), 99-110. https://edtechbooks.org/-LZH

Rhode, J., Richter, S., & Miller, T. (2017). Designing personalized online teaching professional development through self-assessment. TechTrends, 61(5), 444-451. doi:10.1007/s11528-017-0211-3

Richter, S., & Idleman, L. (2017). Online teaching efficacy: A product of professional development and ongoing support. International Journal of Nursing Education Scholarship, 14(1). doi:10.1515/ijnes-2016-0033

Samarawickrema, G., & Stacey, E. (2007). Adopting web-based learning and teaching: A case study in higher education. Distance Education, 28(3), 313-333. https://edtechbooks.org/-GDk

Scarpena, K., Riley, M., & Keathley, M. (2018). Creating successful professional development activities for online faculty: A reorganized framework. Online Journal of Distance Learning Administration, 21(1), 1-8. Retrieved from https://edtechbooks.org/-zNP

Trust, T., Carpenter, J., & Krutka, D. (2017). Moving beyond silos: Professional learning networks in higher education. The Internet and Higher Education, 35, 1-11. https://edtechbooks.org/-gReT

University Professional and Continuing Education Association [UPCEA]. (2020). UPCEA hallmarks of excellence in online leadership. Retrieved from https://edtechbooks.org/-DoBk

Velez, E. (2015). Administration and faculty: How relationships create and solve the problems. Journal of Applied Learning Technology, 5(4), 37-40. Retrieved from https://edtechbooks.org/-rhX

The Journal of Applied Instructional Design, 10(2). https://dx.doi.org/10.51869/jaid2021102

CC BY: This work is released under a CC BY license, which means that you are free to do with it as you please as long as you properly attribute it.