
Session F2D, 40th ASEE/IEEE Frontiers in Education Conference, October 27-30, 2010, Washington, DC

Learning Science Principles for Effective Online Learning in the Workplace

Daryl Lawton, John Bransford, and Nancy Vye

University of Washington, [email protected], [email protected], [email protected]

Michael C. Richey, Vivian T. Dang, and David E. French The Boeing Company, [email protected], [email protected], [email protected]

Abstract — We present an experiment involving the analysis and redesign of an online course that has been in existence for several years at Boeing. We created an experimental version of the course to explore the differential impact of a mix of learning science frameworks: integrating different types of formative and reflective feedback; minimizing cognitive load by restructuring lecture-derived text-centric materials into short, narrated, student-controlled videos; and using a learning management system (LMS) that supported social interactions and learner collaboration. We found significant differences between the existing course and the new course. Subjects in the experimental course learned more and at a greater rate. They had a more engaged attitude toward their future learning and less dependence on their level of initial subject knowledge. The social collaboration and data collection capabilities of the LMS supported continuous online course improvement during course development, but these tools had limited spontaneous use by students during the experiment. We describe the impact of these findings on developing an evidence-based methodology for implementing online learning experiences, and outline future work.

Index Terms — Online learning, Assessment, Cognitive Load, Expertise, Learning Science, Workplace Learning.

INTRODUCTION

The Boeing Company provided over 5 million hours of instruction in 2009 to more than 150,000 employees in 45 countries. Approximately 40 percent of these courses are partially or totally online, and the proportion of online courses continues to increase. Because of the scale and the impact on learning outcomes this involves, small but effective changes in engineering epistemologies, learning systems, and assessments within the workplace environment can have large cumulative value in terms of increased efficiency and knowledge transfer throughout the enterprise. For these reasons, our goal is to develop an evidence-based methodology for developing online learning experiences that leverages shared resources, processes, and tools. To realize this goal, we have created an industry–university partnership based on the reciprocal and mutually beneficial connections between learning science and the workplace. We are learning scientists from the University of Washington and coaches and instructional developers from The Boeing Company, engaged in a series of research projects to understand and improve workplace learning. Our research focuses on the critical and evolving nature of engineering skills and the intrinsic learning issues involved within a complex adaptive social system. This study is part of a broader research portfolio to explore learning science research and implement evidence-based findings for continuous improvement within a corporate learning community.

BACKGROUND

Workplace products and processes are constantly changing, driven largely by innovation, technology, business economics, and social factors. The global workplace requires effective methods for the continuous creation and adaptation of instruction. A wide range of relevant theories and tools for this are found in the learning sciences — an integrative mix of research in fields such as cognitive psychology, education, neuroscience, and computer science. At the same time, the workplace is an extraordinarily rich place for learning science. The workplace is a non-classroom, often informal learning environment. It is inherently social, multicultural, and international. Workplace learning is inherently collaborative — the best resources, teaching, and ideas frequently come from the workforce and its shared experiences, especially as learners develop expertise and become ideal teachers for novices.

Online learning is an increasingly critical part of learning in the workplace. Having courses continually available online simplifies scheduling. It reduces travel costs and helps leverage the distributed expertise of subject matter experts by allowing them to work virtually. Most students, especially incumbent engineers, are busy throughout the day, so it is much more convenient for them to organize their learning to fit their schedules. Work is increasingly online and virtual, and courses that are offered online can directly incorporate these features.

At the same time, designers of online courses are challenged to create experiences where students learn as well as in instructor-led courses. Feedback is fundamental for learning [1], and research on feedback is especially important for online learning because students can be working on difficult material, on their own, without direct access to a teacher.

Several types of assessments, such as summative, formative, reflective, authentic, and transparent, have different roles in providing learner feedback [2]. A key distinction is between assessments that measure a skill and those that provide guidance to the learner. Black and Wiliam's [3] meta-analytic study contrasted work on summative and formative assessments and found that formative assessment was consistently responsible for sizable gains in learning.

The structure of learner feedback and its integration with practice has been extensively studied in theories of expertise. Ericsson [4] and others have investigated exceptionally skilled individuals across a wide range of areas. Expertise research shows that expert-level skill is not based on innate talent, but on significant practice. Practice for expertise differs from repetitive drill in being deliberate and having an engaged, reflective quality. Practice often occurs on a skill boundary, with conscious exploration to create meaningful failures followed by purposeful feedback akin to debugging. A basic implication is that online learning experiences should involve practice in the actual activity being taught, and that the practice should not be rote drill. Learners need to be engaged in feedback about their own activities so they become increasingly self-regulatory and evaluative.

Limiting confusion is critical in online courses because no teacher is present to clarify things. The clarity and usability of instruction needs to be understood and carefully analyzed. Cognitive load theory [5] provides a basic framework for this, grounded in a general cognitive architecture. A key insight for instructional design is respecting limits on how much information can be held in short-term memory during learning. Mayer has used the theory in extensive studies of the effective use of multimedia [6].

Learner feedback can also come from group interactions and collaboration. Collaboration supports learning in several ways: it increases the repetition of material in new contexts as people work with one another, people understand things better when they explain them to others, and social prestige can become motivating. Course developers can obtain substantial feedback about a course by watching students interact and make their knowledge explicit as part of communicating with each other through chats, discussion boards, and wikis.

The infrastructure for providing feedback and collaboration in online learning is continually improving. Software for online learning is driven, in part, by the general development of productivity, multimedia, and collaborative tools. This has been accelerated by the creation of software components and tools for Internet-based presentations and collaborations. A recent development has been packaging this functionality into Learning Management Systems (LMS), which provide authoring-level interfaces for Internet-based learning experiences. There is an increasing number of LMSs [7], and it is not difficult to find discussions comparing them. There are open questions about whether LMSs are a constraining framework, or even necessary, in light of the diverse Web services that can be integrated as Personal Learning Environments (PLE).

THE ESTABLISHED ONLINE COURSE

For our experiment, we selected an established online Boeing course about Product Data Management (PDM) and its role in Product Lifecycle Management (PLM). The course introduces the integrative database technologies used to organize product design, development, manufacture, and maintenance — all the phases of a product's lifecycle [8-9]. The course presents large orienting concepts and several complex procedures for dealing with prodigious amounts of dynamic data. The existing PDM course is completely online, has no teacher intervention, and requires 8 hours to complete. The PDM course is considered challenging based on feedback from students, instructors, and managers.

The existing course is presented as a series of slides grouped by topic (Figure 1). Each topic is bracketed by slides that introduce the objectives of the topic sequence followed by a review of what was covered. Several of the topics end with a simulation created with Adobe Captivate [10] from PDM screen captures. The course was presented using an intranet-enabled, multimedia presentation system.

I. Analysis of Existing Course

We analyzed the existing course by going through it repeatedly, mapping out when concepts were introduced and where they were referred to. We interviewed students, developers, and instructors who took and developed related PDM courses. From this analysis, we determined several critical issues.

FIGURE 1
SLIDE SEQUENCE FROM ESTABLISHED COURSE. SQUARES INDICATE POSITION OF TOPIC OBJECTIVES SLIDES. VERTICAL BAR INDICATES POSITION OF SCRIPTED SIMULATION.

The assessment in the existing course was done largely by a final test consisting of multiple-choice questions. The assessments provided very limited in-context feedback to students. More feedback was provided by the scripted simulations. However, the simulations did not require interaction beyond clicking at indicated screen locations. Students could easily game these by clicking until they were presented with the correct thing to do. The feedback from these simulations consisted of low-level instructions about what to click or type next.

There were some recurring conceptual blockages for students. The course assumed some familiarity with database concepts, which were not understood functionally by all students. Many of these gaps could be revealed by asking students to sketch the relations between the client software and the server database, and how PDM operations affect the locations and attributes of stored data.

Students found the multimedia presentation system in the established course straightforward to use because navigation basically involved going forward or backward through sequentially ordered slides. However, the presentation system was not intended to function as a full-featured LMS and lacked several features, such as in-context discussion boards and forums, support for groups of students working together on a challenge or activity, and support for students uploading potentially complex work products. There was no facility for asking questions about the material, getting timely feedback, or interacting with others.

There were several sources of unnecessary cognitive load in the existing course. Definitions and concepts were introduced but then never used or referenced. Many of the slides had extraneous graphics added to create visual interest. Students were required to memorize complex operation sequences that, in practice, would be performed using reference materials.

Finally, a major challenge of the PDM course in general was its bootstrap nature: it introduced PDM concepts without having students use the associated PDM tools. Students were introduced to several concepts and procedures, many quite involved, which they would use only after the course.

THE EXPERIMENTAL COURSE

The experimental course was designed around our interest in developing an evidence-based design methodology for online learning and understanding how to structure feedback for students. We also wanted to address the issues from our analysis of the control course. For purposes of comparison with the existing course, and to use identical resources, we retained its constraint of no hands-on use of the PDM tools. The contrasts between the control and the experimental course, with respect to the relevant theoretical frameworks, are presented in Table I. There were several key contrasts.

TABLE I
CONTRASTS BETWEEN ESTABLISHED AND EXPERIMENTAL COURSE

Control Course                   | Experimental Course                                         | Theory/Framework
Reading/Viewing Slides           | Student-Controlled, Focused, Narrated Videos                | Cognitive Load
Mostly Summative Assessments     | Integrated Formative and Reflective Assessments             | Theories of Feedback for Learning
Directed Simulations with Low-Level Feedback | Interpretation of Realistic Screen Shots        | Theories of Expertise
Multimedia Presentation Software | Moodle LMS                                                  | Learning Management Systems (LMS); Personal Learning Environments (PLE)
No Online Interactions           | Support for Extensive Social and Collaborative Interaction  | Computer-Supported Collaborative Learning
Directed Instructional Design    | Star Legacy Challenge Cycle [Modified]                      | Theories of Pedagogic Forms

Learner-Controlled Videos: Instead of a sequence of slide-based materials that students would go through sequentially, we used focused narrated videos, each 2 to 4 minutes in length. Students could start, stop, and replay the videos as they wanted.

Integrated Formative Assessments: The control course relied almost exclusively on a final summative assessment. We organized the experimental course around formative assessments interleaved with the video resources. After viewing a video, students were presented with a formative assessment showing a screen shot that they were asked to interpret based on actual PDM use. The formative assessments were graded, but students were told the assessments could be attempted as often as they wanted to improve any grade, and in fact were encouraged to do so.
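As a concrete illustration of this retry policy, the following is a minimal sketch of best-attempt bookkeeping, analogous to the "Highest grade" grading method available in Moodle quizzes; the function and data-structure names here are hypothetical, not from the course software.

    # Minimal sketch of "attempt as often as you like, keep the best grade"
    # bookkeeping. Names (gradebook, record_attempt) are illustrative.

    def record_attempt(gradebook, user_id, quiz_id, score):
        """Record a quiz attempt, keeping only the learner's best score."""
        key = (user_id, quiz_id)
        gradebook[key] = max(score, gradebook.get(key, 0.0))

    gradebook = {}
    record_attempt(gradebook, "u001", "pdm_search_quiz", 0.6)
    record_attempt(gradebook, "u001", "pdm_search_quiz", 0.9)  # retry improves grade
    assert gradebook[("u001", "pdm_search_quiz")] == 0.9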

Reflective Assessments: At the end of each module in the experimental course, students were presented with concepts or pictures from the module and asked to evaluate their level of understanding on a 5-point scale. Immediately following this self-evaluation, links back to relevant course resources were provided for students who felt uncertain.

Learning Management System: Instead of course software built to provide multimedia content over an intranet, we used Moodle 1.9, a widely used, open-source LMS with a wide range of features [11]. These include substantial functionality for development and administration, including detailed logging of student activities, a question database, and a wide range of social features (wikis, discussion boards, chats, and groups).
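To give a sense of what this activity logging makes possible, here is a small sketch of tallying per-student actions from a Moodle 1.9-style log table (Moodle 1.9 records events in a table conventionally named mdl_log). The sketch mimics a small slice of that schema in an in-memory SQLite database with sample rows; a production Moodle instance would be queried through its own database server.

    import sqlite3

    # Mimic a slice of Moodle 1.9's mdl_log schema with sample rows.
    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE mdl_log
                    (time INTEGER, userid INTEGER, course INTEGER,
                     module TEXT, action TEXT)""")
    sample = [
        (1262649600, 7, 42, "quiz", "attempt"),
        (1262649900, 7, 42, "quiz", "review"),
        (1262650200, 9, 42, "resource", "view"),
    ]
    conn.executemany("INSERT INTO mdl_log VALUES (?, ?, ?, ?, ?)", sample)

    # Count logged actions per student for one course: a building block
    # for the usage analyses described in the text.
    for userid, n in conn.execute(
            "SELECT userid, COUNT(*) FROM mdl_log "
            "WHERE course = 42 GROUP BY userid ORDER BY COUNT(*) DESC"):
        print(f"student {userid}: {n} logged actions")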

The design of the experimental course was strongly influenced by the Star Legacy Challenge Cycle [12] and Understanding by Design (UbD) [13], in which learning experiences are organized around essential concepts or activities (called Big Ideas or Challenges). The Star Legacy Challenge Cycle stresses the use of focused but loosely coupled resources that students freely explore, as opposed to a directed, lecture-based narrative; this is reflected in our use of video resources and formative activities.



FIGURE 2
PORTIONS OF A MODULE FROM EXPERIMENTAL COURSE SHOWING THE COURSE DASHBOARD, MODULE BIG IDEA, VIDEO LINK ICONS FOLLOWED BY FORMATIVE ASSESSMENTS, AND A FINAL REFLECTIVE ASSESSMENT.

The experimental course was broken into a series of modules, each corresponding to a significant concept (PLM history, product and manufacturing configuration) or skill (searching, controlling documents). The typical format of a module is shown in Figure 2. The experimental course Web page is divided into two columns. The right-hand side is a narrow dashboard of features for navigating the course, tracking work, and social interaction, including discussion boards, personal Web pages, tags, and various ways of finding and contacting other people in the course and asking questions. There are also links to course- and module-specific discussion boards and to each student's personal Web page.

The wider column on the left presents the module content. Each module begins with a Big Idea expressed as a conceptual overview diagram, followed by a short narrative description with highlighted vocabulary and concepts. Next come Resources and Activities, consisting of the short videos (indicated by icons showing a frame from the video) interspersed with the formative assessments, which are indicated by the Moodle quiz icon (a red check mark). Modules conclude with a Summing Up section containing a reflective assessment.

THE EXPERIMENT

Subjects were recruited through an internal Web site used to access online courses. People intending to register for the existing course were presented with the opportunity to take part in a study to improve online learning at Boeing. Subjects were then randomly assigned to either the control course or the experimental course.

Comparison between the control and experimental groups was based on identical pretests and posttests, each structured into three parts. The first part was a series of multiple-choice questions. The second part was a series of screen shots of the database user interface and associated dialogs, which students were asked to analyze. The last section contained brief essay questions. Questions could have multiple parts, allowing for fractionally correct answers. Pretest and posttest scores were scaled to 100%.
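The multi-part, fractional-credit scoring can be summarized with a short sketch; the equal weighting of questions here is an illustrative assumption, not the study's actual rubric.

    def score_test(questions):
        """Score a test whose questions have independently credited parts.

        `questions` is a list of (parts_correct, parts_total) pairs; each
        question contributes the fraction of its parts answered correctly.
        The result is scaled to 0-100%.
        """
        credit = sum(correct / total for correct, total in questions)
        return 100.0 * credit / len(questions)

    # A 3-question test: full credit, 2 of 4 parts, and 1 of 2 parts.
    print(score_test([(1, 1), (2, 4), (1, 2)]))  # -> 66.66...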

Subjects could take a survey after the posttest regarding their attitudes toward the course they took. The first part consisted of several statements that subjects rated on a 5-point scale (1 [Strongly Disagree] to 5 [Strongly Agree]). Scores on these questions were combined into four composite measures: (1) overall satisfaction, (2) attitudes toward future learning and use of PDM tools, (3) negative impressions, and (4) impressions of the course software. The second part of the survey collected comments on the course.
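Composite measures of this kind are typically the mean of their constituent Likert items; a minimal sketch follows, in which the item-to-composite mapping is a hypothetical stand-in for the actual survey.

    # Hypothetical mapping of survey items to the four composites; each
    # composite score is the mean of its items' 1-5 Likert responses.
    COMPOSITES = {
        "satisfaction": ["q1", "q4"],
        "future_learning": ["q2", "q7"],
        "negative_impressions": ["q3", "q5"],
        "software_impressions": ["q6", "q8"],
    }

    def composite_scores(responses):
        """responses: dict of item id -> 1-5 rating for one subject."""
        return {name: sum(responses[q] for q in items) / len(items)
                for name, items in COMPOSITES.items()}

    print(composite_scores(
        {"q1": 4, "q2": 5, "q3": 2, "q4": 4,
         "q5": 1, "q6": 3, "q7": 4, "q8": 3}))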

RESULTS

In the experimental group, 38 people took the pretest. Of these, 22 finished the posttest; 1 was removed from the analysis for taking more than 3 hours on the posttest (the posttest required an hour). The mean pretest score of the original 38 was roughly 62%; for those who completed the experimental course, it was 61.45%.

In the control group, 38 people took the pretest. Of these, 24 finished the posttest; 2 were removed from the analysis for taking more than 3 hours on the posttest. The mean pretest score of the original 38 was also roughly 62% — essentially identical to the experimental group. For those who completed the control course, the mean pretest score was 67.18%.

The results of a split-plot repeated measures ANOVA are in Table II. There is a statistically significant main effect of test (pretest vs. posttest), indicating that people learned in both courses (F(1, 41) = 107.26, p < .001). There is also a significant interaction between test scores and group (control vs. experimental) (F(1, 41) = 12.21, p = .001). The mean scores on the pretest were 61.45 (experimental) and 67.18 (established). The mean scores on the posttest were 79.93 (experimental) and 76.39 (established). The effect size for the interaction was large by Cohen's criteria [14] (partial η² = 0.229, power = .927). The sphericity assumption was upheld, with Mauchly's W, Greenhouse-Geisser, Huynh-Feldt, and the lower bound all equal to 1.0 and all yielding equivalent results. These results indicate that people in the experimental course learned more, and at a greater rate, than subjects in the control course.
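A split-plot (mixed) analysis of this kind can be reproduced from long-format data; the sketch below uses the pingouin statistics package, with hypothetical column names. Note also that the reported effect size follows directly from Table II: partial η² = SS_interaction / (SS_interaction + SS_error) = 461.62 / (461.62 + 1550.25) ≈ 0.229.

    import pandas as pd
    import pingouin as pg

    # Long-format data: one row per subject per test occasion.
    # Column names are illustrative, not from the study's actual dataset.
    df = pd.read_csv("pdm_scores.csv")  # columns: subject, group, test, score

    aov = pg.mixed_anova(data=df, dv="score",
                         within="test",      # pretest vs. posttest
                         between="group",    # control vs. experimental
                         subject="subject")
    print(aov[["Source", "F", "p-unc", "np2"]])  # np2 = partial eta-squared

    # Sanity check of the reported interaction effect size from Table II:
    print(461.62 / (461.62 + 1550.25))  # ~0.229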


TABLE II
SPLIT-PLOT ANOVA TABLE

Source             |       SS | df |      MS |      F |    Sig
Within Subjects    |  6067.19 | 43 |         |        |
  Tests            |  4055.32 |  1 | 4055.32 | 107.26 | <.0001
  Groups x Tests   |   461.62 |  1 |  461.62 |  12.21 |   .001
  Subjects x Tests |  1550.25 | 41 |   37.81 |        |
Between Subjects   |  6294.71 | 42 |         |        |
  Groups           |    25.75 |  1 |   25.75 |   0.17 |   .682
  Pooled Subjects  |  6268.96 | 41 |  152.90 |        |
Total              | 12361.90 | 85 |         |        |

Figure 3 shows scatter plots of the relation between pretest (predictor variable) and posttest (dependent variable) scores for the established and the experimental courses. Adjusted r² = .617 for the established course, while adjusted r² = .242 for the experimental course. Overall, the reduced correlation between pretest and posttest scores in the experimental course indicates that performance there had less dependence on initial knowledge.
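Per-group adjusted r² values like these come from an ordinary least squares fit of posttest on pretest within each group; a sketch using statsmodels, with hypothetical column names:

    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("pdm_scores_wide.csv")  # columns: subject, group, pre, post

    # Regress posttest on pretest separately for each course and report
    # the adjusted R^2, as in Figure 3.
    for group, grp in df.groupby("group"):
        fit = smf.ols("post ~ pre", data=grp).fit()
        print(group, round(fit.rsquared_adj, 3))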

Twenty-one people completed the survey from the experimental course and 18 from the control course. The questions and mean responses for the composite scores are shown in Figure 4.

FIGURE 3
SCATTER PLOTS OF PRETEST (HORIZONTAL) VS. POSTTEST (VERTICAL) FOR ESTABLISHED (LEFT, R² = .62) AND EXPERIMENTAL (RIGHT, R² = .24) COURSES.

FIGURE 4
MEANS OF COMPOSITE SURVEY SCORES FOR COURSE SATISFACTION, FUTURE LEARNING ATTITUDE, NEGATIVE IMPRESSIONS, AND LMS IMPRESSIONS. EXPERIMENTAL COURSE IS THE LEFT BAR. 1 (STRONGLY DISAGREE), 2 (DISAGREE), 3 (NEUTRAL), 4 (AGREE), 5 (STRONGLY AGREE).

DISCUSSION

The results show a significant improvement in overall learning in the experimental course with respect to the control course, with less dependence on students' initial knowledge. An obvious but critical implication is that online courses, even those based on the same material, are not equivalent just by virtue of being online. The scale of improvement seen here is similar to that described in the meta-analysis by Black and Wiliam [3] comparing formative to summative assessments. It is interesting that we find similar effects in an autonomous, non-classroom online course without direct teacher involvement.

The survey questions regarding course attitude show students had a higher degree of satisfaction with the experimental course, with less of an overall negative impression (Figure 4). They also had a better attitude toward future learning and use of PDM technology. A wide range of opinions was expressed in the survey comments for both courses. A few things were repeated with some consistency: (1) people in both courses wanted hands-on use of PDM tools, (2) people appreciated the videos in the experimental course and the simulations in the control course, and (3) there were several complaints regarding navigation problems with the LMS in the experimental course and not having enough time to become familiar with its features while using it for the first time. This was less of an issue with the presentation system used in the control course, where navigation basically meant going to the next screen. While the experimental LMS was favored in the overall ratings, there appears to have been greater cognitive load associated with it due to more complex features and navigation issues. It would be interesting to see whether this decreases with use over several courses. Additionally, the LMS we used (Moodle 1.9) is being updated with improved navigation features (Moodle 2.0).

An open issue is why, of the people who completed both courses, those in the experimental course had lower pretest scores, and whether this had any significance. Perhaps the experimental course features made it more attractive or accessible to learners with less initial background (or, conversely, less attractive to learners with more background). The regression analysis (Figure 3) is useful in this regard: it clearly shows less correlation between final knowledge and initial knowledge for students in the experimental course than in the control course, where initial knowledge is often the dominant factor in student performance.

The experimental course presented students with several social and collaborative features. Students were strongly encouraged to use them to ask questions and interact with others. We found these were not spontaneously or widely used. One reason for this became clear once the experiment was completed: the recruitment process was not set up to schedule people to be online at the same time. At several points, only one person was online in the experimental course.

Even so, it was surprising that people made limited use of the social features for asynchronous interaction. In the Moodle LMS, it is simple to set up a personal Web page so that you are publicly identified as a course participant. But of the 21 subjects who completed the experimental course, only 5 (~24%) identified themselves and left a brief identifying comment on their personal Web page. This was far from an onerous thing to do; the fact that most people did not implies they chose not to.

There could be many reasons for the hesitancy to establish an online course presence. People are busy and not interested in spending more time than needed. There are also concerns about using unfamiliar social network software. There is a lot of passive online viewing even in open, collaborative learning communities. This does indicate that active online collaborative learning requires more explicit design and organization.

It is valuable to focus on the people who did set up profiles and supplied feedback about the course through the discussion boards and direct comments to the course developers. We came to refer to these as spontaneously active online collaborators, and realized they are valuable resources that an online course helps discover. They are a community of potential beta-testers for other courses and sources of additional course resources, which can be obtained through interviews about their use of PDM tools. They can be treated as seeds for small-world networks [15] by promoting their presence in the online course: minimally by keeping their discussion posts, but also by providing them with a blog or highlighting them for contact by others who take the course in the future.

CURRENT AND FUTURE WORK

A basic direction for future work is to apply and refine these methods with other online courses, especially ones that involve more direct application of concepts to work situations. Earlier work in our group dealt with the impact of small collaborative groups on creating social networks and sharing future knowledge [16]. The work here shows the impact of learning science design principles for structuring learner feedback in online courses, as well as providing baseline information about online collaboration. These approaches should be combined in learning experiences built around authentic collaborative tasks, where social interactions are mediated by online technology so they can scale throughout the enterprise.

The formative and reflective assessments in the course provide a detailed, time-stamped data stream of learner activity. Several types of information about individuals and their collective course behavior can be derived from this stream and from the differences in time stamps between student actions. A rich area for future work is applying quantitative usability and data mining techniques to these data.
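As a first step in that direction, inter-action time gaps per student can be computed directly from such a stream; a sketch with pandas, in which the event-log columns are hypothetical:

    import pandas as pd

    # Time-stamped learner activity: one row per logged action.
    # Columns are illustrative (subject, timestamp, action).
    log = pd.DataFrame({
        "subject":   ["u001", "u001", "u001", "u002", "u002"],
        "timestamp": pd.to_datetime(["2010-03-01 09:00", "2010-03-01 09:03",
                                     "2010-03-01 09:20", "2010-03-02 14:00",
                                     "2010-03-02 14:45"]),
        "action":    ["view_video", "quiz_attempt", "quiz_attempt",
                      "view_video", "reflective_assessment"],
    })

    # Gap between consecutive actions for each student; long gaps can flag
    # points where learners stall or leave the course.
    log = log.sort_values(["subject", "timestamp"])
    log["gap"] = log.groupby("subject")["timestamp"].diff()
    print(log[["subject", "action", "gap"]])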

ACKNOWLEDGMENT

This work was supported by The Boeing Company and the NSF (SBE-0354453). The information presented here does not represent the official policy of the NSF.

REFERENCES

[1] Hattie, J. and Timperley, H., "The Power of Feedback", Review of Educational Research, Vol. 77, No. 1, 2007, pp. 81-112.
[2] Pellegrino, J. W., Chudowsky, N., and Glaser, R., Knowing What Students Know: The Science and Design of Educational Assessment. National Academies Press, 2001.
[3] Black, P. and Wiliam, D., "Assessment and Classroom Learning", Assessment in Education, Vol. 5, No. 1, 1998.
[4] Ericsson, K. A., Charness, N., Feltovich, P., and Hoffman, R. R., The Cambridge Handbook of Expertise and Expert Performance. Cambridge, UK: Cambridge University Press, 2006.
[5] Sweller, J., Van Merriënboer, J., and Paas, F., "Cognitive architecture and instructional design", Educational Psychology Review, Vol. 10, 1998, pp. 251-296.
[6] Mayer, R. E. and Moreno, R., "Nine ways to reduce cognitive load in multimedia learning", Educational Psychologist, Vol. 38, No. 1, 2003, pp. 43-52.
[7] Learning management system, http://en.wikipedia.org/wiki/Learning_management_system. Last accessed Jan 5, 2010.
[8] ENOVIA, http://www.3ds.com/products/enovia. Last accessed Jan 5, 2010.
[9] Stark, J., Product Lifecycle Management: 21st Century Paradigm for Product Realization. Springer, 2004.
[10] Captivate, http://www.adobe.com/products/captivate/. Last accessed Feb 20, 2010.
[11] Moodle, http://moodle.org/. Last accessed Jan 5, 2010.
[12] Schwartz, D., Lin, X., Brophy, S., and Bransford, J. D., "Toward the development of flexibly adaptive instructional designs", in C. M. Reigeluth (Ed.), Instructional Design Theories and Models: A New Paradigm of Instructional Theory (Vol. II), pp. 183-213. Mahwah, NJ: Lawrence Erlbaum Associates, 1999.
[13] Wiggins, G. and McTighe, J., Understanding by Design, Expanded 2nd Edition. Prentice Hall, 2005.
[14] Cohen, J., Statistical Power Analysis for the Behavioral Sciences. Hillsdale, NJ: Lawrence Erlbaum Associates, 1988.
[15] Watts, D. J. and Strogatz, S. H., "Collective dynamics of 'small-world' networks", Nature, Vol. 393, June 1998, pp. 440-442.
[16] O'Mahony, T. K., Bransford, J. D., Vye, N., Lin, K. Y., Richey, M., Soleiman, M., and Dang, V., "Learning and Collaboration in Fast Changing Environments", Journal of the Learning Sciences, in review, 2009.

AUTHOR INFORMATION

Daryl Lawton, Graduate Student, Learning Sciences, College of Education, University of Washington, [email protected].

John Bransford, Professor, Learning Sciences, College of Education, University of Washington, [email protected].

Nancy Vye, Senior Learning Scientist, LIFE Center, University of Washington, [email protected].

Michael C. Richey, Associate Technical Fellow, Learning, Training and Development, The Boeing Company, [email protected].

Vivian T. Dang, Engineering Education Project Leader, Learning, Training and Development, The Boeing Company, [email protected].

David E. French, Curriculum Development Integrator, Learning, Training and Development, The Boeing Company, [email protected].