Monitoring collaborative activities in computer supported collaborative learning

Donatella Persico, Francesca Pozzi, and Luigi Sarti
Istituto per le Tecnologie Didattiche del Consiglio Nazionale delle Ricerche, 16149 Genova, Italy

Publisher: Routledge (Informa Ltd). Online publication date: 21 April 2010. This article was downloaded by: [Sarti, Luigi] on 25 April 2010 [subscription number 921487301].
To cite this article: Persico, Donatella, Pozzi, Francesca, and Sarti, Luigi (2010) 'Monitoring collaborative activities in computer supported collaborative learning', Distance Education, 31: 1, 5–22.
DOI: 10.1080/01587911003724603
URL: http://dx.doi.org/10.1080/01587911003724603
Journal details: http://www.informaworld.com/smpp/title~content=t713412832
Full terms and conditions of use: http://www.informaworld.com/terms-and-conditions-of-access.pdf


Distance Education, Vol. 31, No. 1, May 2010, 5–22
ISSN 0158-7919 print/ISSN 1475-0198 online
© 2010 Open and Distance Learning Association of Australia, Inc.
DOI: 10.1080/01587911003724603
http://www.informaworld.com

Monitoring collaborative activities in computer supported collaborative learning

Donatella Persico, Francesca Pozzi*, and Luigi Sarti

Istituto per le Tecnologie Didattiche del Consiglio Nazionale delle Ricerche, Via de Marini, 6, 16149 Genova, Italy

(Received 31 July 2009; final version received 17 January 2010)

*Corresponding author. Email: [email protected]

Monitoring the learning process in computer supported collaborative learning (CSCL) environments is a key element for supporting the efficacy of tutor actions. This article proposes an approach for analysing learning processes in a CSCL environment to support tutors in their monitoring tasks. The approach entails tracking the interactions within the communication platform to identify cues of the participative, social, cognitive, and teaching dimensions of the learning process. Both quantitative and qualitative indicators are employed to achieve a complete and thorough picture of the learning dynamics. A set of methodological and technological tools based on this approach has been tried out in the context of the online component of a blended course in educational technology addressing trainee teachers. The results of the study support the applicability of the proposed approach to content domains where discussion and reflective practice are the most effective learning strategies.

Keywords: CSCL; monitoring; assessment; evaluation; tutor support; quantitative indicators; qualitative indicators

Setting the scene

The value of online collaborative learning has been increasingly recognised and investigated by researchers and designers of computer supported collaborative learning (CSCL) (see Anderson & Kanuka, 2003; Bransford, Goldman, & Pellegrino, 1991; Dillenbourg, 1999; O’Malley, Suthers, Reimann, & Dimitracopoulou, 2009; Roberts, 2003; Scardamalia & Bereiter, 1994). In CSCL contexts students work online in groups; each group is usually engaged in tasks (discussing a topic, solving a problem, studying a case) with concrete outputs, which act as catalysts for interaction and collaboration with peers. As a consequence the process is intrinsically learner-centred, with the teacher (or tutor, as the person in charge of the overall process in these contexts is usually referred to) acting as a mediator. Techniques or strategies usually adopted in these contexts include discussion, peer review, role play, jigsaw, and case study, with the aim of providing a structure to the activities and fostering collaboration and exchange among peers, while the tutor is in charge of facilitating and orchestrating the emerging group interactions and dynamics (Pozzi, 2009).

The potential benefits of these approaches range from the promotion of critical thinking and conceptual understanding to the enhancement of motivation and the development of group problem-solving abilities. However, in order to further develop both research and practice in this field, a number of key problems are yet to be solved. Among these, the development of effective methods and tools to monitor online collaborative learning is of paramount importance. In fact, monitoring is a crucial activity for informing practice as well as research. It is usually carried out by the tutors and plays a pivotal role in both the management and evaluation of CSCL processes.

As for practice, although tutors generally deploy their own personal tutoring style, effective monitoring is arguably the conditio sine qua non for carrying out their job in an effective way. The tutor’s tasks include providing guidance and support to participants; facilitating access to the learning environment and providing help with its use; mediating between the instructional design decisions and the spontaneous dynamics of the learning group; helping individuals to work collaboratively towards the achievement of common goals; stimulating discussion; and promoting cohesion and mutual understanding among students. To effectively perform these tasks tutors should always have a clear picture of what is happening; they should know who is participating and how. And they need relevant information to identify different learning styles, in order to take advantage of individual abilities and adequately deal with problems. Such information includes data about students’ behaviours, not only active but also reactive behaviours (that is, lurkers and silent participants) (Lobry de Bruyn, 2004). Research on the role of the tutor in CSCL environments has taken a pragmatic standpoint, identifying and demonstrating the wide range of competences and skills a tutor must possess to effectively support learning within a virtual community, as well as investigating and discussing the pros and cons of various tutoring styles (Mason & Kaye, 1989; Rowntree, 1995; Soby, 1992). In spite of their different aims, authors have acknowledged that the work of the tutor is tiresome, time consuming, and often stressful (Collins & Berge, 1996; Conrad, 2004; de Laat, Lally, Lipponen, & Simons, 2007; Salmon, 2004), especially if the student cohort is large. As a consequence, many advocate the need for tools that provide just-in-time information to keep track of students’ activities (Soller, Martínez, Jermann, & Muehlenbrock, 2005).

Some computer mediated communication (CMC) systems provide tools for gathering generic statistics that can be used to monitor participant behaviours, but most of them provide only an incomplete, quantitative view of events. What is needed is a complete, quantitative, and qualitative picture (Caballé, Juan, & Xhafa, 2008).

As far as research is concerned, approaches based on the analysis of the interactions that occur among students and tutors during the learning process have been increasingly adopted to investigate learning dynamics. In most studies, quantitative data extracted directly from the CMC system is juxtaposed with more qualitative information obtained through approaches that usually rely on the content analysis of the messages exchanged (see Aviv, Erlich, Ravid, & Geva, 2003; Daradoumis, Martinez-Monés, & Xhafa, 2004; Hara, Bonk, & Angeli, 2000; Henri, 1992; Kaleidoscope Network of Excellence, n.d.; Lally & de Laat, 2002; Lipponen, Rahikainen, Lallilmo, & Hakkarainen, 2003; Martinez, Dimitriadis, Rubia, Gomez, & De La Fuente, 2003; Rourke, Anderson, Garrison, & Archer, 2001b; Schrire, 2006; Weinberger & Fischer, 2006).

However, these approaches are predominantly manual and considerably time consuming. This has, so far, discouraged their use for monitoring purposes. The idea that the tutors’ work should include extensive content analysis tasks appears unacceptable to those who have experienced the pressure and tension of having to follow each and every learner without missing any situation where intervention is needed. Nevertheless, there are at least three reasons why such research tools can be of great help in the everyday working activities of a tutor. Firstly, they enable a deep understanding of the learning process, rather than the surface picture that is usually drawn with more heuristic methods; and such understanding is needed to pursue and sustain a high quality educational experience. Secondly, they adopt a systematic, unifying approach, which is respectful of tutoring styles but also guarantees uniformity of treatment and assessment of individual students. Finally, they ensure repeatability and transferability of the approach to different contexts and to different, possibly less experienced, tutors.

In this study, the authors developed a theoretical model, along with conceptual and technical tools, that tutors could use in real contexts to go beyond a surface-level understanding of interactions and gain information about the social and cognitive processes that take place in a CSCL environment. This was made possible by the data automatically recorded by the system, including the students’ interactions, enriched with information provided by the tutors themselves, in such a way that they could build and use an overall, dynamic picture of the process at any time.

The aim of this article is to illustrate the origin and features of the model and investigate the feasibility and soundness of the monitoring approach, by providing examples of its use and discussing the outcomes of its application in a real-world situation.

A model for monitoring and evaluating CSCL processes

The approach proposed in this study has been developed from the experience of the authors and previous research work (see Garrison & Anderson, 2003; Henri, 1992). The adopted model is based on the need to provide our tutors with a complete view of students’ performance with minimal information overload.

The Istituto Tecnologie Didattiche – CNR (Italian National Research Council) has been delivering online and blended courses, mostly for in-service and pre-service teachers, since the beginning of the 1990s. In particular, the course in educational technology for pre-service teachers, which provided the context for this study, ran from 2000 to 2007. This course, described in detail in the next section, annually comprised a cohort of about 150 students and employed about half a dozen tutors. Most of the tutors were quite experienced and had developed effective tutoring methods, but had different approaches and styles. Some of them were less experienced and needed training before the course, as well as support during the event. As a consequence, the course designers and the tutors themselves soon perceived the need for a homogeneous and systematic approach. We examined several studies concerning methodologies and tools that can effectively support monitoring and/or evaluation of CSCL processes (e.g., Daradoumis et al., 2004; Garrison & Anderson, 2003; Hara et al., 2000; Henri, 1992; Kaleidoscope Network of Excellence, n.d.; Lally & de Laat, 2002; Lipponen et al., 2003; Martinez et al., 2003; Rourke et al., 2001b). Among these, Garrison and Anderson’s model seemed to best suit our purposes and approaches, except for the fact that it was devised for inquiry learning, while in our context the nature of the learning objectives imposes a prevalence of reflective practice over the inquiry component. For this reason, we adapted Garrison and Anderson’s model to our context as illustrated in Pozzi, Manca, Persico, and Sarti (2007), where the preliminary version of the model we propose here is thoroughly described. This adaptation took into consideration five dimensions along which the collaborative learning process was analysed: the participative, the interactive, the social, the cognitive, and the teaching dimensions. Each dimension of the model is meant to be analysed by looking for indicators of participants’ behaviour within the records of their interactions through the CMC system, taking advantage of the fact that the students’ exchanges and the log files provide information that helps to understand the dynamics of the learning process.

The present version of the model differs from the preliminary one in that it considers four dimensions only, namely the participative, social, cognitive, and teaching dimensions. The information provided by the missing dimension, interactivity, has been integrated and captured in the other four in order to make the model lighter and easier to manage. The degree of interactivity between people was in fact defined by the reciprocal influences in the cognitive processes (Dillenbourg, 1999) and, as such, its investigation addresses aspects such as cross-reference to others’ messages and documents, or co-production of artefacts, which can also be analysed in conjunction with the other dimensions. Besides, the model presented here differs from the original version in the presence of meta-cognitive indicators associated with the cognitive dimension, as these indicators are essential to investigate reflective practice.

The relevant dimensions and indicators of the model can be identified according to the aim of the analysis and the nature and type of the learning experience. Possible aims are the evaluation of the quality of the learning process or aspects of it, the monitoring of students’ performance in order to inform tutor actions, or the assessment of individual learning processes in order to carry out formative and summative evaluation of students’ performances. Besides the range of purposes, other variables may impact on the use of the model, such as the nature of the learning experience. CSCL is used both in formal and informal learning, by adults and young students, in scientific, well-formalised disciplines, or to develop skills in ill-defined, arguable domains. This article focuses on the use of the model for monitoring purposes in initial teacher training on educational technology, a subject where there is little procedural knowledge to be acquired and few formalised laws to be explored through inquiry-based processes.

In the following, we briefly describe the four dimensions of the model. In the subsequent sections, we elaborate on how the model has been used for monitoring in the online component of one of our blended learning courses for trainee teachers. We then discuss the outcomes of the field test, based on a survey of the opinions of and feedback from the tutors who used the model.

The participative dimension

The participative dimension takes into consideration all the actions that learners perform in relation to their virtual presence in the CMC system and their involvement in the learning activities (reading and writing messages, uploading and downloading documents, logging in and out of the environment). This dimension is of fundamental importance in the monitoring process: it gives the tutor an idea of who is participating and how much they are involved in the process. In this dimension, three main categories of indicators are considered: active participation, reactive participation, and continuity. Active participation includes the students’ visible actions, such as sending a message or uploading a document. Reactive participation includes actions that are less visible, but can be tracked by the CMC system, such as reading messages and downloading documents. Reactive participation in the original model (Pozzi et al., 2007) was called passive participation. This change in terminology acknowledges that reading is far from being a passive process, because it calls for significant interpretative involvement from the reader. Finally, continuity is a measure of the frequency and duration of online sessions.

All in all, this dimension focuses on information that is particularly useful to sort out and interpret critical events of the learning process, such as online silence (Zembylas & Vrasidas, 2007). Data concerning participation are of a quantitative nature and can be automatically recorded by the software environment.
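Because these participative indicators are simple counts over system logs, they lend themselves to automatic computation. The sketch below illustrates the idea; the event records and action names are hypothetical, not those of any particular CMC system:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical log events: (user, action, ISO timestamp). A real CMC
# system would have its own event vocabulary and export format.
events = [
    ("anna",  "post_message", "2006-03-01T10:00"),
    ("anna",  "read_message", "2006-03-01T10:05"),
    ("bruno", "read_message", "2006-03-01T11:00"),
    ("anna",  "login",        "2006-03-02T09:00"),
]

ACTIVE = {"post_message", "upload_document"}      # visible actions
REACTIVE = {"read_message", "download_document"}  # trackable, less visible

def participation_indicators(events):
    """Count active/reactive actions and distinct days online per user."""
    stats = defaultdict(lambda: {"active": 0, "reactive": 0, "days": set()})
    for user, action, ts in events:
        day = datetime.fromisoformat(ts).date()
        stats[user]["days"].add(day)  # continuity proxy: session frequency
        if action in ACTIVE:
            stats[user]["active"] += 1
        elif action in REACTIVE:
            stats[user]["reactive"] += 1
    return {u: {"active": s["active"], "reactive": s["reactive"],
                "continuity": len(s["days"])} for u, s in stats.items()}

print(participation_indicators(events))
```

Note that login events feed only the continuity count, while posting and reading feed the active and reactive tallies respectively; a fuller treatment of continuity would also use session durations, which this sketch omits.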

The social dimension

Social presence can be defined as ‘the ability of participants in a community of inquiry to project themselves socially and emotionally, as “real” people (i.e., their full personality), through the medium of communication being used’ (Garrison, Anderson, & Archer, 1999, p. 94). In order to investigate the social dimension, it is therefore necessary to identify cues that testify to affection and cohesiveness. As a consequence, examples of indicators of the social dimension include expressions of emotion or intimacy, the use of vocatives, references to the group, and salutations.

Data concerning the social dimension are of a qualitative nature and can be obtained only through content analysis of the messages exchanged among participants (De Wever, Schellens, Valcke, & Van Keer, 2006). Table 1 shows the indicators of the social dimension and examples of message excerpts for each of them.

The cognitive dimension

Cognitive presence can be defined as ‘the extent to which learners are able to construct and confirm meaning through sustained reflection and discourse in a critical community of inquiry’ (Garrison, Anderson, & Archer, 2001, p. 12). In our model, cognitive presence is revealed by indicators of the various phases of the inquiry process (revelation, exploration, integration, and resolution).

In addition to the indicators proposed by Garrison et al. (2001), given the reflective nature of the tasks proposed in our courses, we felt the need to experiment with a set of meta-reflection indicators. Their interpretation deserves special caution for two reasons: firstly, meta-cognitive and cognitive aspects are often very strictly intertwined (Newman, Webb, & Cochrane, 1995); and secondly, it rarely happens that learners spontaneously manifest meta-cognitive processes, which makes it difficult to obtain this kind of information with interaction analysis methods. Table 2 shows the indicators of the cognitive dimension and examples of message excerpts for each of them.

Table 1. Examples of data and indicators of the social dimension (inspired by Rourke, Anderson, Garrison, & Archer, 2001a).

Indicator    | Data                                            | Examples
Affection    | Expressions of emotions                         | ‘I just can’t stand it when … !!!!’; ‘ANYBODY OUT THERE?’ ;-))
Affection    | Expressions of intimacy                         | ‘What really frustrates me is …’
Affection    | Presentation of personal anecdotes              | ‘When I work, this is what I do …’
Cohesiveness | Vocatives                                       | ‘John, what do you think?’
Cohesiveness | Referring to the group using inclusive pronouns | ‘I think we veered off the track …’
Cohesiveness | Phatics, salutations                            | ‘Hi all!’; ‘That’s it for now!’

As in the case of the social dimension, data concerning the cognitive dimension comprise qualitative information that can be drawn through content analysis.

Table 2. Examples of data and indicators of the cognitive dimension.

Indicator       | Data                                                          | Examples
Revelation      | Recognising a problem                                         | ‘The problem I see here is how to identify …’
Revelation      | Showing a sense of puzzlement                                 | ‘This is not clear to me, I really need clarification.’
Revelation      | Explaining a point of view                                    | ‘In my opinion …’
Exploration     | Expressing agreement or disagreement                          | ‘I agree because …’
Exploration     | Sharing ideas and information                                 | ‘Searching the net I found this interesting paper, that helped me in understanding …’
Exploration     | Negotiating                                                   | ‘What do you think? Do you agree with me?’
Integration     | Connecting ideas                                              | ‘Starting from Mark’s considerations and taking into account John’s warning, I would suggest …’
Integration     | Making synthesis                                              | ‘In the following table I have reported the opinions expressed up to now …’
Integration     | Creating solutions                                            | ‘A possible solution for this could be …’
Resolution      | Referring to real-life application                            | ‘While experimenting this, I realised that …’
Resolution      | Testing solutions                                             | ‘When I tested this approach …’
Meta-reflection | Evaluating own knowledge, skills, limits, cognitive processes | ‘I have to admit that this sounds really hard to me …’
Meta-reflection | Planning, monitoring, or adjusting own cognitive processes    | ‘I think I need more time for this …’

The teaching dimension

Teaching presence can be defined as ‘the design, facilitation, and direction of cognitive and social processes for the purpose of realizing personally meaningful and educationally worthwhile learning outcomes’ (Anderson, Rourke, Garrison, & Archer, 2001, p. 5). In other words, teaching presence is the binding element in cultivating a learning community: messages carry teaching presence when they address objectives such as providing guidance and instruction, facilitating discourse, and managing organisational matters. It should be mentioned that teaching presence is not necessarily reserved for teachers and tutors only: this role is sometimes carried out by students as well, for example, when they provide group leadership or reciprocal support within the learning community. Data concerning the teaching dimension are qualitative data obtainable through content analysis. Table 3 shows the indicators of the teaching presence and examples of message excerpts for each of them.

Context of the field test

The above model was implemented in the 2006 edition of the previously mentioned blended learning course in educational technology, designed and run by the CNR for the Liguria Postgraduate School for Secondary Teaching. The design of the course was informed by a strong need for a modular and flexible learning process capable of meeting the diversified needs of a large and heterogeneous target population (Delfino & Persico, 2007).

The course aimed to enable participants to develop a good degree of mastery of the educational use of information and communications technology and entailed the integration of six face-to-face lectures with 8 weeks of online activity. The latter was carried out via a CMC system: Centrinity FirstClass™. Face-to-face sessions were devoted to laying the bases for both a better understanding of the subject and effective participation in the online activity. Online work was mainly collaborative, with the cohort of students organised in work groups, each supported and coordinated by a tutor. Communication was mostly asynchronous, though synchronous communication in the form of chat was occasionally used.

The online component of the course consisted of five modules (see Table 4).

Table 3. Examples of data and indicators of the teaching dimension (inspired by Anderson et al., 2001).

Indicator              | Data                                                                  | Examples
Direct instruction     | Proposing activities                                                  | ‘The task to be carried out consists in elaborating a shared document …’
Direct instruction     | Diagnosing misconceptions                                             | ‘Remember, Bates is speaking from an administrative perspective, so be careful when you say …’
Direct instruction     | Confirming understanding through assessment and explanatory feedback  | ‘You’re close, but you didn’t account for … this is important because …’
Facilitating discourse | Identifying areas of agreement/disagreement                           | ‘Joe, Mary has provided a compelling counter-example to your hypothesis. Would you care to respond?’
Facilitating discourse | Acknowledging or reinforcing participant contributions                | ‘Thank you for your insightful comments.’
Facilitating discourse | Setting the climate for learning                                      | ‘Don’t feel self-conscious about “thinking out loud” on the forum. This is a place to try out ideas after all.’
Organisational matters | Introducing topics                                                    | ‘This week we will be discussing …’
Organisational matters | Explaining methods                                                    | ‘I am going to divide you into groups, and you will debate …’
Organisational matters | Reminding students of deadlines                                       | ‘Please post a message by Friday …’

Module 1 (2 weeks) was specifically devoted to socialisation among participants and familiarisation with the learning environment. Socialisation among participants was also encouraged throughout the whole course, in a virtual area set up as a follow-up of this module.

Module 2 (2 weeks) was a simulation activity where each team of students was required to act as a committee for awarding a prize to an educational website. To do so, each group had to analyse a couple of popular websites for schools and propose and negotiate criteria to evaluate them, such as correctness, completeness, consistency, and user-friendliness.

Module 3 (3 weeks) was a role play on the integration of educational technology in class work. Trainees were required to put themselves in the shoes of teachers of different types (the behaviourist, the constructivist, the bureaucrat, the technophobe, the technology enthusiast, the school principal) and evaluate the design of an educational activity, in order to identify its strengths and weaknesses and devise ways to overcome the latter. The activity ended with a face-to-face presentation and discussion of the results of the group work.

Module 4 (1 week) was a concluding activity, focusing on participants’ self-evaluation, where they reflected on and assessed their own participation in the course.

Module 5 (6 weeks) was an ongoing activity devoted to discussion and meta-cognitive reflection about the course itself and its method.

Course participants were 112: 86 females (77%) and 26 males (23%). They had very different backgrounds: humanities and arts (38%), mathematics and sciences (32%), human science (8%), and foreign languages (22%); and were aged between 24 and 47 years (average age 31.7, standard deviation 5.5). Trainees worked in small interdisciplinary groups, whose size (around a dozen people each) and composition remained stable for most of the course. Each of the six tutors involved monitored the activities of two groups. The tutors were all quite experienced except for one of them, a first-time tutor.

Technological tools to put the model into practice

Two tools were developed on the basis of the model described above to support the tutors in the monitoring activities: one was dedicated to the collection and display of the quantitative data; the other was devoted to gathering and processing the qualitative data.

The first tool was aimed at supporting monitoring of the participative dimension, providing data for each group and each module. Since this dimension is based solely on quantitative data automatically tracked by the CMC system, it was possible to develop an ad hoc program to extract data from the system, store them in a database together with other information on the course, and present them to the tutors.

Table 4. Structure of the online activities in the blended course on educational technology.

Weeks 1–2: Module 1 – Familiarisation
Weeks 3–4: Module 2 – Simulation on technological resources for education
Weeks 5–7: Module 3 – Role play on integration of educational technology in class work
Week 8:    Module 4 – Conclusion and self-assessment
Weeks 3–8: Follow-up of Module 1 – Socialisation
Weeks 3–8: Module 5 – Meta-reflection

For instance, Figure 1 shows the values of active participation indicators for the members of a group of nine people during Module 3 (Role play on integration of educational technology in class work; see Table 4). Similar tables can be produced by the monitoring software for reactive participation and continuity. The same data can be presented at varying levels of aggregation (for example, time, activity, and participant).

In contrast, for the social, cognitive, and teaching dimensions it is not possible to automatically obtain the information from the CMC system because of the difficulty of identifying an exhaustive set of keywords. As a consequence, in order to monitor these dimensions, the tutors have to analyse the messages while reading them and keep track of the indicators they detect. Although the message coding procedure can only be carried out manually, the tracking process can be supported by the second tool, which keeps track of the indicators, provided that the tutors input them into the database underpinning the monitoring system. On demand, the tool will subsequently display updated quantitative information about the qualitative dimensions and indicators (see Table 5).
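The bookkeeping behind this second tool amounts to tallying, per participant, the indicator codes the tutor enters for each message. A minimal sketch of that tally, with an invented record layout and illustrative code names:

```python
from collections import Counter

# Hypothetical coding records entered by the tutor: one record per
# message, with its author and the indicator codes assigned during
# content analysis. Field and code names are illustrative only.
coded_messages = [
    {"author": "Micaela", "codes": ["exploration", "cohesion"]},
    {"author": "Laura",   "codes": ["revelation"]},
    {"author": "Micaela", "codes": ["integration", "affection"]},
]

def indicator_counts(coded_messages):
    """Tally indicator occurrences per participant (cf. Table 5)."""
    counts = {}
    for msg in coded_messages:
        tally = counts.setdefault(msg["author"], Counter())
        tally.update(msg["codes"])
    return counts

table = indicator_counts(coded_messages)
print(table["Micaela"]["exploration"])  # 1
```

In a deployment like the one described, the records would live in the database underpinning the monitoring system and the counts would be recomputed on demand; the aggregation logic is the same.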

Examples of use

The following provides examples of use of the monitoring model and associated tools, focusing on the interactions of the same group of nine people during Module 3 of our course. The task required students to carry out a 3-week collaborative activity. The tutor coded the messages every day, and used the tools to monitor students and adjust her teaching accordingly.

Table 6 gives an overall view of this group's participation at the end of the activity, as produced by the quantitative tool. The table reports a synthesis of

Figure 1. Active participation monitor. Note: An example of the monitor displaying data concerning the participative dimension (active participation) of a group of nine students in Module 3. The last line shows the data concerning the tutor.


14 D. Persico et al.

Table 5. Data concerning the cognitive, the teaching, and the social dimensions (same group as in Figure 1).

Qualitative analysis

                Cognitive presence                                                    Teaching presence                                          Social presence
Participants    Revelation  Exploration  Integration  Resolution  Meta-reflection    Organisational matters  Facilitation  Direct instruction   Affection  Cohesion
Micaela         2           2            6            0           0                  14                      0             0                    6          8
Ambra           1           1            6            0           1                  11                      0             0                    4          10
Patrizia        1           5            10           0           0                  13                      1             0                    2          3
Maurizio        4           0            5            0           1                  14                      0             0                    2          9
Laura           2           0            7            0           0                  36                      18            8                    8          16
Marina          3           7            11           0           1                  11                      0             0                    8          12
Franca          3           4            5            0           0                  8                       1             0                    5          12
Barbara         1           4            4            0           0                  7                       2             0                    2          4
Valeria         1           4            5            0           0                  6                       1             0                    4          7
Totals          18          27           59           0           3                  120                     23            8                    41         81


quantitative indicators of active and reactive participation, as well as of continuity; data are averaged across students, so as to provide an overall view of the group's participation as a whole.

Similar data may be supplied for individual learners, thus supporting a better understanding of the roles assumed by each of them during the learning activity and consequently allowing their assessment.

In particular, in the example, students sent 235 messages, while the tutor sent 23 messages: a total of 258. The values of the indicators of reactive participation suggest that most students read most of the messages and documents. As for active participation, a mean of 26 messages in 3 weeks is usually considered quite good, although the high values of the standard deviation suggest that there were significant differences among students. The continuity indicators also point to a good degree of involvement. However, the time spent online by each student cannot be directly associated with their commitment to the course, since much of the work (reading, summarising, writing documents, and even writing messages) is usually done offline. More generally, it is evident that information of a qualitative nature is also needed to complement the quantitative information. As already mentioned, in our approach this kind of information derives from the content analysis carried out by the tutor on the messages exchanged by the students. The tutor coded 235 student messages. The unit of analysis was the message; each message could be assigned a maximum of three indicators. A second coder then validated the coding process by analysing a sample of 64 messages (30% of the total). Both coders had been involved in the design of the monitoring model, so no training was required. The inter-rater reliability, calculated using Holsti's coefficient (De Wever et al., 2006), was 0.91 (per cent agreement 0.82).
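Holsti's coefficient, as reviewed by De Wever et al. (2006), is commonly computed as CR = 2m / (n1 + n2), where m is the number of coding decisions on which the two coders agree and n1 and n2 are the numbers of decisions made by each coder. Since each message here can carry up to three indicators, agreement can be computed over sets of indicators per message; a small sketch with invented codings (the indicator names come from the model, the data do not):

```python
def holsti(codes_a, codes_b):
    """Holsti's coefficient of reliability: CR = 2m / (n1 + n2).
    codes_a and codes_b are parallel lists of indicator sets,
    one set per coded message (up to three indicators each)."""
    m = sum(len(a & b) for a, b in zip(codes_a, codes_b))  # shared decisions
    n1 = sum(len(a) for a in codes_a)
    n2 = sum(len(b) for b in codes_b)
    return 2 * m / (n1 + n2)

# Invented codings of three messages by two coders.
coder1 = [{"exploration"}, {"cohesion", "affection"}, {"integration"}]
coder2 = [{"exploration"}, {"cohesion"}, {"integration", "facilitation"}]
print(holsti(coder1, coder2))  # 0.75
```

When both coders assign exactly one code per unit, the formula reduces to simple per cent agreement, which is why the article can report both figures side by side.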

Figure 2 shows the distribution of messages among the three dimensions for thesame group in the same activity.

Table 6. The participative dimension (same group as in Figure 1).

Active participation
  Sent messages (total: 235): Mean 26.11; SD 11.19; Range 6–49
  Uploaded documents (total: 28): Mean 3.11; SD 3.11; Range 0–9
  Attended chats (total: 15): Mean 3.17; SD 1.34; Range 1–5

Reactive participation
  Read messages (total: 258): Mean 233.44; SD 24.47; Range 179–258
  Downloaded documents (total: 29): Mean 15.89; SD 7.05; Range 6–28

Continuity (minutes online)
  Week 1: Mean 349.95; SD 134.97; Range 144.70–571.08
  Week 2: Mean 337.71; SD 126.84; Range 150.23–589.77
  Week 3: Mean 273.51; SD 107.92; Range 77.38–454.02
  Across weeks: Mean 320.39; SD 33.52; Range 273.51–349.95

Note: Group: LS 1; tutor: Francesca; duration: 3 weeks; number of participants: 9.
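The per-indicator summaries in Table 6 are plain descriptive statistics; a minimal sketch of how the quantitative tool might derive them, with invented per-student counts chosen only so that the total (235) and range (6–49) match the reported ones:

```python
import statistics

# Hypothetical sent-message counts for the nine students.
sent = [6, 18, 21, 24, 26, 28, 30, 33, 49]

mean = statistics.mean(sent)
sd = statistics.stdev(sent)  # sample standard deviation
low, high = min(sent), max(sent)
print(f"Total: {sum(sent)}  Mean: {mean:.2f}  SD: {sd:.2f}  Range: {low}-{high}")
```

The same three-line summary, applied per indicator and per group, is all that is needed to populate a table in the style of Table 6.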


This kind of picture helps the tutor check whether the discussion is 'weak' in any dimension and whether intervention is necessary. Intersection areas represent those messages that feature indicators of more than one dimension of the model. These data are cross-compared with the learning task to be performed. For example, during the analysed module, based on a role play, one of the students was specifically asked to act as the coordinator and rapporteur of the group. The assignment for the group consisted of collaboratively writing a report. This required a strong teaching presence from the above-mentioned student and explains why the teaching dimension (which is usually peculiar to the tutor's messages) was so high.
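Since a message may carry indicators of more than one dimension, counts like those in Figure 2 overlap; a sketch of how such a profile could be tallied from coded messages (the indicator-to-dimension mapping is abridged and the message data are invented):

```python
# Abridged mapping from indicators to the dimensions of the model.
DIMENSION = {
    "exploration": "cognitive", "integration": "cognitive",
    "organisational matters": "teaching", "facilitation": "teaching",
    "affection": "social", "cohesion": "social",
}

# Each coded message is a set of indicators (up to three per message).
messages = [
    {"exploration"},
    {"exploration", "cohesion"},
    {"facilitation", "affection"},
    {"integration", "organisational matters", "cohesion"},
]

def dimension_profile(messages):
    """Count how many messages touch each dimension; a message whose
    indicators span several dimensions is counted once in each
    (these are the intersection areas of Figure 2)."""
    counts = {"cognitive": 0, "teaching": 0, "social": 0}
    for msg in messages:
        for dim in {DIMENSION[i] for i in msg}:
            counts[dim] += 1
    return counts

print(dimension_profile(messages))  # {'cognitive': 3, 'teaching': 2, 'social': 3}
```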

It is also interesting to analyse each dimension individually and focus on the number of instances of each indicator. Figure 3 shows the distribution of indicators of the cognitive dimension during Module 3. The individual contributions are reflected in the revelation and exploration indicators. The discussion is expressed in the integration category, while the resolution phase here is practically non-existent, as the activity did not require any testing of the results obtained by the group. In this kind of activity, low values in the integration category should give rise to tutor intervention to ensure that the group achieve a shared understanding of the work done.

Figure 2. Messages concerning the participative, cognitive, teaching, and social dimensions. Note: Number of messages concerning the participative, cognitive, teaching, and social dimensions for the same group as in Figure 1. Values in the intersection areas represent those messages that feature indicators of more than one dimension of the model. The sizes of the circles are not proportional to the numbers of corresponding messages. The messages relevant for the participative dimension were 235, out of which 219 featured at least one teaching, cognitive, or social indicator.


Finally, it is useful to look at how the three dimensions vary over time: Figure 4 provides the values for each dimension in the three weeks of the module. The diagram helps the tutor keep the overall trend of the dimensions under control, in order to monitor whether and how the group is evolving and to take the necessary measures.

Figure 3. Instances of indicators of the cognitive dimension (same group as in Figure 1).

Figure 4. Variation of the qualitative dimensions in time (same group as in Figure 1).


It is worth noting that similar representations of data may focus on individual students, so as to provide an idea of the level of individual contribution and help the tutor identify the attitudes and behaviours of individual learners, thus supporting the management of group dynamics.

Field test method and study outcomes

During the course described above, our model and tools were tried out to assess the feasibility of the approach and obtain feedback for fine-tuning. The following procedure was adopted.

A subset of four of the six course tutors volunteered to participate in the field test of the whole approach, including the qualitative analysis. This group comprised one of the course designers (who was also involved in the development of the model), two very experienced tutors, and a beginner tutor. The other two tutors only used the quantitative tools. Before starting, the tutors received guidelines describing the model, how to carry out the content analysis, and how to use the tools. The tutors monitored their own groups individually, and the tools allowed them to focus on the data concerning each group. However, the tutors did not work in isolation: a forum was available to discuss tutoring and monitoring problems, as well as coding issues. This forum, invisible to the course students, was a useful (and unbiased) source of information about the applicability of the indicators, since it revealed the doubts emerging while using them. During the field test, the tutors also kept a logbook of problems encountered, and at the end they filled in a questionnaire concerning their opinions on the monitoring approach. The questionnaire was composed of eight open-ended questions investigating the tutors' impressions of the usability of the model and the associated tools, and of the impact these had had on the tutoring workload. In addition, the tutors participated in individual interviews aimed at eliciting more details on their experience.

The results of this survey show that the tutors believe their job can be strongly supported by the proposed approach. In comparing their previous, more heuristic way of monitoring their groups with the new approach, some of them pointed out that the extra effort they had to devote to the systematic coding of messages was paid off by some important advantages. In fact, the ongoing availability of the data allows comparison of the performance of different groups and individuals, thus ensuring uniformity of treatment for all students. Further, the use of the model supports inexperienced tutors more systematically and effectively. Some tutors claimed that, without our tools, they sometimes had to go through the students' messages several times because they had forgotten who had written what. All in all, all the tutors declared that they were satisfied with the monitoring experience; their use of the tools was continuous and effective for fine-tuning their teaching. Moreover, they felt the model was much better grounded when based on both quantitative and qualitative data. In particular, the quantitative dimension, which is one of the features differentiating this model from the one proposed by Garrison and Anderson (2003), was considered very useful, especially in the first phases of the course, when participants were getting acquainted with the platform and the group. The number of read messages (a piece of information that not all CMC systems provide) played a key role, because it is an important clue in interpreting online silence and gives a rough idea of the degree to which students are considering each other's contributions. Participation data became less and less meaningful as time went by because, as the


number of messages grew, the qualitative data emerging from the cognitive, the social, and the teaching dimensions became more relevant. However, according to our tutors, the indicators of the cognitive dimension were not easy to apply. This may be due to the fact that the learning community was not involved in fully fledged inquiry activities, and the four phases of the inquiry model (revelation, exploration, integration, and resolution) fitted only partially with the learning tasks proposed in our experiment. For this reason, we have further elaborated the indicators of the cognitive dimension in a direction more coherent with the nature of the tasks usually assigned to our students, leading to the distinction between two main categories of indicators, namely individual knowledge building and group knowledge building, which better reflect the main working phases our activities are usually based on.

Further, our tutors judged the introduction of meta-reflection as an indicator of the cognitive dimension to be appropriate, perhaps because teacher training is a field where self-awareness of one's own learning dynamics is an important prerequisite for developing competence.

Finally, the tutors considered the indicators of the social and teaching dimensions effective and easy to apply.

All in all, the four dimensions of the model won the tutors' approval: they considered them exhaustive and, at the same time, flexible enough to capture the ongoing collaborative process and to make interactivity emerge as a sort of 'super-dimension' permeating the others.

Conclusions and further developments

Monitoring the learning process in CSCL environments is a key element in supporting the efficacy of tutor actions. This article proposes a model that codifies and embodies in one methodology previous research results as well as a wealth of heuristics developed through years of practical experience by several tutors and course designers. The model was reified in a set of monitoring tools, which six tutors tried out in an online, blended learning course for teachers.

The results of the field test were positive: the questionnaire analysis and the interviews with the tutors supported the completeness of the model and its set of indicators, as well as its usability in our context, with the few exceptions mentioned in the previous section.

Although the extra effort required to carry out the content analysis while managing the course was not negligible, the tutors deemed it worthwhile. To further improve the cost–benefit ratio, the use of automatic tools for text analysis could be considered to support the coding procedure. This would of course imply a period of testing, so as to achieve a high degree of coherence between the manual analysis and the semi-automatic process of message analysis. Another option for making the content analysis lighter would be to endow the CMC system itself with features allowing the tutors to tag messages with indicators while reading them. This latter option calls for further study and experimentation. At the moment the authors are carrying out further experimental studies, oriented to supporting tutors with external observers who, after a period of training with the tutors, are in charge of carrying out the analysis on behalf of the tutors themselves.

In addition, some consideration could be given to the developed tools. The tool aimed at extracting quantitative data could be enhanced with the automatic


elaboration of suitable representations (e.g., graphs, tables) that effectively and synthetically display the results of the analysis.

The need to capture and, as far as possible, assign meaning to qualitative and quantitative data calls for a formal representation of the contextual information that describes the course. For instance, the course calendar is relevant in the process of computing the continuity indicator; the directory of participants needs to be frequently accessed to discriminate roles and groups; an accurate description of the planning and structure of activities is the basis for computing the participation indicators. All of this information has to be imported from various sources into the monitoring database. To this end, a number of specifications and tools are being developed and can be adopted to represent CSCL processes. At the moment, we are considering the adoption of IMS-LD (IMS Global Learning Consortium, 2003) and LAMS (Dalziel, 2003) as possible means for describing our courses, and we are planning to work further in this direction.
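The role of the calendar in computing the continuity indicator can be sketched as follows: given the module's start date (hypothetical here, as are the session records), each logged session is assigned to a course week and the minutes online are summed per week:

```python
from datetime import date, timedelta

# Hypothetical course calendar: Module 3 starts on this date and
# is divided into three one-week slots.
MODULE_START = date(2009, 3, 2)

def week_of(session_date, start=MODULE_START):
    """Map a session date onto the module's week number (1-based)."""
    return (session_date - start).days // 7 + 1

def continuity(sessions):
    """Sum the minutes spent online per course week from
    (date, minutes) session records."""
    minutes_per_week = {}
    for session_date, minutes in sessions:
        week = week_of(session_date)
        minutes_per_week[week] = minutes_per_week.get(week, 0) + minutes
    return minutes_per_week

sessions = [
    (MODULE_START + timedelta(days=1), 40),
    (MODULE_START + timedelta(days=3), 25),
    (MODULE_START + timedelta(days=9), 50),
]
print(continuity(sessions))  # {1: 65, 2: 50}
```

Without an explicit calendar such as this, raw session logs cannot be broken down into the weekly figures shown in Table 6.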

Notes on contributors

Donatella Persico is a senior researcher at the Institute for Educational Technology of the Italian National Research Council (CNR). She has been active in the field of educational technology, theory, and applications, since 1981. Her major interests include instructional design, e-learning, self-regulated learning, and teacher training.

Francesca Pozzi is a researcher at the Institute for Educational Technology of the Italian National Research Council (CNR). Her major research interests include the design and implementation of online courses in CSCL, the design of strategies and techniques for fostering online collaboration, and issues in monitoring and evaluating the learning process in CSCL.

Luigi Sarti is a senior researcher at the Institute for Educational Technology of the Italian National Research Council (CNR). His research interests focus on methodologies and techniques for applying ICT to learning processes. In particular, he is active in the field of CSCL in an attempt to adapt and re-interpret technologies for the representation of educational data.

References

Anderson, T., & Kanuka, H. (2003). E-research: Methods, strategies, and issues. Boston, MA: Allyn and Bacon.

Anderson, T., Rourke, L., Garrison, D.R., & Archer, W. (2001). Assessing teaching presence in a computer conferencing context. Journal of Asynchronous Learning Networks, 5(2), 1–17. Retrieved from http://www.aln.org/publications/jaln/index.asp

Aviv, R., Erlich, Z., Ravid, G., & Geva, A. (2003). Network analysis of knowledge construction in asynchronous learning networks. Journal of Asynchronous Learning Networks, 7(3), 1–23. Retrieved from http://www.aln.org/publications/jaln/index.asp

Bransford, J., Goldman, S., & Pellegrino, S. (1991). Some thoughts about constructivism and instructional design. Educational Technology, 31(9), 16–18. Retrieved from http://asianvu.com/bookstoread/etp/

Caballé, S., Juan, A.A., & Xhafa, F. (2008). Supporting effective monitoring and knowledge building in online collaborative learning systems. Lecture Notes in Artificial Intelligence, 5288, 205–214. Retrieved from http://www.springer.com/series/1244

Collins, M., & Berge, Z. (1996). Facilitating interaction in computer mediated online courses. Paper presented at the FSU/AECT Distance Education Conference, Tallahassee, FL. Retrieved from http://www.emoderators.com/moderators/flcc.html

Conrad, D. (2004). University instructors' reflections on their first online teaching experiences. Journal of Asynchronous Learning Networks, 8(2), 31–44. Retrieved from http://www.aln.org/publications/jaln/index.asp

Dalziel, J. (2003). Implementing learning design: The Learning Activity Management System (LAMS). Paper presented at the 2003 ASCILITE Conference, University of Adelaide. Retrieved from http://www.lamsinternational.com/documents/ASCILITE2003.Dalziel.Final.pdf

Daradoumis, T., Martinez-Monés, A., & Xhafa, F. (2004). An integrated approach for analysing and assessing the performance of virtual learning groups. Lecture Notes in Computer Science, 3198, 289–304. Retrieved from http://www.springerlink.com/content/105633/

De Laat, M.F., Lally, V., Lipponen, L., & Simons, R.-J. (2007). Online teaching in networked learning communities: A multi-method approach to studying the role of the teacher. Instructional Science, 35(3), 257–286. doi:10.1007/s11251-006-9007-0

De Wever, B., Schellens, T., Valcke, M., & Van Keer, H. (2006). Content analysis schemes to analyse transcripts of online asynchronous discussion groups: A review. Computers & Education, 46(1), 6–28. doi:10.1016/j.compedu.2005.04.005

Delfino, M., & Persico, D. (2007). Online or face-to-face? Experimenting with different techniques in teacher training. Journal of Computer Assisted Learning, 23(5), 351–365. doi:10.1111/j.1365-2729.2007.00220.x

Dillenbourg, P. (Ed.). (1999). Collaborative learning: Cognitive and computational approaches. Oxford: Elsevier.

Garrison, D.R., & Anderson, T. (2003). E-learning in the 21st century: A framework for research and practice. London: RoutledgeFalmer.

Garrison, D.R., Anderson, T., & Archer, W. (1999). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2–3), 87–105. doi:10.1016/S1096-7516(00)00016-6

Garrison, D.R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. American Journal of Distance Education, 15(1), 7–23. Retrieved from http://www.ajde.com/index.htm

Hara, N., Bonk, C.J., & Angeli, C. (2000). Content analysis of online discussion in an applied educational psychology course (CRLT Technical Report No. 2-98). Retrieved from http://crlt.indiana.edu/publications/techreport.pdf

Henri, F. (1992). Computer conferencing and content analysis. In A. Kaye (Ed.), Collaborative learning through computer conferencing (pp. 115–136). Berlin: Springer.

IMS Global Learning Consortium. (2003). IMS learning design information model. Retrieved from http://www.imsglobal.org/learningdesign/ldv1p0/imsld_infov1p0.html

Kaleidoscope Network of Excellence. (n.d.). Interaction & Collaboration AnaLysis supporting Teachers and Students Self-regulation (ICALTS). Retrieved from http://www.rhodes.aegean.gr/ltee/kaleidoscope-icalts/

Lally, V., & de Laat, M. (2002). Elaborating collaborative interactions in networked learning: A multi-method approach. Paper presented at the Networked Learning 2002 Conference, University of Sheffield. Retrieved from http://www.networkedlearningconference.org.uk/past/nlc2002/proceedings/symp/09.htm

Lipponen, L., Rahikainen, M., Lallimo, J., & Hakkarainen, K. (2003). Patterns of participation and discourse in elementary students' computer-supported collaborative learning. Learning and Instruction, 13(5), 487–509. doi:10.1016/S0959-4752(02)00042-7

Lobry de Bruyn, L. (2004). Monitoring online communication: Can the development of convergence and social presence indicate an interactive learning environment? Distance Education, 25(1), 67–81. doi:10.1080/0158791042000212468

Martinez, A., Dimitriadis, Y., Rubia, B., Gomez, E., & De La Fuente, P. (2003). Combining qualitative evaluation and social network analysis for the study of classroom social interactions. Computers & Education, 41(4), 353–368. doi:10.1016/j.compedu.2003.06.001

Mason, R., & Kaye, A. (Eds.). (1989). Mindweave: Communication, computers and distance education. Oxford: Pergamon Press.

Newman, D.R., Webb, B., & Cochrane, C. (1995). A content analysis method to measure critical thinking in face-to-face and computer supported group learning. Interpersonal Computing and Technology Journal, 4(1), 57–74. Retrieved from http://www.aect.org/Intranet/Publications/ipct-j/index.html

O'Malley, C., Suthers, D., Reimann, P., & Dimitracopoulou, A. (Eds.). (2009). Computer supported collaborative learning practices: CSCL2009 conference proceedings. Retrieved from http://www.isls.org/cscl2009/

Pozzi, F. (2009). Using collaborative techniques in virtual learning communities. Lecture Notes in Computer Science, 5794, 670–675. doi:10.1007/978-3-642-04636-0_66

Pozzi, F., Manca, S., Persico, D., & Sarti, L. (2007). A general framework for tracking and analysing learning processes in CSCL environments. Innovations in Education and Teaching International, 44(2), 169–179. doi:10.1080/14703290701240929

Roberts, T.S. (Ed.). (2003). Online collaborative learning: Theory and practice. Hershey, PA: Idea Group Press.

Rourke, L., Anderson, T., Garrison, R., & Archer, W. (2001a). Assessing social presence in asynchronous text-based computer conferencing. Journal of Distance Education/Revue de l'enseignement à distance, 14(2), 50–71. Retrieved from http://www.jofde.ca/index.php/jde

Rourke, L., Anderson, T., Garrison, R., & Archer, W. (2001b). Methodological issues in the content analysis of computer conference transcripts. International Journal of Artificial Intelligence in Education, 12, 8–22. Retrieved from http://ihelp.usask.ca/iaied/ijaied/

Rowntree, D. (1995). Teaching and learning online: A correspondence education for the 21st century? British Journal of Educational Technology, 26(3), 205–215. doi:10.1111/j.1467-8535.1995.tb00342.x

Salmon, G. (2004). E-moderating: The key to teaching and learning online. London: Taylor & Francis.

Scardamalia, M., & Bereiter, C. (1994). Computer support for knowledge-building communities. Journal of the Learning Sciences, 3(3), 265–283. doi:10.1207/s15327809jls0303_3

Schrire, S. (2006). Knowledge building in asynchronous discussion groups: Going beyond quantitative analysis. Computers & Education, 46(1), 49–70. doi:10.1016/j.compedu.2005.04.006

Soby, M. (1992). Waiting for Electropolis. In A. Kaye (Ed.), Collaborative learning through computer conferencing (pp. 39–50). Berlin: Springer-Verlag.

Soller, A., Martínez, A., Jermann, P., & Muehlenbrock, M. (2005). From mirroring to guiding: A review of state of the art technology for supporting collaborative learning. International Journal of Artificial Intelligence in Education, 15(1), 261–290. Retrieved from http://www.ijaied.org/journal/

Weinberger, A., & Fischer, F. (2006). A framework to analyze argumentative knowledge construction in computer-supported collaborative learning. Computers & Education, 46(1), 71–95. doi:10.1016/j.compedu.2005.04.003

Zembylas, M., & Vrasidas, C. (2007). Listening for silence in text-based, online encounters. Distance Education, 28(1), 5–24. doi:10.1080/01587910701305285
