Measuring self-regulation 1
Measuring self-regulation in complex learning environments
Paper for ICO Toogdag 2006*
Maaike Endedijk**, Jan Vermunt, Mieke Brekelmans and Perry den Brok
IVLOS, Utrecht University
Nico Verloop
ICLON, Leiden University
* This is work in progress. If you want to make a reference to this paper, please
contact the presenting author for the most recent version.
** Presenting author - Contact information: [email protected] / 030 2532269
Abstract
This paper describes an exploratory study of the quality of instruments for measuring
self-regulated learning in complex learning environments. Four different instruments
were developed for the context of teacher education: a portfolio, a portfolio interview, a
task-based questionnaire and an interview about concrete learning experiences. A
small sample of eight student teachers participated voluntarily to test the instruments. To
assess their quality, the instruments were evaluated on several content criteria and
practical aspects. The results show that the portfolio and portfolio interview do not
measure all aspects of self-regulated learning that are relevant in this context. The
task-based questionnaire developed here could be a good instrument for measuring
self-regulated learning in the context of teacher education in future research.
Measuring self-regulation in complex learning environments
In recent decades, research on self-regulated learning (SRL) has
increased enormously. Different models have been developed to describe the active,
constructive process whereby learners set goals for their learning and attempt to
monitor, regulate and control their cognition, motivation, and behaviour, guided and
constrained by their goals and the contextual features in the environment (Pintrich,
2000). There is considerable agreement about the importance of self-regulation in
human survival, but there has been disagreement about how it can be analyzed and
defined in a scientifically useful way (Zimmerman, 2000). The characterization of SRL,
in particular, is still a matter of ongoing debate. Is self-regulation solely a general
disposition of the learner, or is it domain- or situation-specific? We agree with
Alexander (1995) that the construct of SRL can no longer be separated from the
situation or context in which it occurs. This has become all the more relevant now that
more and more curricula demand SRL from their students. These new curricula
are based on constructivist theories of learning in which students’ learning activities
are under control of the learner, instead of the external control of students’ learning
activities in traditional instructional design theories (Vermunt, 1998). Most of these
new didactics are developed in higher education, but are also introduced in primary
and secondary education (Vermunt, 2006).
Nowadays, students are expected to master lifelong learning skills in order to
be able to regulate their own learning once they are working in their fields of
expertise (Van Eekelen, Boshuizen and Vermunt, 2005). “Lifelong learning has
become the catch-cry of the new millennium” (Cornford, 2002, p. 357), and that
makes it more important to acquire skill in thinking activities that make students
capable of assimilating new knowledge in order to deal with the huge amounts of
information that they are confronted with in their work (Vermunt, 1996). For
example, former models of vocational education were front-end: they required a
period of formal education and/or training to be completed by entrants to an
occupation before they could be regarded as qualified workers (Hager, 2004). Models
for vocational education are now more often based on forms of dual learning. In
dual learning programmes, students combine two types of learning environments:
studying at the university (or another educational institute) and learning from practice
(Vermunt, 2003).
All university-based teacher education programmes in The Netherlands are
designed in dual learning programmes and have the format of one-year
postgraduate programmes. Student teachers spend half of their time on teaching
practice at a secondary school. Sometimes they are fully responsible for the teaching
in some classes from the first day. Student teachers monitor their improvements on
different aspects of their development in their portfolio. They make self-evaluations
to assess their competencies and a personal development plan to direct their learning
process. The concerns and learning needs of the student teachers, rather than
subject-logical arguments, form the basis for the sequence in which subject matter is
offered (Vermunt, 2003). Together, this leads to a higher demand on self-regulation
skills. But although curricula require a lot of self-regulation from students, students
do not necessarily develop positive forms of self-regulation (Evensen, Salisbury-Glennon
and Glenn, 2001). When students enter a new type of education, there may be a
temporary misfit or friction between the students’ learning conceptions, orientations
and strategies and the demands of the new learning environment (Vermunt and
Verloop, 1999).
According to Boekaerts and Minnaert (1999), there is an urgent need to explore
whether students rely on different regulation modes when working in formal and
informal learning settings. Until now, for the most part theorists in educational
psychology have focused on describing how students form and maintain learning
intentions, but there is little information about students’ actions and efforts at
regulation when they are not so mindfully engaged in learning (Boekaerts and
Corno, 2005). Because most research on SRL has focused on student learning in
traditional instructional settings in primary, secondary and higher education,
instruments for measuring self-regulation have also been developed in these contexts.
However, we should be able to measure SRL in more complex learning environments
as well, in order to support students in their preparation for lifelong learning.
In this paper we will try to answer the following question: What is the quality of
different instruments to measure self-regulated learning in dual learning programmes? For
this purpose, we will develop and compare four instruments for measuring self-
regulation in teacher education. The selection of these instruments will be based on
theoretical and empirical findings from two fields of research. First, we will describe
characteristics of (self-regulated) learning in the context of teaching and teacher
education in more detail and explore how research in this field has been conducted to
date. Second, we will look at the kind of instruments that have been developed
for assessing self-regulation in the context of academic learning and evaluate them in
terms of the usability for our research context.
Theoretical framework
The theoretical framework consists of an analysis of two fields of research. First,
theories and research about SRL in the context of teaching and teacher education will
be described. The specific characteristics of (student) teacher learning will be portrayed,
along with the instruments and results of the few studies that have been conducted in
this field. After that, a classification and evaluation of the instruments that are
generally used to measure self-regulation will be presented.
Self-regulated learning in the context of teaching and teacher education
Two studies have identified important differences, relevant for measuring SRL,
between teacher learning and learning in conventional settings. Lin, Schwartz and
Hatano (2005) mentioned three underlying features of tasks for which most
metacognitive interventions are made, which are missing in the reality of teaching
practice: Tasks in traditional instructional contexts are well defined and value free,
the environments for which students are prepared are fairly stable and the trainees
and instructors share common learning goals and values. They conclude that
conventional applications of metacognition fall short when it comes to the challenges
teachers often face. Also Van Eekelen et al. (2005) identified some features of teacher
learning: Teacher learning, like other workplace learning, is complex and relational.
A large number of interrelated factors affect the effectiveness of learning. Much of
that learning is unplanned and serendipitous, and does not have preset objectives or
easily identifiable outcomes. Sometimes learning has significance over a very long
timescale.
Only a few empirical studies into (student) teachers’ self-regulation in in-
service teacher education have been done. Studies often used questionnaires or
interviews to detect conceptions of or general approaches to self-regulated learning
(Kremer-Hayon & Tillema, 1999; Boulton-Lewis, Wilss and Mutch, 1996; Oolbekking-
Marchand, Van Driel and Verloop, 2006) or to give suggestions for improving
teacher education (Taks, 2003; Lin et al., 2005; Tillema and Kremer-Hayon, 2002).
The actual regulation activities of (student) teachers have seldom been studied; only
a few studies are known about the preferences of teachers for a certain (regulation of)
learning style. Donche, Vanhoof, and Van Petegem (2003) found, using the
“inventory of learning styles” (Vermunt, 1998), that student teachers generally
report an occupation-oriented learning style and rely little on self-regulation
strategies. Oosterheert and Vermunt (2003) developed a questionnaire to
find out how student teachers learn and self-regulation was one of the aspects taken
into account. Three sources of regulation turned out to play a role in knowledge
construction in learning to teach: external sources to provide new information, active
internal sources to deliberately focus on (new) information, and dynamic internal
sources to spontaneously reconceptualize prior understandings (Oosterheert &
Vermunt, 2003). Van Eekelen et al. (2005) discerned three types of regulation from
interviews and diaries among teachers: externally regulated learning, self-regulated
and both externally- and self-regulated learning. Their research also showed that
most of the learning situations were unplanned: teachers learn in a non-linear way by
solving problems. In these cases, teachers do not regulate in order to learn; rather,
they primarily regulate their teaching practice.
Based on previous research, we can conclude that student teachers report relying
only minimally on self-regulation strategies. Something is known about (student)
teacher learning, but no research has investigated their actual self-regulation; thus,
an instrument has to be developed. Based on theoretical and empirical findings, the
instruments should take into account the following features of (student) teacher
learning. An instrument needs to be suitable for all the different learning activities student teachers
perform in different and complex environments. It should be able to cover regulation
of active, deliberate learning and of more spontaneous learning processes, thus also of
learning processes that are less (learning-)goal directed. Furthermore, it should be
able to measure regulation at different levels, e.g. the more static regulation of
development as well as the more dynamic regulation of small learning steps.
Existing instruments to measure SRL
The increase in research on self-regulation has led to the development of a
hotchpotch of assessment techniques (Veenman, 2005). Two major distinctions can be
made to classify the instruments. Winne and Perry (2000) make the distinction
between instruments that measure self-regulation as an aptitude and instruments
that measure SR as an event. An aptitude describes a relatively enduring attribute of
a person that predicts future behaviour. An event instrument describes the
regulation activities during a specific task. Van Hout-Wolters (2006) makes the distinction
between on-line and off-line methods to measure SR. This distinction is based on the
moment SR is measured. On-line methods measure SR during the learning task, off-
line methods measure SR independently from or directly after a concrete learning
task (Van Hout-Wolters, 2006). We classified instruments mentioned in several
overviews (e.g. Boekaerts & Corno, 2005; Van Hout-Wolters, 2006; Van Hout Wolters,
Simons & Volet, 2000; Winne & Perry, 2000; Veenman, 2005) according to these
distinctions in Table 1.
Table 1
Classification of instruments

Aptitude (off-line):
• Self-report questionnaires
• Oral interviews
• Teacher judgments

Event (on-line):
• Think-aloud methods
• Eye-movement registration
• Observation and video-registration of behaviour
• Performance assessment through concrete study tasks, situational manipulations or error detection tasks
• Trace analysis

Event (off-line):
• Stimulated recall interviews
• Portfolios and diaries/logs
• Task-based questionnaire
• Hypothetical interview
On-line instruments always measure SRL as an event. The advantage of on-
line methods is that the measurement is at the same moment as the task, so little
information is lost (Van Hout-Wolters, 2006). On the other hand, there is a greater
chance of disturbing the learning process, and on-line behavioural observations
can only focus on highly specific activities: “What you see is what you get, despite
the risk that not all metacognitive processes may be fully disclosed” (Veenman, 2005,
p. 91). Furthermore, these on-line methods only take into account the SR activities
which are performed during the learning activity. When SRL is measured as an
aptitude, a single measurement aggregates over or abstracts some quality of SRL
based on multiple SRL events (Winne & Perry, 2000). These instruments ask the
learner to generalize across different situations. It is not clear to the researcher which
situations the learner has in mind and which references he or she uses for comparison.
Furthermore, the learner can forget aspects of learning activities or give socially
desirable answers (Van Hout-Wolters, 2006).
For the context of teacher education, an off-line instrument should be
developed that measures SRL as an event. On-line methods are not suitable, because
unintentional learning at the workplace involves no specified learning tasks whose
regulation could be observed. Furthermore, on-line instruments do not take into
account some aspects of SRL that are important in this context and that happen
before or after the task, like planning, selection of a strategy or reflection afterwards.
The instrument should measure SRL as an event, because the goal is to get an
overview of student teachers’ concrete regulation activities for different learning
experiences. Besides that, we expect that
regulation may differ among learning contexts and it is not clear whether these
student teachers who just started their teacher education can describe a general
aptitude of SR in this new rich learning context.
Towards features of instruments to measure SRL in teacher education
In this section we will translate the findings of the two research fields as
described above into features for instruments to measure self-regulation in dual
learning settings as teacher education. Pintrich’s definition of SRL as mentioned in
the introduction shows that SRL covers three areas: cognition, motivation and
behaviour (Pintrich, 2000). Regulation activities can be classified into three phases of
the regulation process: forethought, monitoring/control and self-reflection
(Zimmerman, 2000). Therefore, instruments of high quality should be able to
measure all these areas and phases of self-regulated learning. Another feature of the
instruments has to be the ability to measure the regulation of all kinds of student
teachers’ learning processes. This encompasses formal and informal, intentional and
unintentional learning, and longer developmental processes as well as learning on a
micro-level. As described before, we already know something about the attitude of
(student) teachers towards SRL. To get a qualitative description of student teachers’
actual regulation activities, instruments that measure SRL off-line and as an
event have to be used. As described in Table 1, four examples of off-line
instruments to measure SR as an event are known from the literature: portfolios and
diaries, stimulated recall interviews, the task-based questionnaire and the
hypothetical interview. All these formats will be used to develop four instruments to
measure SRL in teacher education. The development of the instruments will be
described in the method section.
Method
The method section will first portray in more detail the context in which the
assessments took place. After that, the choice, development and content of the four
instruments will be discussed. Subsequently, the sample and method of analysis will
be described.
Context
This study has been conducted at one teacher educational institute in The
Netherlands. This is a post-graduate teacher education programme, meaning that
students first finish their master’s degree in a specific subject and then enter a one-year
programme to obtain their teaching degree for secondary education. Student teachers
have a lot of possibilities in designing their personal curriculum, based on their prior
experiences and concerns, except for some compulsory courses. Half of the year is
spent in schools, under the supervision of university staff and trained cooperating
teachers. The educational programme is designed around six competence domains.
Student teachers monitor their improvements on these competences in their
portfolio. They make self-evaluations and a personal development plan to direct
their learning processes. Student teachers find support and they exchange
experiences in peer groups, mentor groups and specific subject matter groups and
they can get extra theoretical knowledge from lectures.
The teacher education institute has three different programmes: The first
programme is an initial programme in which the students are gradually exposed to
practice. They start with observing experienced teachers and over the course of the
year gain full responsibility for some classes. In the in-service programme, the second programme,
student teachers already have a job at a school, but they are following the same
educational programme as the initial students. The last programme is an alternative
certification programme for second career teachers. Student teachers in this
programme have to be older than 32 and have prior experience in another job. Their
educational programme is based on their prior competences.
Instrument development
As described before, four examples of off-line instruments to measure SR as an
event are known from literature: portfolios and keeping diaries, stimulated recall
interviews, task-based questionnaire and the hypothetical interview. We will first
explain how we used these formats to develop the instruments. After that, we will
describe, for every instrument separately, its content and the procedure for using it.
Portfolio
According to Randi (2004), pre-service teachers’ diaries can be used to provide
illuminating examples and narrative accounts, not only of their students’ SRL, but
also of the ways they themselves regulate. Therefore, the electronic portfolio used at
the teacher education institute, in which student teachers describe their learning
processes, was the first instrument used to identify regulation activities. The
stimulated recall technique requires participants to review a videotape of their
performance on a specific task and to reproduce their thought process (Veenman,
2005). Because of the variety of contexts in which the often unintentional learning
processes take place, we chose a way of recalling the learning experiences other
than videotapes. The self-reported learning experiences from the portfolios were read
out loud as a recall prompt and were used as a basis for the interview questions. The
second instrument was thus a variant on the stimulated recall technique and is called
the portfolio interview. The task-based questionnaire was the third instrument to
measure SRL. This questionnaire was developed to get a qualitative description of
specific learning experiences and the corresponding regulation activities. The
hypothetical interview is the only instrument that measures SRL before the learning
task (for a comparison of prospective and retrospective instruments, see Veenman,
2005). Measuring SRL activities before the learning experiences can provide better
insights into the actual activities of the forethought phase. To take all aspects of the
regulation process into account, the last instrument, an interview based on a concrete
learning experience, measured not only self-regulation activities directly before but
also after the learning experience.
Self-report in portfolio
All student teachers of the educational institute have to use an electronic
portfolio to assess themselves based on their own learning goals and to keep a log or
diary. In the portfolio examined in this study, the log has an open format: student
teachers can choose for themselves how often and in how much detail they describe
their learning experiences. The self-assessment part in this portfolio has a more structured
environment. Student teachers show their development with respect to six
competence domains. At the start of the programme student teachers describe their
competences and formulate learning goals and strategies to improve them. At the
end of the first semester, they evaluate these goals and set new goals and strategies
for the second semester. At the end of the year they evaluate their development as a
teacher, and this is also their final assessment. The portfolio is comparable to
portfolios used in other teacher education programmes, but puts a relatively strong
emphasis on self-analysis, and the documentation function is used to support the self-
analysis (for a more detailed description see Van Tartwijk, Lockhorst and Tuithof,
2002). Student teachers’ evaluations and new development plans are assessed by their
teacher educators. We evaluated student teachers’ digital portfolios to find self-
regulated learning activities. We downloaded the portfolios at the moment they
were ready for assessment by the mentor, to make sure that they were all in
the same state of completion.
Portfolio interview
The second instrument was a semi-structured interview based on fragments
student teachers described in their portfolio. The portfolio was first analysed to
identify regulation activities, according to the procedure as described below in the
method of analysis. The fragments used for this instrument were selected to
represent all phases and areas of regulation as well as possible. These examples were
read out loud one by one to help the student teachers recall them. For every fragment
the researcher asked the student teachers to elaborate on several regulation aspects such
as orientation, planning, monitoring, evaluation or reflection. If some areas or phases
were not mentioned in the portfolio, specific attention was paid to obtaining extra
information about these aspects of regulation. At the end of the interview student
teachers were asked to describe the role of the portfolio in their development.
Furthermore, they were asked to illustrate how they decided which learning
experiences to report in their portfolio and which not. The interviews were recorded
on audiotape. Student teachers were told that we wanted to understand better how
student teachers learn and how they describe that in the portfolio. It was made clear
in the instruction that their level of functioning as a teacher was irrelevant and that
the interest only focused on their way of improvement.
Task-based questionnaire: Week reports
A questionnaire called “week report” was developed in which student
teachers could describe a specific learning experience and the corresponding
regulation activities. The questions were made to identify regulation activities from
all phases and areas as described in the introduction. This led to the following open
questionnaire:
1. What did you learn?
2. In what context did the learning take place (think about place, time, presence of others,
your mood etc.)?
3. Did you have the intention to learn this? Did you change your plans during the learning
process?
4. Why did you want to learn this? Did you have the feeling that you were going to succeed?
5. How did you do it? Why did you choose this strategy?
6. From whom did you receive or miss help during this learning experience? Did you ask
for help?
7. How did you come to realize you learned something?
8. What kind of effect did this learning experience have on your confidence or motivation?
9. What elements in this learning experience did you experience as satisfying? What would
you change the next time?
10. How will you proceed with this (learning) experience? Are you making new plans?
The instruction for using the week report was given orally and was also
attached on paper to the week reports. Student teachers were instructed to choose a
learning experience from the last week and this could be any kind of experience that
was part of their development as a teacher. They could choose if they wanted to
describe their learning experience by answering the questions one by one or by
describing their own story with the questions in mind. It was possible to describe a
successful or less successful learning experience, for example when learning was
planned, but did not take place. The first question would then be: “What did you
want to learn?” and an extra question was: “Why didn’t it work out the way you
expected?” The instruction asked the student teachers to answer every question, but
if some questions turned out to be irrelevant for their learning experience they could
skip them. Student teachers had to complete the week report twice to determine
regulation activities of different learning experiences.
Interview based on concrete learning experiences
The fourth instrument was a combination of a prospective and retrospective
measurement including an observation of the learning experience. This was the
instrument that was closest to the learning experience. Student teachers were asked
to mention three moments when they expected a chance to learn. This could vary
from a teaching experience or a meeting with their peers to a theoretical lecture. The
researcher attended all three experiences to see if any regulation behaviour was
observable, but above all to interpret the answers of the student teachers in the
interview afterwards. Before the experience the researcher asked the student teachers
about their forethought: their preparation, possible goals for the meeting,
argumentation for attending the meeting, expectations etc. The questions were
formulated to prevent socially desirable answers as much as possible. For example,
instead of asking “what is your goal?”, they were asked what they expected to
happen and whether they wanted to learn something specific during this activity.
The interview directly after the experience asked student teachers to evaluate the
experience by themselves. After that questions were asked (if not already mentioned
by the student teachers) like: Did you learn anything? Are you satisfied? Did the
experience meet your expectations or goals? If not, what made that happen and how
do you want to reach the goals now? All the interviews were audio-taped.
Sample
The population consisted of student teachers of the teacher training
programme. A sample of eight student teachers participated voluntarily in this
study. They differed in prior teaching experiences, teaching subject, age, sex and the
programme they followed at the educational institute. Due to the short duration of the
educational programme and the busy schedules of the student teachers, the data
collection had to take place at two moments with different student teachers and
different instruments. The first group consisted of five student teachers: a woman
teaching English from the alternative certification programme; two men teaching
geography, one from the in-service programme and one from the initial programme; a
woman teaching history from the initial programme and a woman teaching Dutch
from the in-service programme. They were all at the end of the educational
programme. Their portfolios were analyzed and they were interviewed based on the
portfolio fragments. The second group consisted of three student teachers: a woman
teaching Dutch from the in-service programme (with one year of teaching
experience); a woman teaching mathematics from the initial programme and a man
teaching English from the in-service programme. These students were halfway through
their educational programme. Their portfolios were also analyzed; they were
interviewed around three concrete learning experiences and they were asked to
complete the week report twice. One student teacher was ill at one of the meetings
for the interviews and he was therefore interviewed only twice about a concrete
learning experience. Another student teacher completed the week report three times
instead of two, because she enjoyed working with it.
Method of analysis
All four instruments were first analysed separately to obtain insight into the self-
regulation activities student teachers had undertaken. From the portfolio, the self-
assessment part was analyzed to identify examples of regulation activities of all
phases and areas as mentioned in the introduction. This part required the same
information from each student teacher and was therefore the most comparable. The
self-assessments contained references to other parts of the portfolio, for example to
the log, to illustrate and support the self-evaluation. These parts were also included in the
analysis. The interviews based on the portfolio were transcribed and subsequently a
content analysis was done, also to identify regulation activities of all phases and
areas. The answers to the questions on how student teachers use the portfolio were
summarized. The week-report questions were already structured around regulation
activities. A list was made of the different answers to the questions about these
activities. The interviews based on concrete experiences were all transcribed. Parts
from the text which identified regulation activities were selected. Also, general
information about the function of the specific learning experience for their
development and general remarks on learning or regulation were selected. When
observation notes also contained information with respect to regulation activities,
they were added to the selection as well. Based on this information, an overview was
made for every person of his or her regulation activities for the different learning
experiences.
The second step of the analysis was to compare the different
instruments. For this comparison, criteria were set based on the features of
instruments to measure SRL in teacher education as described in the introduction.
These included:
1. The extent to which the instrument measures all phases of self-regulated
learning: forethought, control and reflection;
2. The extent to which the instrument measures all areas of SRL, namely
cognition, motivation and behaviour;
3. The extent to which the instrument measures self-regulation of both formal
and informal learning;
4. The extent to which the instrument measures self-regulation of both
intentional and unintentional learning;
5. The extent to which the instrument measures self-regulation at a macro level,
e.g. regulation of the development of becoming a teacher, as well as at a micro
level, e.g. the regulation of small learning steps.
Every instrument was given a rating on each criterion with plus and minus signs,
explained with examples from the data. Besides these content criteria, we also took
into account some pragmatic aspects of the instruments. These factors were not
decisive for the quality of the instruments, but they can help in choosing among the
instruments for use in practice. For example, it is an advantage to use or adapt
existing instruments from teacher education, as this makes it easier for student
teachers to participate. Therefore, we also rated the instruments on the time
investment required from the student teachers and the effort needed from the
researcher to use and analyse the instrument.
Results
The results will be described separately for every instrument. First, the
instruments will be discussed with respect to the five content criteria described
above. To illustrate the findings, examples from the raw data are added. After that,
the practical aspects of the instruments will be discussed. Information about student
teachers' opinions on using the portfolio and the week report is also added, where
available. To compare the instruments, they were rated on all the criteria with plus
and minus signs. The results are summarized in Table 2.
Portfolio
Content analyses showed that the student teachers in our sample never described
monitoring or controlling of the learning process in their portfolio. All
student teachers described learning goals in the self-assessment part of the
portfolio, but not every goal was accompanied by a strategy. There were differences
among the student teachers in the specificity and concreteness of the descriptions of
goals and strategies. An example of a very global description is: "I have to learn a
lot to reach this competence. My goal is to reach the competence as it is described
by the educational institute and I am going to attend all the courses and use the
information in my lessons." The log showed more detailed descriptions of experiences,
but not every student used the log to the same extent; this varied from never to once
a day. Furthermore, these descriptions were mostly recollections of activities,
without evaluations or reflections on the learning that may have occurred. Regulation
of motivation or affect was almost never found; regulation was limited to cognition
and behaviour. Both formal (for example, learning from lectures) and informal (for
example, learning from teaching) activities were described. In the self-assessment
part, student teachers described learning situations that were related to the goals
they set. They also described learning results, but the connection between goals and
results, and how the learning process evolved, was hardly ever found. The regulation
reported in the portfolios was at a very global level; the self-assessment was only
done twice a year. Regulation activities for smaller learning processes were hardly
found.
The portfolio is a regularly used instrument in teacher education and is therefore
easy to collect; it requires no extra time investment from student teachers. A
problem in using this portfolio as a research instrument may be that student teachers
differed greatly in how and how often they used it. It is only possible to compare
the portfolios just before student teachers have to discuss them with their mentor,
because only then are they all required to present a complete portfolio. Sometimes,
the portfolio remains empty until one day before it has to be sent to the mentor. The
effort for the researcher to collect electronic portfolios is very low, but the
analysis can be harder. Portfolios can contain hundreds of pages with many
hyperlinks, filled with descriptions of lessons which all have to be filtered for the
analysis. Furthermore, some student teachers design their own format for the
portfolio. This requires a lot of preliminary work before the analysis can start. In
the portfolio interview, student teachers were asked about their experiences with the
portfolio; these will be reported at the end of the next section.
Portfolio interview
Because the portfolio interview was based on the learning experiences described in
the portfolio, this instrument partly suffers from the same problems as described
above. The advantage of the interview is that it can fill in the empty parts of the
portfolio: it is possible to ask student teachers about other areas or phases of
regulation, for example monitoring, or the regulation of motivation and affect. The
only problem that emerged during interviewing was that some of the learning
experiences had happened more than half a year earlier. Answers in such cases are
more likely to be based on general preferences for learning strategies than on the
specific learning experience. For example, one answer to the question why the student
teacher chose a specific strategy for reaching a certain learning goal was: "I like
going to lectures, because they are structured and there is possibility for
interaction and I like that". It remains doubtful whether such answers reflect the
student teacher's motivation at the moment of the learning experience or what he or
she thought about it during the interview. Since the goal of this instrument was not
to ask student teachers to describe other learning experiences, the measurement of
(in)formal and (un)intentional learning and of regulation at different levels was the
same as with the previous instrument.
Student teachers and the researcher each had to invest an hour in the interview.
Data analysis may cost more time if audio tapes have to be transcribed. Student
teachers also reported on their use of the portfolio in the interview. Four student
teachers had difficulties with the format of the self-assessment part and three of
them experienced the portfolio as time-consuming. The choice of experiences to
include in the portfolio varied: three student teachers reported that they included
experiences from which they still had to learn or had already learned; one student
teacher initially reported almost everything he learned and, later on, only
experiences that fitted the description of one of the competence domains. Another
student reported only experiences that were connected to previously formulated goals.
Week-report
The questions of the week report were set up to ask about regulation activities
of all phases and areas. Not all questions turned out to be relevant for every
learning experience, but sometimes (sub)questions were also left unanswered even
though they seemed relevant for the learning experience. The question about the
method was interpreted by one student teacher as being about the method for reaching
the working goal of the activity that led to the learning experience, while it was
meant to be about the strategy chosen to reach the learning goal. Student teachers
could describe formal and informal learning experiences, as well as intentional and
unintentional experiences, in the week report, and this indeed happened. Learning
experiences from different contexts were described, for example learning from a
teaching situation and learning during a workshop, and in both situations
goal-directed as well as spontaneous learning was portrayed. The instruction told the
student teachers that a learning process could be a very small step or change, but
could also cover a learning process that took days, weeks or months. Most of the
reported experiences described short learning episodes; one experience evolved over a
few weeks.
Student teachers reported that it took them about 15 minutes to complete one
week report. All three were enthusiastic about using it, because it provided
structure for describing a learning experience and prevented them from forgetting
elements of the experience. The effort for the researcher was not high: the week
reports were sent and received back by email, and only the initial instruction
required face-to-face contact. One student teacher needed some extra stimulation to
send the week reports back. With a larger group of respondents, this could become a
more time-consuming activity. Data from the week reports are already structured
around regulation activities, which makes the analysis relatively less effortful.
Concrete experience interview
Because of the focused character of the interviews on concrete experiences, very
detailed data were obtained about regulation activities. Also, information on the
relation between the different regulation activities became clearer. For example, it
turned out that the evaluation of the learning experience was to a large extent about
what happened and what student teachers learned from it, and less about its coherence
with their learning goal. The prospective part of the interview asked about
preparation and the choice of goals and strategies. It was not clear to what extent
the questions also stimulated student teachers to formulate goals for the activity.
One student teacher needed some time to formulate his goal, which could indicate that
he was creating it at that specific moment. Because student teachers were asked to
mention activities from which they might learn, the chance increases that this
instrument measures intentional and formal learning rather than unintentional and
informal learning. Because of the focus on one activity, the instrument is also more
geared to learning experiences that happen during this activity and less to learning
that is part of a larger developmental process. For example, it is possible that a
student teacher only later realises that he learned something from a lecture, because
he comes into a situation in which information turns out to be useful that he had
evaluated as irrelevant directly after the lecture. Sometimes, it may take some time
for a student teacher to reflect on an experience and to value it as a learning
experience.
These interviews required a lot of time, especially from the researcher, but also
from the student teachers, because all the activities had to be attended at many
different locations. Student teachers also had to find moments in their busy
schedules for an interview directly before and after the activity. This turned out
to be hard to organize, especially for activities at the practice schools.
Appointments for the interview had to be rescheduled over and over because of
emerging problems with pupils, colleagues etc. which had a higher priority. The
analysis also required a lot of time, because many interview fragments had to be
transcribed and extra notes had to be added for contextual information before the
analysis could start.
Comparison of the instruments
The results described above are summarized in Table 2. Based on the
description of the performance of the instruments, a score was given for every
criterion. The scores range from "- -" to "+ +". The score for the main content
criterion was obtained by summarizing the scores on the sub-criteria.
Table 2
Results on the criteria

                                Portfolio   Portfolio    Week-     Concrete
                                            interview    report    experience
                                                                   interview
Content criteria                    -          +/-          +          +
  All phases of SRL                - -         +/-          +         + +
  All areas of SRL                  -          +/-          +         + +
  Formal & informal                 +           +          + +        +/-
  Intentional & unintentional      +/-         +/-          +         +/-
  Different levels of SRL           -          +/-          +          -
Practical aspects
  Time investment ST               + +          +          +/-        - -
  Efforts for researcher            +          +/-          +         - -
Based on our findings, we can conclude that the week report and the concrete
experience interview yield the best results on the content criteria. Of these two
instruments, the week report is less time-consuming and more practical to use than
the concrete experience interview. The portfolio and the portfolio interview scored
lower on most of the content criteria, because the portfolio only identifies certain
aspects of self-regulated learning, namely macro-level aspects of the self-regulation
of cognition and behaviour. A comparison of the results of the content analyses among
the three student teachers of the second group also showed that analysis of the
portfolio does not always lead to the same overview of regulation activities as the
week reports and the concrete experience interviews. For example, one student teacher
used the portfolio only as a database of learning products, such as lesson plans,
without describing (the regulation of) learning processes in it, while the week
report and the concrete experience interview both revealed many different regulation
activities. The interpretation and implications of these results will be described in
the discussion.
Conclusion and discussion
The quality of instruments to measure self-regulation in dual learning settings
such as teacher education was central to this study. Based on research on (student)
teacher learning, self-regulation in teacher education, and the assessment of SRL in
more traditional learning environments, essential features were identified for
instruments to measure SRL in the context of teacher education. Four instruments were
developed: the portfolio, the portfolio interview, the week report and the concrete
experience interview. The instruments were tested on a small sample of student
teachers and evaluated on content criteria and practical aspects. The content
criteria included the ability to measure self-regulation in all areas and phases of
learning, of formal and informal as well as intentional and unintentional learning,
and the regulation of different levels of learning.
The week report scored well on all of the content criteria. The concrete
experience interview had problems with measuring self-regulation at a macro level,
but its other scores were in general also good; however, the high time investment for
the student teachers and the high effort required from the researcher could be a
practical problem for using this instrument on a larger scale. The portfolio scored
lower on most of the content criteria for several reasons: the self-assessment part
of the portfolio mostly identifies regulation of longer developmental processes
rather than of small learning steps, and student teachers reported primarily
regulation of cognition and behaviour, not of motivation. Also, not all of the phases
of self-regulation were described in the portfolio, or not in relation to each other.
The log part of the portfolio described hardly any regulation activities. These
findings are in line with the results of Mansvelder-Longayroux (2006). Her study of
portfolios in teacher education showed that student teachers are more inclined to
describe what they changed than how their learning process unfolded. The portfolio
interview scored somewhat higher on the content criteria, because student teachers
could fill in the empty parts of the portfolio.
These results could be explained by the differences in structure between the
week report and the portfolio. As the student teachers themselves reported, the week
report helps them not to forget to describe certain aspects of the learning process.
On the other hand, care is needed to prevent student teachers from describing
regulation activities that did not occur, just because of the desire to answer a
question about them. We tried to prevent this socially desirable behaviour by
pointing out in the instruction that some questions might not be relevant and that it
is not compulsory to answer all the questions. Another explanation for the results
can be the possible influence of the way student teachers use the portfolio on the
regulation activities they describe. One case in our study showed that using the
portfolio mainly as a database for learning products led to almost no examples of
self-regulated learning, while this student teacher showed many different regulation
activities in both the week reports and the concrete experience interviews. Further
research is needed to demonstrate whether student teachers' capability for SRL is
related to a certain way of using the portfolio.
The limitations of this study stem primarily from the small sample we used to
test the instruments. For example, we have no information about the discriminative
value of the instruments. Furthermore, the week report still has to prove itself in
other complex learning environments, and because the two groups did not use the same
instruments, we could not compare the regulation activities identified by the
different instruments across the whole sample. In addition, all the instruments
tested are self-report instruments. It could be valuable to compare the results of
these self-reports with other sources of information, e.g. the judgments of teacher
educators. Although this will probably not provide a detailed description of the
regulation activities, such information may help to validate the instruments. The
structure of the portfolio used in this study is comparable with portfolios used in
other dual learning settings, but there may be some differences. In particular, the
instruction and supervision for using portfolios can vary among and within
educational institutes, and this may lead to different outcomes. A task-based
questionnaire seems to be a good instrument for measuring SRL in complex learning
environments, but the questions of the week report could be improved to prevent the
misunderstandings experienced in this study. We also suggest limiting the freedom of
choice of the reported learning experiences somewhat more, to make the results of
different student teachers more comparable. Learning experiences from teaching may
require different regulation activities than, for example, learning from courses at
the educational institute. Asking student teachers to report both kinds of learning
experiences gives a more complete view of their regulation activities.
This study is a step toward a better understanding of self-regulated learning
in complex learning environments, as it provides suggestions for the use of
instruments to measure SRL. Further research is necessary to test the instruments
with bigger samples and in different contexts. Task-based questionnaires such as the
week report could be used in qualitative studies to obtain detailed descriptions of
student teachers' regulation activities. We also suggest conducting more longitudinal
studies to obtain information about how students change and develop their regulation
of learning when entering complex learning environments. As Evensen et al. (2001)
showed, although curricula may require a lot of self-regulation from students,
students do not necessarily develop positive forms of self-regulation. Therefore, we
also recommend studying how curricula in complex learning environments should be
designed to foster the development of self-regulated learning.
References
Alexander, P. A. (1995). Superimposing a situation-specific and domain-specific
perspective on an account of self-regulated learning. Educational Psychologist,
30(4), 189-193.
Boekaerts, M., & Corno, L. (2005). Self-regulation in the classroom: A perspective on
assessment and intervention. Applied Psychology: An International Review, 54(2),
199-231.
Boekaerts, M., & Minnaert, A. (1999). Self-regulation with respect to informal
learning. International Journal of Educational Research, 31, 533-544.
Boekaerts, M., & Niemivirta, M. (2000). Self-regulated learning: Finding a balance
between learning goals and ego-protective goals. In M. Boekaerts, P. R.
Pintrich & M. Zeidner (Eds.), Handbook of self-regulation. San Diego: Academic
Press.
Boulton-Lewis, G. M., Wilss, L., & Mutch, S. (1996). Teachers as adult learners: their
knowledge of their own learning and implications for teaching. Higher
Education, 32, 89-106.
Cornford, I. A. (2002). Learning-to-learn strategies as a basis for effective lifelong
learning. International Journal of Lifelong Education, 21(4), 357-368.
Donche, V., Vanhoof, J., & Petegem, P. V. (2003). Beliefs about learning environments:
How do student teachers think, reflect and act concerning self-regulated and
cooperative learning in Flanders (Belgium). Paper presented at the AERA, Seattle.
Evensen, D. H., Salisbury-Glennon, J. D., & Glenn, J. (2001). A qualitative study of six
medical students in a problem-based curriculum: Toward a situated model of
self-regulation. Journal of Educational Psychology, 93(4), 659-676.
Hager, P. (2004). Conceptions of learning and understanding at work. Studies in
Continuing Education, 26(1).
Kremer-Hayon, L., & Tillema, H. H. (1999). Self-regulated learning in the context of
teacher education. Teaching and Teacher Education, 15, 507-522.
Lin, X., Schwartz, D. L., & Hatano, G. (2005). Toward teachers' adaptive
metacognition. Educational Psychologist, 40(4), 245-255.
Mansvelder-Longayroux, D. (in press). The learning portfolio as a tool for stimulating
reflection by student teachers.
Nenniger, P. (2005). Commentary on self-regulation in the classroom: A perspective
on assessment and intervention. Applied Psychology: An International Review,
54(2), 239-244.
Oolbekkink-Marchand, H. W., Van Driel, J., & Verloop, N. (2006). A breed apart? A
comparison of secondary and university teachers' perspectives on self-
regulated learning. Teachers and Teaching: theory and practice, 12(5), 593-614.
Oosterheert, I., & Vermunt, J. D. (2003). Knowledge construction in learning to
teach: The role of dynamic sources. Teachers and Teaching: theory and practice,
9(2).
Pintrich, P. R. (2000). The role of goal orientation in self-regulated learning. In M.
Boekaerts, P. R. Pintrich & M. Zeidner (Eds.), Handbook of self-regulation. San
Diego: Academic Press.
Randi, J. (2004). Teachers as self-regulated learners. Teachers College Record, 106(9),
1825-1853.
Taks, M. (2003). Zelfsturing in leerpraktijken: Een curriculumonderzoek naar nieuwe rollen
van studenten en docenten in de lerarenopleiding. Enschede: PrintPartners
Ipskamp.
Tillema, H. H., & Kremer-Hayon, L. (2002). "Practising what we preach" - teacher
educators' dilemmas in promoting self-regulated learning: a cross case
comparison. Teaching and Teacher Education, 18, 593-607.
Van Hout Wolters, B. (2000). Assessing active self-directed learning. In P. R. J.
Simons, J. Van der Linden & T. Duffy (Eds.), New Learning (pp. 83-101).
Dordrecht: Kluwer.
Van Hout Wolters, B. (2006). Leerstrategieën meten: Soorten meetmethoden en hun
bruikbaarheid in onderwijs en onderzoek. Paper presented at the Onderwijs
Research Dagen, Amsterdam.
Van Hout Wolters, B., Simons, P. R. J., & Volet, S. (2000). Active learning: self-
directed learning and independent work. In P. R. J. Simons, J. L. v. d. Linden &
T. Duffy (Eds.), New Learning (pp. 21-37). Dordrecht: Kluwer Academic
Publishers.
Van Tartwijk, J., Lockhorst, D., & Tuithof, H. (2002). Universiteit Utrecht: Portfolio's
en de opleiding van docenten. In E. Driessen, D. Beijaard, J. Van Tartwijk & C.
Van der Vleuten (Eds.), Portfolio's. Groningen/ Houten: Wolters-Noordhoff.
Veenman, M. V. J. (2005). The assessment of metacognitive skills: What can be
learned from multi-method designs? In C. Artelt & B. Moschner (Eds.),
Lernstrategien und Metakognition: Implikationen für Forschung und Praxis (pp. 77-
99). Münster: Waxmann.
Vermunt, J. D. (1996). Metacognitive, cognitive and affective aspects of learning
styles and strategies: A phenomenographic analysis. Higher Education, 31, 25-
50.
Vermunt, J. D. (1998). The regulation of constructive learning processes. British
Journal of Educational Psychology, 68, 149-171.
Vermunt, J. D. (2003). The power of learning environments and the quality of student
learning. In E. De Corte, L. Verschaffel, N. Entwistle & J. J. G. Van Merriënboer
(Eds.), Powerful learning environments: Unravelling basic components and
dimensions (pp. 109-124). Amsterdam: Pergamon.
Vermunt, J. D., & Verloop, N. (1999). Congruence and friction between learning and
teaching. Learning and Instruction, 9, 257-280.
Winne, P. H., & Perry, N. E. (2000). Measuring self-regulated learning. In M.
Boekaerts, P. R. Pintrich & M. Zeidner (Eds.), Handbook of self-regulation. San
Diego: Academic Press.
Zimmerman, B. J. (2000). Attaining self-regulation: A social cognitive perspective. In
M. Boekaerts, P. R. Pintrich & M. Zeidner (Eds.), Handbook of self-regulation. San
Diego: Academic Press.