1st Reading
August 13, 2010 19:0 WSPC/117-IJSEKE - SPI-J111 0218-1940 00485
International Journal of Software Engineering
and Knowledge Engineering
Vol. 20, No. 4 (2010) 1–21
© World Scientific Publishing Company
DOI: 10.1142/S0218194010004852
ENGINEERING THE LIFE-CYCLE OF SEMANTIC
SERVICES — ENHANCED LEARNING SYSTEMS

ERNIE GHIGLIONE
Macquarie E-Learning Centre of Excellence, Macquarie University
Sydney, NSW 2109, Australia
ernieg@melcoe.mq.edu.au

JUAN MANUEL DODERO
Computer Languages and Systems Department, University of Cadiz
Cadiz, 11001, Spain
juanma.dodero@uca.es

JORGE TORRES
Distributed and Adaptive Systems Lab for Learning
Technologies Development, Tecnologico de Monterrey
Santiago de Queretaro, QRO 76130, Mexico
jtorresj@itesm.mx

Received
Revised
Accepted
Service-oriented learning environments are the new paradigm for interoperability of learning management systems. They support a wider range of needs by integrating existing and emergent services, leading to an entirely new architectural design for such systems. The engineering life-cycle of resources and services can be enhanced and integrated in current and future virtual learning environments. This work defines a services-enhanced learning architecture and describes two levels of integration carried out to author, deploy and enact learning services from open web-based interaction protocols and semantic web service descriptions.

Keywords: Semantic web services; virtual learning environments.
1. Introduction

Collaborative learning environments are the arena in which active, shared learning experiences are developed to involve students in reflecting on their own cognitive processes, facilitated and supported by other participants and learning resources [1]. The main components required to describe a learning experience should involve people and groups of people, engaged in a specific role, that systematically develop a number of learning activities. The flow of development of the learning activities
can be structured in a pedagogically organized fashion, and supported by a set of resources and services.
Learning experiences should be specifically designed, made explicit using an Educational Modeling Language (EML), and hosted in a Learning Management System (LMS) [2]. Major LMS are prepared to understand EML specifications by installing run-time engine extensions — such as CopperCore [3] or GRAIL [4] — that enable deploying and running collaborative learning experiences based on the IMS Learning Design (LD) specification. Other systems such as LAMS [5] define their own model for users, activities and resources to represent and host the relevant elements of a learning experience. These systems are commonly used to deploy and execute web resources, applications and tools that keep an asymmetric relationship between users, i.e. the instructor produces a resource and the learner consumes it. Interoperability in such systems has been founded on the use of open e-learning standards [6], based either on individual learning contents (e.g. SCORM) or on collaborative activities (e.g. IMS LD).
In the search for interoperability, there is an emerging shift from such monolithic e-learning platforms towards supporting a wider range of needs by integrating existing and emergent services [7,8]. This trend has led to an entirely new architectural design for learning environments, known as the Personal Learning Environment (PLE). PLEs are concerned with the practice of learners with external technologies, are focused on coordinating symmetric relationships between users and services, and are based on open W3C standards [9].
This work deals with how Internet-based services can be engineered for current LMS and future PLE. Henceforth we refer to both LMS and PLE as Virtual Learning Environments (VLE). The growth of freely available Internet-based services during recent years has driven the demand to harness tailored versions of these services within current and future technology-enhanced learning systems. These services include web-based applications such as shared calendars, wikis, blogs, social software, and so forth. Service availability leads to implementing service-oriented VLE designs [7]. However, their complexity and variability still cause issues for engineering authentic learning experiences, for which every activity must be provided with an assessment [10].
A service-oriented system architecture can be considered from an anatomical viewpoint, focused on how the learning system is divided into parts so that its structure can be analyzed. On the other hand, a physiological viewpoint considers the functioning of the system. It is recognized that for effective resource sharing and virtual community support, the physiological view is especially relevant [11]. The contribution of this work is to show how a new anatomy for service-based learning systems can enable the flexible engineering of new functionalities in a VLE. In our approach, users and activities are decoupled from learning services with the help of a mediating layer that contemplates a model of the relevant features of services [12] and of how to interact with them in a decoupled way. In particular, the provision of a configurable assessment service allows implementing authentic learning experiences
for any learning activity hosted in a VLE. The proposed service-oriented learning architecture has been applied to the combination of the LAMS VLE and an implementation of learning assessment services called EvalComix.

The rest of this paper is structured as follows. First we explain the life-cycle engineering issues of a learning service and the motivation of our work. Second, a semantic services-enhanced learning architecture is proposed, focusing on how ontology-based learning service annotations support the integration of services throughout their life-cycle. After that, our contribution is evaluated by a case study showing how it is applied to existing VLE and learning assessment services. Finally, related work on the integration of web services in virtual learning environments is reviewed, along with some conclusions of this work.
2. Learning Services Life-Cycle: Issues and Motivation

The engineering life-cycle that must occur before running a collaborative learning experience consists of three main phases, namely authoring, deployment and enactment [4]. Authoring tasks involve creating courses as packaged structures that hold descriptions for all required activities, resources and services. After that, the objective of deployment is that all course elements are properly allocated on the VLE, i.e. to prepare learning resources, activities and services, based upon the desired flow of activities, and to populate the roles defined in the course with the actual participants in the learning experience. Finally, enactment begins when users start interacting with the available resources and services provided in the activities.
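As a minimal sketch, the three phases and their ordering can be modeled as a simple state machine; the Python names below are illustrative and are not part of LPCEL or any cited specification:

```python
from enum import Enum, auto

class Phase(Enum):
    """Engineering life-cycle phases of a collaborative learning experience."""
    AUTHORING = auto()   # create the packaged course structure
    DEPLOYMENT = auto()  # allocate resources/services and populate roles
    ENACTMENT = auto()   # users interact with resources and services

# Allowed forward transitions: authoring -> deployment -> enactment.
NEXT_PHASE = {Phase.AUTHORING: Phase.DEPLOYMENT,
              Phase.DEPLOYMENT: Phase.ENACTMENT}

def advance(phase: Phase) -> Phase:
    """Move a course to its next life-cycle phase; enactment is final."""
    if phase not in NEXT_PHASE:
        raise ValueError(f"{phase.name} is the final phase")
    return NEXT_PHASE[phase]
```

The linear ordering makes explicit that a deployment decision (e.g. replacing an unavailable service) cannot be revisited once enactment has begun.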
The main issue in engineering Internet-based services in a collaborative learning experience is how these can be seamlessly integrated with the basic components of the environment, i.e. activities and user roles. Services may include web-based applications that were not in principle designed for an educational purpose. Questions about the flexibility of the integration of such learning services can emerge in all phases of the engineering life-cycle:
• In the authoring phase, learning services cannot be packaged and distributed as easily as traditional learning resources. Existing frameworks and languages, such as the Web Service Description Language (WSDL) or the Web Application Description Language (WADL), explicitly document the interface of the operations provided by the service. However, if the interface changes or evolves and the implementation of activities is not flexible enough, the learning experience must be re-engineered from the beginning.
• In an actual learning environment, a change in services introduced during authoring may have a severe impact on the deployment phase. For instance, when deploying an activity, if a required service is not available, some adjustments are needed on the VLE to replace the service with an equivalent one (e.g. to replace an unavailable videoconferencing service with a chat).
• During the enactment stage, users can access all learning resources and services via the set of deployed learning activities. However, when dealing with an external service, the VLE has to configure information that is not included as part of the course. For example, consider a learning experience that divides the students into groups and requires each group to have a form to exploit a service for group-based assessment and reporting. The overall number of users and user groups that will take part in the learning experience is not known until deployment, when the instances of the assessment service must actually be created and assigned to the learning activities.
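The deployment-time allocation described in the last bullet can be sketched as follows; the function and the instance-identifier scheme are illustrative assumptions, not part of any described API:

```python
from typing import Dict, List

def allocate_assessment_instances(groups: Dict[str, List[str]]) -> Dict[str, str]:
    """Create one (hypothetical) assessment-service instance per group.

    Group membership is only known at deployment time, so the instance
    identifiers cannot be fixed during authoring.
    """
    instances = {}
    for i, group in enumerate(sorted(groups), start=1):
        # In a real VLE this would be a service call returning an instance id.
        instances[group] = f"assessment-instance-{i}"
    return instances

# Example: two groups formed at deployment time.
groups = {"group-a": ["alice", "bob"], "group-b": ["carol"]}
print(allocate_assessment_instances(groups))
# {'group-a': 'assessment-instance-1', 'group-b': 'assessment-instance-2'}
```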
From a pedagogical perspective, collaborative learning environments provide authentic learning experiences as long as every activity is provided with an assessment [10]. Learning-oriented assessment focuses on assessment tasks as a means of self-diagnosing a student's learning activities, in order to empower the student to learn more effectively and autonomously [13], as required by modern learning environments. Assessment establishes a difference between business processes based upon regular workflow systems on the one hand, and technology-enhanced learning flows and learning processes on the other [14]. It also distinguishes a general-purpose web service from one that is specially intended to support a learning experience.
All learning activities that are implemented through external services (i.e. out of the VLE's control) should be subject to learning assessments. Therefore, all interactions and events occurring between an activity and its service implementation must be acknowledged, controlled and tracked by the VLE. However, if the learning activity yields control to an external service, the VLE will only be able to find out what happened during the interaction with the service if this is explicitly coded as part of the activity implementation. This raises again the flexibility issue of the service engineering process, but this time concerning the authoring, deployment and enactment of assessments that are linked to the activities.
When the learning service model is shared, the client activities are more tightly coupled [15]. A change on the service side can provoke undesired changes in a coupled implementation of activities. For that reason, the raw API-based integration of learning services — in which activities must be aware of the details of the service API, so that explicit calls to service operations have to be included in the implementation of activities — needs to be enhanced.
3. Semantic Services-Enhanced Learning

In distributed and changing environments, such as new LMS or PLEs, the provision and integration of service components is a challenge. A semantic service-oriented architecture is presented that meets the flexibility requirements of the engineering life-cycle, as demanded by new service-oriented VLEs. Our approach stresses both the protocol-oriented description of the anatomy of a services-enhanced learning system, and the semantically enhanced physiology of those functionalities that learning services can provide to the VLE. The proposed solution ensures a seamless combination of
services with learning activities through a learning services ontology that ensures the independence of the VLE that hosts the learning experience.
3.1. Service design principles

With the aim of supporting the complete life-cycle of service-oriented VLEs, we applied the following principles to the design of a Services-Enhanced Learning Architecture (SELA):

(i) Define the Application Programming Interface (API) of services as simply as possible.
(ii) Separate user and role management from service implementations.
(iii) Decouple as much as possible the service descriptions from the client applications (i.e. the VLE) that consume them.

These design principles support an enhanced flexibility of learning service authoring, deployment and enactment. They describe protocol-oriented issues that support the anatomy of the service-oriented VLE, and are aimed at decoupling the elements managed by the VLE from the management of learning services. On top of these principles we have built a semantic model that integrates into the VLE physiology the required functionalities provided by external services.
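A minimal sketch of the three principles, with all class and operation names invented for illustration: the service exposes one simple operation and knows nothing about users (principles i and ii), while a mediating layer lets activities name only a service type, never a concrete API (principle iii).

```python
class AssessmentService:
    """Illustrative service with a deliberately minimal API (principle i);
    it holds no knowledge of VLE users or roles (principle ii)."""
    def create_instrument(self, title: str) -> str:
        return f"instrument:{title}"

class Mediator:
    """Decouples VLE activities from concrete services (principle iii):
    activities refer to a service *type*, resolved here at call time."""
    def __init__(self):
        self._registry = {}

    def register(self, service_type: str, service) -> None:
        self._registry[service_type] = service

    def call(self, service_type: str, operation: str, **kwargs):
        service = self._registry[service_type]
        return getattr(service, operation)(**kwargs)

mediator = Mediator()
mediator.register("assessment", AssessmentService())
result = mediator.call("assessment", "create_instrument", title="Rubric 1")
print(result)  # instrument:Rubric 1
```

Swapping the registered service for an equivalent one leaves the calling activity untouched, which is the flexibility the principles aim for.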
3.2. Services-enhanced learning architecture

The anatomy of SELA is depicted in Fig. 1. Its main components make it possible to seamlessly connect and interoperate learning activities and services through the following layers:

The LMS interoperability layer is used to plug in a specific LMS — e.g. LAMS or an IMS-LD runtime engine — that can manage learning activities. The VLE

Fig. 1. Layer structure of the services-enhanced learning architecture.
provides user and activity management and the interoperability component connects them to learning resources and services.
The Web Service Access (WS-Access) API defines the service operations allowed to be issued from the VLE or from a VLE-managed learning activity. Service interfaces can be specified using WSDL or WADL. The WS-Access API could be used to directly hard-code the requests to available services as part of the implementation of a VLE-managed activity — i.e., to skip the SEE and SWS layers. Nevertheless, this is not sufficient to promote interoperability between the VLE and services, since it would be difficult to overcome the issues described above.
The Service Enactment and Execution (SEE) server is used to manage the learning services life-cycle and issue service calls to the right learning services. It is provided with a formal description of learning processes called LPCEL (Learning Process Composition and Execution Language) that includes the modeling of user roles, learning activities and services.

Although the VLE can partially delegate the management of activities to the SEE server, this work deals only with the management of the services life-cycle. At run time, the SEE layer also provides the service information model needed to transform all user interactions that happen on the activities' user interface into real service calls. The mapping between the SEE server and actual learning services is based on augmenting the service APIs with RDF annotations that facilitate the automated generation of the activities' user interfaces and service composition.
The User Interface Generator (UIG) component provides activities with a generic user interface to exploit the service. The UIG component uses the WSDL/WADL service specification as input to generate a ready-to-use generic user interface to consume the service.
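The core of such a generator can be sketched as below, assuming the operation name and parameter types have already been extracted from the WSDL/WADL description (that extraction step is omitted, and the form layout is purely illustrative):

```python
def generate_form(operation: str, params: dict) -> str:
    """Render a generic web form for one service operation.

    `params` maps parameter names to simple type names, as they could be
    read from a WSDL/WADL interface description.
    """
    inputs = "\n".join(
        f'  <label>{name} ({ptype}): <input name="{name}"></label>'
        for name, ptype in params.items()
    )
    return (f'<form action="{operation}">\n'
            f'{inputs}\n'
            f'  <input type="submit">\n'
            f'</form>')

html = generate_form("createInstrument", {"title": "string", "type": "string"})
```

A real UIG would also map WSDL/WADL types to suitable widgets (checkboxes, selects), which is exactly where the semantic annotations of the SWS layer can improve on the generic form.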
The semantic meta-model of RDF annotations is grounded on an ontological combination of the LPCEL information model and the service description model. The LPCEL information model is used to specify learning processes including activity structures, user roles, groups, restrictions, scenarios, contents, services, assessments and other learning resources, which can interact to achieve a set of learning objectives [16]. Since learning activities are not performed in an anarchic way, it is necessary to orchestrate and control the flow of activities in a learning process for every participant user role.
The Semantic Web Services (SWS) layer maps the execution of activities to actual service calls. It is provided with an extended semantic model of the specific services to be exploited in the learning activities, such as assessment, project management or collaborative work services. When the SWS layer operates, the interfaces generated by the UIG can be enhanced with more usable ones, since the SWS layer knows the type and semantics of the inputs to and outputs from service operations.
Finally, the Learning Services Bus (LSB) layer brings together all available services, independently of their location. Figure 1 depicts some types of services, such as: WS-TestMobile (an implementation of IMS QTI for mobile environments); WS-Wiki (a web service-based implementation of a wiki); WS-Project (a web service
interface to the Trac project tracking system); and WS-Assessment (an extensible service implementation of regular competence assessment instruments, such as control lists, rubrics, and so forth). WS-Assessment instances are especially relevant, since they must be coordinated with the rest of the services, independently of their kind, to implement real learning experiences that are founded on evaluation practices.
3.3. Learning service ontology

External services are integrated within the activities of a learning experience through the semantic description of services at two levels. The first level is the Learning Service Ontology (LSO), which portrays general issues concerning service access through the LSB; the second level is the Specific Service Ontology (SSO), which describes each service facility that is useful for the specific purposes of a learning activity.
The flow control of learning activities is beyond the scope of this paper and is discussed in more detail elsewhere [17]. The LPCEL model defines a number of components needed to describe a learning experience, including learning objectives, outcomes, composite activities, resources, services and user roles. Since our goal deals with the services life-cycle, we focus on the part of LPCEL (see Fig. 2) that deals with the integration of resources and services. The model has been used as the basis to define the LPCEL LSO, an excerpt of which is shown next:
@prefix : <http://example.org/lpcel.owl#> .
<http://example.org/lpcel.owl>
    rdf:type owl:Ontology .
:ComponentActivity
    rdf:type rdfs:Class .
:LearningActivity
    rdf:type rdfs:Class ;
    rdfs:subClassOf :ComponentActivity .
:AssessmentActivity
    rdf:type rdfs:Class ;
    rdfs:subClassOf :ComponentActivity .
:LearningService
    rdf:type rdfs:Class ;
    rdfs:subClassOf :Resource .
:Service
    rdf:type :LearningService .
:ServiceType
    rdf:type rdfs:Class .
:AssessmentServiceType
    rdf:type :ServiceType .
:WebServiceClient
    rdf:type rdfs:Class .
:UserInterfacePage
    rdf:type rdfs:Class .
:inputPage
    rdf:type owl:ObjectProperty ;
    rdfs:domain :WebServiceClient ;
    rdfs:range :UserInterfacePage .
:resultsPage
    rdf:type owl:ObjectProperty ;
    rdfs:domain :WebServiceClient ;
    rdfs:range :UserInterfacePage .
:AccessMethod
    rdf:type rdfs:Class .
:RPCBasedAccess
    rdf:type rdfs:Class ;
    rdfs:subClassOf :RemoteAccess .
:WebServiceDefinition
    rdf:type rdfs:Class .
In the LPCEL LSO, a ComponentActivity can be of one of three types: LearningActivity, for describing activities that involve a specific learning objective; AssessmentActivity, for activities that are extended with an evaluation of the fulfillment of the learning objective; and ContextActivity, for activities that do not involve learning but are needed to successfully complete a learning experience in which other activities are involved. A ComponentActivity can include the specification of several Resources to be harnessed during the activity execution. These can represent local resources (e.g. SCORM contents) or remote applications (e.g. project repositories, virtual labs, simulators, or collaborative work tools). A specific kind of remote resource are RPC-based applications, which can be reached through a Service Bus. The LSB brings together a number of services generally described by a WSDL or WADL specification. Such services are represented in LPCEL by the Interface element. An interface describes a learning service of one Type — e.g. assessment, collaborative writing, project management, and so forth. The Type element is used to select a specific learning service, whilst the Interface element is used to actually connect the LSO with the SSO.
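The Type-based selection of an Interface on the bus can be sketched as a simple registry lookup; the registry contents and service names below are illustrative, reduced from RDF to plain dictionaries:

```python
# Hypothetical bus registry: declared service Type -> Interface description.
BUS = {
    "AssessmentServiceType": {"name": "EvalComix", "spec": "evalcomix.wadl"},
    "WikiServiceType": {"name": "WS-Wiki", "spec": "ws-wiki.wsdl"},
}

def select_service(service_type: str) -> dict:
    """Select a concrete service interface on the bus by its declared Type."""
    try:
        return BUS[service_type]
    except KeyError:
        raise LookupError(f"no service of type {service_type} on the bus")

print(select_service("AssessmentServiceType")["name"])  # EvalComix
```

The activity only carries the Type; which Interface satisfies it is decided against the bus, so an unavailable service can be swapped without touching the activity.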
If a service is of type AssessmentServiceType, the LSO is connected with the SSO of an assessment service called EvalComix [18], for which we provide an assessment-specific ontology. EvalComix is the implementation of a general-purpose assessment service that allows authoring, deploying and enacting competence assessment instruments, such as control lists, rubrics, and so forth. The EvalComix service is designed to fulfill the first two principles explained above through a decoupled design of the protocols that enable the interoperation of activities and services [19].

The third design principle is put into practice through SA-REST semantic annotations [20] defined on the EvalComix SSO. This ontology is used to describe
Fig. 2. Detail of the LPCEL information model for learning resource and service integration.
different types of assessment instruments (e.g. rubrics and control lists) and their assessment values, as described next:
@prefix : <http://example.org/evalcomix.owl#> .
<http://example.org/evalcomix.owl>
    rdf:type owl:Ontology .
:Instrument
    rdf:type rdfs:Class .
:AssessmentValues
    rdf:type rdfs:Class .
:InstrumentType
    rdf:type rdfs:Class .
:Rubric
    rdf:type rdfs:Class ;
    rdfs:subClassOf :InstrumentType .
:ControlList
    rdf:type rdfs:Class ;
    rdfs:subClassOf :InstrumentType .
:RubricAssessmentValues
    rdf:type rdfs:Class ;
    rdfs:subClassOf :AssessmentValues .
:ControlListAssessmentValues
    rdf:type rdfs:Class ;
    rdfs:subClassOf :AssessmentValues .
:publicId
    rdf:type rdf:Property ;
    rdfs:domain :Instrument ;
    rdfs:range rdfs:Literal .
:title
    rdf:type rdf:Property ;
    rdfs:domain :Instrument ;
    rdfs:range rdfs:Literal .
:type
    rdf:type owl:ObjectProperty ;
    rdfs:domain :Instrument ;
    rdfs:range :InstrumentType .
:description
    rdf:type rdf:Property ;
    rdfs:domain :Instrument ;
    rdfs:range rdfs:Literal .
:assessmentValues
    rdf:type rdf:Property ;
    rdfs:domain :Instrument ;
    rdfs:range rdfs:Container .
3.4. Authoring interface for learning services

The web services of the LSB are actually exploitable only if an authoring interface is provided to the VLE. This section explains how SELA ensures the exploitation of LSB services by providing the user interface needed to consume service operations. From the WSDL or WADL service specification, the SELA UIG component can generate a general-purpose interface based on web forms. However, such general-purpose forms can be replaced by enhanced interfaces that exploit the service in a more usable fashion.
In particular, the user interface component for exploiting a learning assessment service is especially improved by the EvalComix authoring facilities. These facilities enable creating adaptable instruments to assess all types of learning activities. The authoring service allows for extensions or additions of new types of instruments to the assessment service. The structure of every instrument is completely editable. It allows for the creation of dimensions that gather new assessment attributes or fields having features in common. Each field can be graded on a canonical scale that can be mapped to other grading schemes as required by the VLE. Besides, such fields can be weighted to adjust the relevance of each attribute for the overall evaluation. Nonetheless, some pre-defined types of assessment instrument have been provided, including the following:
• Control lists allow checking whether qualitative evaluation criteria are properly fulfilled.
• Value lists are used to provide quantitative values for a list of evaluation criteria.
• Control + value lists are a combination of a control list and a value list. This type of instrument allows checking whether the evaluation criteria are fulfilled and providing each one with an assessment value.
• Rubrics are structured instruments used to transform qualitative perceptions of a set of evaluation criteria into qualitative or quantitative assessment value sets.
• Decision matrices are used to evaluate by selecting among a number of assessment choices, all of them equally relevant.
• Mixed instruments are the combination of a number of instruments that can be structured in nested, weighted levels.
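The weighted grading on a canonical scale, mapped to the VLE's grading scheme, can be sketched as follows; the function name, scale bounds and field names are illustrative assumptions, not the EvalComix API:

```python
def overall_grade(fields: dict, weights: dict,
                  scale_max: float = 10.0, canonical_max: float = 1.0) -> float:
    """Combine weighted field grades given on a canonical [0, canonical_max]
    scale and map the result to the VLE's grading scheme [0, scale_max]."""
    total_weight = sum(weights.values())
    canonical = sum(fields[f] * weights[f] for f in fields) / total_weight
    return canonical / canonical_max * scale_max

# Two criteria on the canonical 0..1 scale; "clarity" is twice as relevant.
grade = overall_grade({"clarity": 0.8, "depth": 0.5}, {"clarity": 2, "depth": 1})
```

Here (0.8·2 + 0.5·1)/3 = 0.7 on the canonical scale, mapped to 7.0 on a 0–10 scheme.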
Figure 3 depicts how a control list assessment instrument is edited with the authoring interface service provided by EvalComix. Such facilities are provided as part of the web-based implementation of the service, as described below.
4. LAMS and EvalComix Service Integration

Learning experiences require a loose coupling between the learning activities hosted in the VLE and their supporting services. This section describes how to achieve a flexible, decoupled integration of a VLE and learning services via a case study designed at two levels. The first level is known as raw integration, in which the architectural style of Representational State Transfer, or ReST [21], is followed to design, deploy and enact learning assessment services from a LAMS learning activity. In the
Fig. 3. EvalComix user interface delivered for editing an assessment instrument of type control list.
second level, known as semantic integration, the SELA approach is applied to integrate a set of assessment services in learning activities. This shows how an external service can be further decoupled from the VLE, which then does not need to be aware of the service API to the point of hard-coding it in the implementation of activities.
4.1. Raw integration of learning services

The first integration level deals with the protocol-oriented issues involved in the interaction with the learning service. At this level, the allowed interactions between activities and services are made available by providing activities with special-purpose user interfaces that enable accessing learning service functionalities. Although the facilities of the semantic layer are not yet harnessed, the approach can still be useful to decouple users and services, since the VLE that manages users can also manage the user-activity mapping without regard for the activity-service mapping.
A ReST-based architectural style has been used to achieve the raw integration of services into LAMS activities. To that aim, ReST provides an explicit, resource-based representation of the service operational model that is exchanged with the activities. LAMS activities are free to use this model for their implementation, but also to map it to an appropriate internal model that can be exploited from the user interface of the activity.
In the LAMS VLE, activity evaluations are implemented as a LAMS core service. LAMS core services are available for use by all activity tools, requiring minimal coding effort from a tool developer. The Eval Svc core service is the LAMS implementation of the SELA interoperability component, ready to interact with the WS-Access API of the assessment service. It acts as an abstraction layer between LAMS activity tools and the EvalComix service API.
(1) Authoring: As the teacher creates the content for an activity tool, she can choose to create an evaluation instrument. The LAMS core service then creates
a new instance of the evaluation and passes the request to EvalComix, which returns its interactive Flash authoring interface for the teacher to create the instrument. Once the teacher has created the evaluation, upon save, the instrument is stored in EvalComix and the instance identifier is returned to the LAMS core service as a reference (see Fig. 4).
(2) Deployment: After a student or a group of students complete an activity, the teacher or tutor evaluates the work using the authored instrument. The evaluation instruments can be deployed individually or for the whole group, depending on the assessment procedure defined by the activity, as shown in Fig. 5. Recall that users and user groups are known and managed by LAMS,
Fig. 4. UML sequence diagram showing how a teacher requests the ReST-based EvalComix service to create a new assessment instrument through LAMS core services.

Fig. 5. UML sequence diagram showing how a teacher completes the assessment of a student's work by requesting the EvalComix service for an instance of the evaluation instrument that was previously created.
and not by the learning service, as stated in the second design principle. This procedure facilitates deploying self-assessments or peer assessments among students as readily as traditional teacher-based evaluations, since the LAMS Eval Svc (i.e. the VLE) manages the activities and knows which users have to execute them.
(3) Enactment: Both teachers and students can complete the assessment forms that are delivered to them by LAMS. In Fig. 5, when the instrument is delivered for self-assessment, the student is also able to submit her own assessment values, according to the assessment interface of EvalComix.
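Under stated assumptions — mock objects standing in for the real LAMS and EvalComix APIs, with invented method names — the authoring interaction of Fig. 4 can be sketched as:

```python
class MockEvalComix:
    """Illustrative stand-in for the ReST-based EvalComix service."""
    def __init__(self):
        self._store = {}
        self._next_id = 100

    def create_instrument(self, definition: dict) -> str:
        instrument_id = str(self._next_id)
        self._next_id += 1
        self._store[instrument_id] = definition  # instrument stored in EvalComix
        return instrument_id                     # identifier returned as reference

class EvalSvc:
    """Sketch of the LAMS core service mediating between activity tools
    and the assessment service."""
    def __init__(self, service):
        self._service = service
        self.references = {}                     # activity id -> instrument id

    def author_evaluation(self, activity_id: str, definition: dict) -> str:
        instrument_id = self._service.create_instrument(definition)
        self.references[activity_id] = instrument_id
        return instrument_id

svc = EvalSvc(MockEvalComix())
ref = svc.author_evaluation("act-1", {"type": "control-list"})
```

Because Eval Svc keeps only an opaque reference, the VLE can track which activity owns which instrument without embedding the EvalComix API in activity tools.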
The integration described so far is simple and scalable, and allows for open integration, fine-grained access control and user interface decoupling. Besides, LAMS core services mediate all user interactions with the assessment service, which enables the VLE to track and log all completed evaluations. However, since the learning assessment service model is shared, client activities remain tightly coupled to it. A change on a service interface, or its replacement by a functionally equivalent one, can provoke undesired changes in a coupled implementation of activities. For that reason, the solution presented so far is enhanced, as described in the next section.
4.2. Semantic integration of learning services18
This level of integration aims at further decoupling the VLE and the learning services,
as required by the third design principle. Semantic integration is enabled by
the SELA SWS layer, and has been tested on the EvalComix assessment service. The
VLE and learning service models are integrated on the basis of the LPCEL LSO and
the EvalComix SSO.
The integration process proceeds in several steps. First, each LAMS activity is
described based on the LPCEL LSO. For instance, the following RDF Turtle
specification describes a learning activity that has to be prepared to include an evaluation:
@prefix : <http://example.org/lpcel#> .

<http://example.com/activity/11242>
    :identifier "11242" ;
    :title "Self-assessment activity"@en ;
    :type :AssessmentActivity ;
    :description "This activity is used to..."@en ;
    :parent <http://example.com/activity/11241> .
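As an illustration of this first step, the following sketch shows how a VLE could emit such an LSO-based Turtle description from its own activity metadata. It is not part of SELA (rdflib would normally handle the serialization); the helper below is a simplification that only covers single-subject descriptions like the one above.

```python
# Illustrative sketch: serializing a VLE activity's metadata as an RDF
# Turtle description based on the LPCEL LSO vocabulary shown above.
# The serializer is a simplification; a real system would use an RDF library.

def to_turtle(subject, properties, prefix="http://example.org/lpcel#"):
    """Render a single-subject Turtle description from (property, value) pairs."""
    lines = [f"@prefix : <{prefix}> .", "", f"<{subject}>"]
    body = [f"    :{prop} {value}" for prop, value in properties]
    lines.append(" ;\n".join(body) + " .")
    return "\n".join(lines)

activity = to_turtle(
    "http://example.com/activity/11242",
    [
        ("identifier", '"11242"'),
        ("title", '"Self-assessment activity"@en'),
        ("type", ":AssessmentActivity"),
        ("parent", "<http://example.com/activity/11241>"),
    ],
)
print(activity)
```

The same helper serves for the SSO-based service description of the next step, only with a different prefix and property set.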
Second, the service is described by the assessment SSO. For instance, the following
specification describes a control list instrument, based on the EvalComix
SSO:
@prefix : <http://example.org/evalcomix#> .

<http://example.com/instrument/123456789>
    :identifier "123456789" ;
    :type "ControlList"^^:InstrumentType ;
    :title "Evaluation instrument No. 1"@en ;
    :title "Instrumento de evaluación no 1"@es ;
    :description "This control list is used to evaluate..."@en ;
    :description "Esta lista de control sirve para evaluar..."@es .
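Since the SSO description carries language-tagged titles and descriptions, a client can select the literal that matches the user's locale. A minimal sketch follows; the fallback chain is an assumption of this sketch, not something the EvalComix SSO prescribes.

```python
# Illustrative sketch: picking the best language-tagged literal from a
# multilingual SSO description (titles and descriptions carry @en/@es tags).
# The fallback policy below is an assumption for illustration.

def pick_literal(literals, preferred, fallback="en"):
    """literals: dict mapping language tag -> literal value."""
    if preferred in literals:
        return literals[preferred]
    if fallback in literals:
        return literals[fallback]
    # deterministic last resort: first available language alphabetically
    return literals[sorted(literals)[0]]

titles = {
    "en": "Evaluation instrument No. 1",
    "es": "Instrumento de evaluación no 1",
}
print(pick_literal(titles, "es"))
```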
Third, the SWS layer maps the required assessment service to an actual service
that is available through the LSB. The mapping must resolve the three phases of
the service life-cycle, i.e. authoring, deployment and enactment.
(1) Authoring: The following XHTML specification of a Create service operation
is annotated with SA-REST to describe how an activity can apply for the authoring
facilities of the service and request an interface to design a new assessment
instrument of a given type:
<p about="http://example.com/assessment/create">
  <meta property="sarest:operation"
        content="http://example.com/evalcomix.owl#InstrumentAuthoring"/>
  <meta property="sarest:lifting"
        content="http://evalcomix.uca.es/api/lifting.xsl"/>
  <meta property="sarest:lowering"
        content="http://evalcomix.uca.es/api/lowering.xsl"/>
  The logical input of this service is a
  <span property="sarest:input">
    http://example.com/lpcel.owl#ServiceType
  </span>
  object. The logical output of this service is a
  <span property="sarest:output">
    http://example.com/lpcel.owl#UserInterfacePage
  </span>
  object.
  This service should be invoked using an
  <span property="sarest:action">
    HTTP GET
  </span>
</p>
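Annotations like these can be harvested mechanically. The following standard-library sketch is a simplified stand-in for the harvesting performed by the SWS layer; it collects the sarest:* properties of an operation description into a dictionary, using a trimmed copy of the XHTML above.

```python
# Illustrative sketch: harvesting SA-REST annotations from an annotated
# XHTML service description, using only the standard library. In SELA this
# job belongs to the SWS layer; this parser is a simplified stand-in.
from html.parser import HTMLParser

class SARestHarvester(HTMLParser):
    def __init__(self):
        super().__init__()
        self.annotations = {}
        self._pending = None  # sarest property whose value is the span text

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        prop = a.get("property", "")
        if tag == "meta" and prop.startswith("sarest:"):
            self.annotations[prop] = a.get("content", "")
        elif tag == "span" and prop.startswith("sarest:"):
            self._pending = prop

    def handle_data(self, data):
        if self._pending and data.strip():
            self.annotations[self._pending] = data.strip()
            self._pending = None

    def handle_endtag(self, tag):
        if tag == "span":
            self._pending = None

xhtml = """
<p about="http://example.com/assessment/create">
<meta property="sarest:operation"
      content="http://example.com/evalcomix.owl#InstrumentAuthoring"/>
The logical input of this service is a
<span property="sarest:input">http://example.com/lpcel.owl#ServiceType</span>
object. This service should be invoked using an
<span property="sarest:action">HTTP GET</span>
</p>
"""
h = SARestHarvester()
h.feed(xhtml)
print(h.annotations)
```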
Lifting schema mappings transform XHTML annotations into a semantic model of
the service (e.g. RDF instances of the EvalComix SSO), whereas lowering mappings
transform data from that semantic model into an XML structure that is consumable
by the applicant activity [22].
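A lowering mapping can be sketched as follows. Here a plain dictionary stands in for the RDF instances of the SSO, and the XML element names are assumptions for illustration; the actual mapping is the lowering.xsl stylesheet referenced in the annotations.

```python
# Illustrative sketch of a "lowering" mapping: turning data from the
# semantic model (a dict standing in for RDF instances of the SSO) into
# an XML structure consumable by the applicant activity. Element names
# are assumptions; the real mapping is an XSLT stylesheet (lowering.xsl).
import xml.etree.ElementTree as ET

def lower_instrument(model):
    root = ET.Element("instrument", id=model["identifier"])
    ET.SubElement(root, "type").text = model["type"]
    for lang, title in model["titles"].items():
        t = ET.SubElement(root, "title", {"xml:lang": lang})
        t.text = title
    return ET.tostring(root, encoding="unicode")

xml_doc = lower_instrument({
    "identifier": "123456789",
    "type": "ControlList",
    "titles": {"en": "Evaluation instrument No. 1"},
})
print(xml_doc)
```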
(2) Deployment: After creating the assessment instrument, the VLE that hosts the
activity can deploy as many service instances as required for the number of users or
user groups involved. For that, the VLE can call the New operation of the learning
service with the required parameters, as described next:
<p about="http://example.com/assessment/new/12345">
  <meta property="sarest:operation"
        content="http://example.com/evalcomix.owl#NewInstrument"/>
  <meta property="sarest:lifting"
        content="http://evalcomix.uca.es/api/lifting.xsl"/>
  <meta property="sarest:lowering"
        content="http://evalcomix.uca.es/api/lowering.xsl"/>
  The logical input of this service is an
  <span property="sarest:input">
    http://example.com/evalcomix.owl#InstrumentIdentifier
  </span>
  object. The logical output of this service is an
  <span property="sarest:output">
    http://example.com/evalcomix.owl#Instrument
  </span>
  object.
  This service should be invoked using an
  <span property="sarest:action">
    HTTP GET
  </span>
</p>
(3) Enactment: The service enactment phase is not shown in this case, since it is internally
managed by the VLE, which maps users' and instrument identifiers. Should the
VLE not manage users or user groups, enactment should be handled by an external
server exploiting an additional service (e.g. http://example.com/assessment/
assign/) that receives the assessment and users' identifiers as input and associates
the instrument to each user or group of users.
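A call to such an external assignment service could be composed as sketched below. The query parameter names are illustrative assumptions; the service is only characterized above by its inputs.

```python
# Illustrative sketch: composing a GET call to the hypothetical external
# assignment service. The parameter names "instrument" and "users" are
# assumptions made for this example only.
from urllib.parse import urlencode

def assign_url(base, instrument_id, user_ids):
    query = urlencode({"instrument": instrument_id,
                       "users": ",".join(user_ids)})
    return f"{base}?{query}"

url = assign_url("http://example.com/assessment/assign/",
                 "123456789", ["u-17", "u-42"])
print(url)
```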
Finally, the service can be exploited by the activity. The following describes the
functionality of the Grade operation of the service, to be called whenever the VLE
decides:
<p about="http://example.com/assessment/12345/grade">
  <meta property="sarest:operation"
        content="http://example.com/evalcomix.owl#Grade"/>
  <meta property="sarest:lifting"
        content="http://evalcomix.uca.es/api/lifting.xsl"/>
  <meta property="sarest:lowering"
        content="http://evalcomix.uca.es/api/lowering.xsl"/>
  The logical input of this service is an
  <span property="sarest:input">
    http://example.com/evalcomix.owl#AssessmentValues
  </span>
  object. The logical output of this service is a
  <span property="sarest:output">
    http://example.com/evalcomix.owl#Grade
  </span>
  object.
  This service should be invoked using an
  <span property="sarest:action">
    HTTP GET
  </span>
</p>
One advantage of this approach is that the user interaction with the service does
not need to be hard-coded in the activity. Instead, the learning activity VLE manager
can locate, replace or extend the learning service as semantically described by
the SWS layer of SELA. Furthermore, user interface generation can be improved
by semantic annotations after the enactment phase. For instance, an improved user
interface specially prepared for completing an assessment can be generated from the
SSO annotations of the previous example, since the Grade operation is now known
to require an evalcomix:AssessmentValues type of input.
If the service API changes, only a new SA-REST specification of the service needs
to be provided. From this specification, the SEE server can issue calls according to the
new service interface. Since the user interface to consume the service is generated
by the UIG, the VLE and its contained activities are not affected by the change
in the service.
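This decoupling argument can be sketched as follows: a client that derives each call from the harvested SA-REST annotations, rather than hard-coding the interface, only needs a refreshed annotation set when the API changes. The sketch below prepares (without sending) an HTTP request from such annotations; in practice the SEE would perform this step.

```python
# Illustrative sketch of the decoupling argument: each call is derived
# from SA-REST annotations instead of a hard-coded interface, so an API
# change only requires re-harvesting the annotations. The request is
# prepared but not sent.
import urllib.request

def build_call(annotations, address):
    """Prepare an HTTP request for an SA-REST described operation."""
    action = annotations.get("sarest:action", "HTTP GET")
    method = action.replace("HTTP", "").strip() or "GET"
    return urllib.request.Request(address, method=method)

grade_annotations = {
    "sarest:operation": "http://example.com/evalcomix.owl#Grade",
    "sarest:action": "HTTP GET",
}
req = build_call(grade_annotations,
                 "http://example.com/assessment/12345/grade")
print(req.get_method(), req.full_url)
```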
5. Related Work
The problem of integrating services in a learning experience has been thoroughly
studied by the technology-enhanced learning community. The first proposals were made
in the context of the IMS Learning Design specification, which defines a set of roles
to be played by users in groups to engage in learning activities using an environment
with the required resources and services [23]. The IMS LD-based CopperCore Service
Integration (CCSI) [3] considers learning services as a type of functional concept
supporting a user in the learning process. CCSI implements a run-time service that
provides an API for interacting with e-learning applications, as part of the ELF
e-learning framework (www.elframework.org). Although it provides the required
service functionality, the life-cycle of a new service is very time consuming and not
suitable for the integration of many different services. The tight coupling between an
IMS LD-enabled VLE and the service does not allow an efficient creation and deployment
of learning experiences. Other generic approaches have defined an interaction
protocol for information exchange among courses and services, but are especially
committed to the IMS LD specification [4]. Aside from IMS LD, other approaches
to service-oriented architectures for learning have been proposed [7, 24, 25].
All the analyzed approaches fall short when having to deal with the authoring of learning
services. A review of authoring issues can be found elsewhere [26, 27]. All of them
point out the difficulty of editing and adapting activities and resources. When
such resources are also web services, the procedure has been extended by importing
XML-based service specifications [8]. The latter approach delivers IMS Content
Packaging (CP) specifications of courses, which are intended for self-contained
course contents that are seldom liable to change. However, a learning service can
change and evolve, either by altering its interface or by replacing the whole service
with another.
Wilson [28] and the TENCompetence project [29] have explored the provision of
widgets, or small applications, as a way to include service-based external functionality
in a VLE. Learning environments based on the combination of widgets
(known as mashups) [30] have recently been adopted to build up PLEs for informal
learning. Widgets unify all functions required by the VLE/services integration in a single
widget management component that must be installed in the LMS. The approach
is simple and powerful enough, but it precludes managing the learning services
life-cycle without re-engineering the complete widget. Besides, current VLE
implementations are not aware of widget-based extensions and do not
expose fine-grained control over learning services.
At a more abstract level, the IMS Tools Interoperability specification (www.
imsglobal.org/ti/) provides guidelines to integrate third-party tools in a traditional
LMS. But the emerging trend in web-based learning scenarios is toward orchestrating
the functionalities and services provided by multiple sources, instead of
unrealistically requiring all these services to be present in the LMS. The variety of
Internet-based service functionalities makes interoperability difficult to approach by
simply providing a function-oriented API, such as the Open Knowledge Initiative
(www.okiproject.org) or the IMS Abstract Framework (www.imsglobal.org/af/).
Function orientation involves function-based access to a model of the service
provider that is often not exposed but assumed to be shared by all learning activities
that require such services. The resource-oriented ReST API [19] and the loosely
coupled architecture described in this work enhance the flexibility of service
management by keeping the connection between the VLE and learning services as simple
and decoupled as possible.
Ontologies and semantic web services have been widely used in e-learning,
among other things, to provide a richer framework for the expression of learning
object metadata [31], to formally describe teaching processes and learning designs
[32–34], to rank and match the web services that best adapt to a learning scenario
[35], or to enable selecting an educational offer [36]. Future challenges of the
semantic web in education have to do with the social web [37]. In general, semantic
web services are suited to building more loosely coupled systems that improve their
modularity, interoperability and extensibility [38].
6. Conclusions
Engineering a collaborative learning experience involves a development life-cycle
consisting of (1) authoring the learning activities, resources and services; (2) deploying
such elements on a virtual learning environment by assigning them to the
participants; and (3) enacting the course to start interacting with the available
resources and services. The main issue of engineering Internet-based services in
such collaborative learning experiences is how they can be seamlessly integrated
with the rest of the components of the learning environment in order to describe
authentic, assessment-oriented learning experiences.
This work presents a services-enhanced learning architecture that allows integrating
web services in the learning environment in a decoupled manner. Integration is
accomplished at two levels across all phases of the service life-cycle. The first level of
integration gives the learning environment visibility of inbound learning
resources through the service's ReST-based API. The second level of integration is based
on the provision of generic plus specific learning service ontologies that enable the
learning environment to manage the service life-cycle in a more decoupled fashion.
As further work, we aim at extending the SELA approach and its semantic model to
integrate and engineer other types of web services supporting advanced pedagogical
strategies, such as web-based project management applications for project-based
learning, and wikis for collaborative learning environments.
Acknowledgments
Research for this work is partly funded by the ASCETA project of the Government
of Andalucía, Spain (ref. PR09-TIC-5230).
References
1. P. A. Kirschner, Using integrated electronic environments for collaborative teaching/learning, Research Dialogue in Learning and Instruction 2(1) (2001) 1–10.
2. J. Torres, C. Cárdenas, J. M. Dodero and E. Juárez, Educational modelling languages and service-oriented learning process engines, in Advanced Learning Processes, M. B. Rosson (ed.) (InTech Pub, Vienna, 2010).
3. H. Vogten, H. Martens, R. Nadolski, C. Tattersall, P. van Rosmalen and R. Koper, CopperCore service integration, Interactive Learning Environments 15(2) (2007) 171–180.
4. L. De la Fuente, Y. Miao, A. Pardo and C. Delgado, A supporting architecture for generic service integration in IMS learning design, Proc. of EC-TEL, 2008, pp. 467–473.
5. J. Dalziel, Implementing learning design: The learning activity management system (LAMS), Proc. of 20th ASCILITE, Adelaide, Australia, 2003, pp. 593–596.
6. V. Devedzic, J. Jovanovic and D. Gasevic, The pragmatics of e-learning standards, IEEE Internet Computing, May-June 2007, pp. 19–27.
7. D. Dagger, A. O'Connor, S. Lawless, E. Walsh and V. P. Wade, Service-oriented e-learning platforms: From monolithic systems to flexible services, IEEE Internet Computing 11(3) (2007) 28–35.
8. P. J. Muñoz-Merino, C. Delgado-Kloos and J. Fernández-Naranjo, Enabling interoperability for LMS educational services, Computer Standards & Interfaces 31 (2009) 484–498.
9. S. Wilson, O. Liber, M. Johnson, P. Beauvoir, P. Sharples and C. Milligan, Personal learning environments: Challenging the dominant design of educational systems, Journal of e-Learning and Knowledge Society 3(2) (2007) 27–38.
10. G. Wiggins, Educative Assessment: Designing Assessment to Inform and Improve Student Performance (Jossey-Bass, San Francisco, CA, 1998).
11. I. Foster, C. Kesselman, J. Nick and S. Tuecke, The physiology of the grid: An open grid services architecture for distributed systems integration, Open Grid Service Infrastructure WG, Global Grid Forum, June 22, 2002.
12. H. Wang, J. Z. Huang, Y. Qu and J. Xie, Web services: Problems and future directions, Journal of Web Semantics 1(3) (2004) 309–320.
13. M. Keppell and D. Carless, Learning-oriented assessment: A technology-based case study, Assessment in Education 13(2) (2006) 179–191.
14. D. Carless, G. Joughin and M. M. C. Mok, Learning-oriented assessment: Principles and practice, Assessment & Evaluation in Higher Education 31(4) (2006) 395–398.
15. E. Wilde and R. J. Glushko, Document design matters, Communications of the ACM 51(10) (2008) 43–49.
16. J. Torres, J. M. Dodero, I. Aedo and T. Zarraonandia, An architectural framework for composition and execution of complex learning processes, Proc. of the 5th ICALT, Kaohsiung, Taiwan, 2005, pp. 143–147.
17. J. Torres, J. M. Dodero, I. Aedo and P. Díaz, Designing the execution of learning activities in complex learning processes using LPCEL, Proc. of the 6th ICALT, Kerkrade, The Netherlands, 2006, pp. 415–419.
18. M. S. Ibarra-Saiz, G. Rodríguez-Gómez, J. M. Dodero, M. A. Gómez-Ruiz, N. Gallego-Noche and D. Cabeza-Sánchez, Integration of EvalComix 1.0 into e-learning systems, Proc. of mICTE, Lisbon, Portugal, 2009.
19. J. M. Dodero and E. Ghiglione, ReST-based web access to learning design services, IEEE Transactions on Learning Technologies 1(3) (2008) 190–195.
20. C. Petrie, Practical web services, IEEE Internet Computing, Nov-Dec 2009, pp. 93–96.
21. R. T. Fielding and R. N. Taylor, Principled design of the modern web architecture, ACM Transactions on Internet Technology 2(2) (2002) 115–150.
22. J. Kopecky, T. Vitvar, C. Bournez and J. Farrell, SAWSDL: Semantic annotations for WSDL and XML Schema, IEEE Internet Computing, Nov-Dec 2007, pp. 60–67.
23. B. Olivier and C. Tattersall, The learning design specification, in R. Koper and C. Tattersall (eds.), Learning Design: A Handbook on Modelling and Delivering Networked Education and Training (Springer, 2005), pp. 21–40.
24. C. Padrón, J. Torres, J. M. Dodero, I. Aedo and P. Díaz, Learning web services composition and learner communities support for the deployment of complex learning processes, Proc. of the 4th ICALT, Joensuu, Finland, 2004, pp. 390–394.
25. Z. Jeremic, J. Jovanovic, D. Gasevic and M. Hatala, Project-based collaborative learning environment with context-aware educational services, in Learning in the Synergy of Multiple Disciplines, LNCS 5794, pp. 441–446.
26. J. M. Dodero, C. Tattersall, D. Burgos and R. Koper, Transformational techniques for model-driven authoring of learning designs, Proc. of the 6th ICWL, LNCS 4823 (2007) 230–241.
27. I. Martínez-Ortiz, J. L. Sierra and B. Fernández-Manjón, Authoring and reengineering of IMS learning design units of learning, IEEE Transactions on Learning Technologies 2(3) (2009) 189–202.
28. S. Wilson, P. Sharples and D. Griffiths, Extending IMS learning design services using widgets: Initial findings and proposed architecture, in Current Research on IMS Learning Design and Lifelong Competence Development Infrastructures, Barcelona, 2007.
29. S. Wilson, P. Sharples, D. Griffiths and K. Popat, Moodle wave: Reinventing the VLE using widget technologies, Workshop on Mash-up Personal Learning Environments, EC-TEL, 2009, pp. 47–58.
30. M. Ebner, R. Klamma and S. Schaffert, Mashups for learning, International Journal of Emerging Technologies in Learning 5(1) (2009) 4–6.
31. M. A. Sicilia and E. G. Barriocanal, On the convergence of formal ontologies and standardized e-learning, International Journal of Distance Education Technologies 3(2) (2005) 13–29.
32. V. S. Belesiotis and N. Alexandris, A scenario for the development and use of teaching oriented ontologies, International Journal of Metadata, Semantics and Ontologies 4(3) (2009) 183–195.
33. M. A. Sicilia, Semantic learning designs: Recording assumptions and guidelines, British Journal of Educational Technology 37(3) (2006) 331–350.
34. S. Dietze, A. Gugliotta and J. Domingue, Using semantic web services to enable context-adaptive learning designs, Journal of Interactive Media in Education, 2007/03.
35. B.-Y.-S. Lau, C. Pham-Nguyen, C.-S. Lee and S. Garlatti, Semantic web service adaptation model for a pervasive learning scenario, Proc. of IEEE Conf. on Innovative Technologies in Intelligent Systems and Industrial Applications, Malaysia, 2008, pp. 98–103.
36. S. Kumar, K. Kumar and A. Jain, A semantic web technology based framework for educational-offer selection in higher education, International Journal of Metadata, Semantics and Ontologies 4(3) (2009) 165–182.
37. J. Jovanovic, D. Gasevic, C. Torniai, S. Bateman and M. Hatala, The social semantic web in intelligent learning environments: State of the art and future challenges, Interactive Learning Environments 17(4) (2009) 273–309.
38. D. Millard, K. Doody, H. Davis, L. Gilbert, Y. Howard, F. Tao and G. Wills, Semantic web services for e-learning, International Journal of Knowledge and Learning 4(3&4) 298–315.