
W. Shen et al. (Eds.): CSCWD 2007, LNCS 5236, pp. 443–454, 2008. © Springer-Verlag Berlin Heidelberg 2008

A Proposal for Recognizing Reputation within Communities of Practice

Claudia C.P. Cruz1, Maria Teresa A. Gouvêa1, Claudia L.R. Motta1, and Flavia M. Santoro2

1 Programa de Pós-Graduação em Informática, UFRJ, Brazil

2 Departamento de Informática Aplicada, UNIRIO, Brazil [email protected], [email protected], [email protected], [email protected]

Abstract. One of the most critical points for Communities of Practice is the credibility of shared information. Professionals only feel secure to reuse solutions presented and discussed within the scope of the community if a process of trust is well established among its members. We propose the use of Reputation Systems to promote trust among participants and, consequently, stimulate participation through the recording of relevant information, such as ideas, opinions, solutions, and recommendations.

Keywords: Reputation Systems, CSCW, Collaborative Learning, Organizational Learning, Community of Practice, Social Networks.

1 Introduction

Knowledge is becoming increasingly important for achieving competitive advantage. The creation and dissemination of organizational knowledge is a social process by which knowledge is shared among members of an organization [16]. Thus, new methods for continuous learning become necessary to support knowledge management. A good learning capacity improves problem solving by rendering design processes, which are generally knowledge intensive, more efficient and effective [3].

In this context, the Community of Practice (CoP) is a means adopted by companies to stimulate learning and sharing of knowledge. A Community of Practice can be defined as a group of people who interact regularly to share work experiences, interests or objectives [18]. In a CoP, professionals can describe daily work problems and share solutions with their colleagues.

However, it is neither an easy task to stimulate the development of such communities, nor to support and integrate them throughout the organization. Companies must implement a number of strategies to encourage the sharing of reliable, high-quality information within the community.

One solution to this problem is the adoption of loyalty mechanisms, which aim at stimulating active member participation in CoPs; loyalty mechanisms have been shown to contribute to increased participation [6].


The vital point for participation is the process of establishing trust, whose main obstacles are participants' skepticism, isolation, and delays in responding to group demands. It is hard to trust the competence of people we do not know, especially in the current Internet scenario, in which a great deal of information is available from unreliable sources.

One possible approach is to explore the opportunities embodied in existing relationships and to motivate commitment, as well as honesty, behind a broad range of information [18][10]. From this perspective, we suggest that Reputation Systems can be applied to promote trust and stimulate CoP member participation through the recording of outstanding information, such as ideas, opinions, solutions and recommendations, endorsed by individuals of “good reputation”, and, as a result, promote the use of the community's knowledge base. We argue that a professional only feels secure to reuse a design solution presented and discussed within the scope of the community if a process of trust is well established among its members.

In this paper, we propose the use of Reputation Systems to promote trust among participants and consequently stimulate participation through the recording of relevant information (ideas, opinions, solutions, recommendations).

The paper is organized as follows: Section 2 reviews related work on Reputation Systems, their concepts and goals; Section 3 presents requirements for using Reputation Systems in the CoP context; Section 4 illustrates ActivUFRJ, an environment which implements such ideas; and, finally, Section 5 discusses our conclusions and future work.

2 Related Work

According to Lopes [11] and Jøsang et al. [9], the operation of Reputation Systems is based on two main concepts: reputation and trust. Reputation reflects the general opinion people have about someone or something. Generally, this opinion is built up from information provided by members of a community about their past experiences with the entity.

In virtual environments that use Reputation Systems, users can decide whether or not to trust an individual once they know his/her reputation. In addition to being trustworthy, it is necessary that the individual have positive attitudes (honest and collaborative ones) toward the entities that depend on him/her.

Reputation Systems represent an alternative to help users themselves create reliable relationships on the Internet [15], allowing them to evaluate individuals' actions, view reputations based on community opinion, and create their own trust networks. In general, these systems attach values to individuals' interactions in order to calculate their reputation. These systems also need immunization mechanisms against the actions of individuals who make use of dishonest evaluations to enhance their own reputation and reduce other people's, so as to benefit from the services provided.

Such immunization mechanisms include: making the reputation estimate less vulnerable to the actions of users displaying fraudulent behavior; and preventing anonymity, or allowing its controlled usage, in order to protect users from possible dishonest evaluations [1].


Currently, reputation and immunization mechanisms are used in auction sites, e-commerce, news-sharing, and expert-site services, which need to foster trust among users in order to ensure that more people use them.

2.1 Reputation Model in Auction Sites

Auction sites (eBay.com, MercadoLivre.com.br) collect user feedback on the transactions performed. The buyer/seller evaluates his/her counterpart in a positive manner (+1) when the negotiation meets his/her expectations, in a negative manner (-1) when it does not, and in a neutral manner (0) when a situation arises in which he/she does not feel able to perform an evaluation, e.g., withdrawal from a deal.

A buyer's/seller's reputation is represented by the balance of positive and negative evaluations he/she receives from the different users with whom he/she has negotiated. For example, if a seller receives several positive evaluations from the same buyer, the system only takes one of these evaluations into account. This ensures that the seller's reputation reflects the opinion of the several buyers who have negotiated with him/her, not that of a single one.
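This counting rule can be made concrete with a short sketch. It is a minimal illustration of the balance-with-deduplication idea, not the actual algorithm used by eBay or MercadoLivre; names and data are hypothetical.

```python
from typing import Dict, List, Tuple

def auction_reputation(feedback: List[Tuple[str, int]]) -> int:
    """Compute an auction-style reputation score.

    `feedback` is a list of (rater_id, rating) pairs, where rating is
    +1 (positive), 0 (neutral) or -1 (negative). Only one rating per
    distinct rater is counted (here, the most recent one), so repeated
    praise from a single buyer cannot inflate the score.
    """
    latest_by_rater: Dict[str, int] = {}
    for rater_id, rating in feedback:      # later entries overwrite earlier ones
        latest_by_rater[rater_id] = rating
    return sum(latest_by_rater.values())   # balance of positives and negatives

# Example: four distinct buyers, one of whom rated the seller twice.
history = [("ana", +1), ("bruno", +1), ("ana", +1), ("carla", -1), ("davi", 0)]
print(auction_reputation(history))  # 1  (ana counts once: +1 +1 -1 + 0)
```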

2.2 Reputation Model in e-Commerce

In e-commerce sites (Amazon.com, Epinions.com), users evaluate products available for purchase through ratings and comments. These evaluations are used to recommend similar products to the users themselves or to other users with similar tastes. In order to ensure the credibility of recommendations, the reputation system collects users' opinions indicating whether an evaluation was useful to their purchase decision. Thus, reviewers gain reputation points for each positive return and lose points for negative returns.

The recommendation system assigns priority to products evaluated by high-reputation people. In epinions.com, users can add reviewers to their “Web of Trust”. The system also allows users to block reviewers whose opinions they do not trust. Thus, the system can put forth customized recommendations based on users’ trust networks and avoid recommending items evaluated by people they have blocked.
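A simplified sketch of the mechanics just described (helpfulness feedback on reviews plus a per-user Web of Trust and block list) might look as follows; the data structures and the fixed trust bonus are illustrative assumptions, not Amazon's or Epinions' actual implementation.

```python
from dataclasses import dataclass
from typing import Dict, List, Set

@dataclass
class Reviewer:
    name: str
    points: int = 0  # +1 per "helpful" vote, -1 per "not helpful" vote

@dataclass
class Review:
    reviewer: Reviewer
    product: str
    rating: int  # e.g. 1..5 stars

def register_feedback(review: Review, helpful: bool) -> None:
    """A reader indicates whether a review helped the purchase decision."""
    review.reviewer.points += 1 if helpful else -1

def recommend(reviews: List[Review], trust: Set[str], blocked: Set[str]) -> List[str]:
    """Rank products by their best reviewer's reputation.

    Reviews by blocked users are ignored, so items reviewed only by blocked
    users are never recommended; reviewers in the user's Web of Trust get a
    fixed bonus on top of their accumulated points.
    """
    best: Dict[str, float] = {}
    for r in reviews:
        if r.reviewer.name in blocked:
            continue
        score = r.reviewer.points + (10 if r.reviewer.name in trust else 0)
        best[r.product] = max(best.get(r.product, float("-inf")), score)
    return [p for p, _ in sorted(best.items(), key=lambda kv: kv[1], reverse=True)]
```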

2.3 Reputation Model in News Sharing

In the news-sharing site Slashdot.org, users post and comment on news. Comments can be evaluated by all other users (moderators) through ratings, which count as positive or negative points for the individual making the comment. A second evaluation layer, in which meta-moderators judge the moderators' evaluations, was added in order to minimize the action of unfair or dishonest moderators.

The meta-moderators are part of a select group of individuals with long-term registration in the system. Users viewed as unfair or dishonest by the meta-moderators lose reputation points or are banned from the system, depending on the seriousness of their unfair behavior.
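The two evaluation layers can be sketched as below; the karma bookkeeping and the banning threshold are assumptions made for illustration only and do not reproduce Slashdot's actual rules.

```python
from dataclasses import dataclass

@dataclass
class Moderator:
    name: str
    karma: int = 0
    banned: bool = False

@dataclass
class Moderation:
    moderator: Moderator
    comment_id: int
    delta: int  # +1 or -1 applied to the comment by the moderator

def meta_moderate(moderation: Moderation, fair: bool, ban_threshold: int = -5) -> None:
    """Second evaluation layer: a meta-moderator judges a moderation.

    Fair moderations earn the moderator a karma point; unfair ones cost one,
    and moderators whose karma falls below a threshold are banned
    (threshold chosen only for this sketch).
    """
    moderation.moderator.karma += 1 if fair else -1
    if moderation.moderator.karma <= ban_threshold:
        moderation.moderator.banned = True
```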


2.4 Reputation Model in Expert-Sites

In the expert site AllExperts.com, users sign up as volunteers to answer questions in certain categories of knowledge. The service only accepts as specialists individuals who actually demonstrate the skills necessary to answer the questions.

Users assign grades from 0 to 10 on different criteria for evaluating specialist services, such as knowledge, cordiality, clarity of the answer, and response time. Specialists accrue points for the grades received, and those with the highest scores make up the list of the best specialists in a certain category of knowledge. The system also enables users to look up the history of the most recent evaluations of each specialist, as well as the breakdown of the scores given for each assessment criterion over time.
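A rough sketch of this per-criterion scoring and ranking scheme is given below; the data structures and function names are our own, and AllExperts' real scoring may differ.

```python
from collections import defaultdict
from statistics import mean
from typing import Dict, List

CRITERIA = ("knowledge", "cordiality", "clarity", "response_time")

# evaluations[specialist] is a list of per-criterion grade dicts (0..10 each).
evaluations: Dict[str, List[Dict[str, int]]] = defaultdict(list)

def rate(specialist: str, grades: Dict[str, int]) -> None:
    """Record one user evaluation of a specialist (grades 0..10 per criterion)."""
    assert set(grades) == set(CRITERIA) and all(0 <= g <= 10 for g in grades.values())
    evaluations[specialist].append(grades)

def total_score(specialist: str) -> int:
    """Specialists accrue points for every grade received."""
    return sum(sum(e.values()) for e in evaluations[specialist])

def best_specialists(top_n: int = 10) -> List[str]:
    """Ranking of the specialists with the highest accumulated scores."""
    return sorted(evaluations, key=total_score, reverse=True)[:top_n]

def criterion_breakdown(specialist: str) -> Dict[str, float]:
    """Average grade per criterion, as shown in the evaluation history."""
    return {c: mean(e[c] for e in evaluations[specialist]) for c in CRITERIA}
```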

2.5 Considerations about Mechanisms Used in the Models Presented

The strategy of calculating reputation by means of the sum or average of the ratings received is used in all the models analyzed. However, it is important to notice the concern of certain environments with rendering the reputation system less vulnerable to the action of dishonest evaluators, by means of measures such as: care in selecting feedback from different users, aiming at precise reputation measures; meta-moderator action and selection; and the creation of trust networks and blocking lists.

Some studies [1] suggest analyzing the frequency with which users are evaluated by particular groups of people, so as to identify possible attacks by groups exploiting system vulnerabilities. This analysis can be performed over the history of the evaluations received by users through time. For this purpose, the system should not allow users to sign up more than once under different identities.
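This kind of frequency analysis can be approximated as in the sketch below, which flags periods in which one rater dominates a user's incoming ratings; the thresholds are illustrative assumptions, not values prescribed in [1].

```python
from collections import Counter
from typing import List, Tuple

# Each entry of a user's rating history: (rater_id, period), e.g. period = "2007-Q1".
RatingEvent = Tuple[str, str]

def suspicious_periods(history: List[RatingEvent],
                       max_share: float = 0.5,
                       min_ratings: int = 10) -> List[str]:
    """Flag periods in which a single rater accounts for more than
    `max_share` of the ratings a user received -- a possible sign of
    collusion. Thresholds are chosen only for this sketch."""
    by_period = Counter(p for _, p in history)
    flagged = []
    for period, total in by_period.items():
        if total < min_ratings:
            continue
        raters = Counter(r for r, p in history if p == period)
        _, top_count = raters.most_common(1)[0]
        if top_count / total > max_share:
            flagged.append(period)
    return flagged
```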

In the next section, we explain how those mechanisms can be applied to CoPs. Furthermore, we present our proposal for implementing a reputation model in the CoP context, in which the focus comprises the interactions geared to learning and knowledge sharing. CoPs need a different reputation service because the community is much smaller than on e-commerce sites, the activity is intellectual rather than commercial, and the system knows something about its users because it serves an existing community (a university or a company).

3 Reputation Systems Applied to CoPs

In a CoP, a Discussion Forum application may fit users' needs. Nevertheless, content directed to group interests and previously validated by experts might attract much more attention, providing a differentiating factor with regard to other environments. Therefore, it is critical to assess the computational environment infrastructure for this type of community.

The greater the level of information trustworthiness, the greater users’ approval will be. In other words, unreliable information can actually make users abandon the environment [5].


3.1 Problems Identified in CoPs

Facilitating the localization of experts in CoPs can help sustain user participation, minimizing the following problems:

− Information Overload: The amount of information available on a daily basis to a professional can be staggering. The volume of information is more than any one person can consume, generating an information overload problem. It is sometimes necessary to make an enormous effort and ask for specialists’ aid to find safe information.

− Sensation of Wasted Time: Spending too much time to find desired information often causes anguish and dissatisfaction, because of the inability to immediately perceive which way to proceed and to come across such information directly and quickly.

− Lack of Trustworthiness of the Information: Although a participant can find diverse documents shared within the community, the lack of specialists to validate these documents generates doubt as to how much this information can be trusted or to what extent it is a “safe source”.

− Deficiency in Recognizing Productive and Trustworthy Individuals: Since communities have a dynamic life cycle, recognizing authentic specialists is a problem, as are the selection, editing and acceptance of the disclosed information.

3.2 Reputation Estimation Strategy

When CoP members do not know each other, Reputation Systems can assist them in finding specialists through their contributions and through other members' feedback on the extent to which those contributions were good solutions to a problem.

Another important issue to be observed is that the reputation concept is extremely context dependent. Firstly, having a “good reputation” can have different meanings from team to team [12]. Secondly, individual reputation can vary according to one's specific knowledge area [1]. Therefore, in situations in which individuals move from project to project, reputation can increase or decrease over time, due to one's performance in the new contexts presented.

In a CoP, two situations may occur. When an individual enters the community and nobody knows him/her, he/she needs to “build” a reputation. On the other hand, at times he/she already has a reputation in another context that can assist in these new community activities. In this case, members who have known this individual in other contexts can present him/her as a specialist on a particular subject.

Recommendations explicitly made by specialists known within the community, or inferred implicitly through a social network, can be essential for deciding urgent problems, such as in an emergency situation [14]. Therefore, in the Reputation System model for CoP environments, mechanisms which facilitate the localization of experts through distinction (ranking), community feedback (aggregated ratings), and social networks must be available.


3.3 Unfair Ratings

Motivating users to contribute and provide adequate ratings can be difficult when they are under observation by the company. Reputation Systems expose an individual's image to his/her colleagues and to the company, which may bring about diverse problems with unfair ratings.

In environments where employees rate their peers, the adoption of a reward program based on reputation could create a competitive environment. Some employees may wish to take advantage of this situation for personal benefit, for example, by making agreements among themselves to always assign good ratings to their friends' contributions and bad ratings to everyone else's. Thus, the attempt to increase user participation could generate a set of unfair ratings, harming the reliability of the reputation estimates.

Employees can also overload the environment with low-quality information in an attempt to accumulate more and more points. Furthermore, fearing exposure and retaliation from colleagues, they may avoid assigning ratings, or provide only positive ratings. Such attitudes would generate an excess of non-validated or low-quality information, or an overload of positive evaluations. This would be as destructive for reputation estimates as a lack of participation.

In order to avoid these situations, the environment must be able to detect undesirable behavior, punish it, or, at least, alert the community to its occurrence. Consequently, it is necessary for the system to apply immunization mechanisms against unfair ratings.

3.4 Immunization Mechanisms

Dellarocas studied the effects of unfair ratings in auction sites and proposed immunization mechanisms to mitigate the vulnerability of Reputation Systems to false positive and negative ratings [4].

− Anonymity: Anonymity can be used in a controlled manner. In such a scheme, the marketplace knows the true identity of all participants, but keeps their identity concealed from buyers and sellers during a transaction, or assigns them pseudonyms that change from one transaction to the next. Thus, since the subject cannot be recognized, injurious actions can be prevented. Therefore, anonymity helps to control bad-mouthing (false negative ratings). However, this mechanism does not guarantee prevention of discriminatory behavior (false positive ratings), because “friends' pseudonyms” can always be identified.

− Median: The use of the average makes the system vulnerable to malicious users who try to strategically distribute their ratings to maximize their own reputation and minimize others'. The use of the median in reputation computation renders this kind of action much harder to carry out (a numerical sketch follows this list).

− Frequency Analysis: The frequency with which the same group rates a user can point to the formation of a group of unfair raters. However, in some cases, honest users keep a group of raters who are their “loyal customers”. Therefore, individual reputation histories must be considered. If the reputation of a user is calculated over particular periods of time, a sudden change in his/her most recent ratings can quickly be identified. In this case, possible unfair raters may be identified by frequency analysis. This mechanism can be used against both discriminatory behavior and bad-mouthing.
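To make the contrast between the average and the median concrete (see the Median item above), the following numerical sketch shows how a few strategically placed extreme ratings shift the mean much more than the median; the ratings are invented for illustration.

```python
from statistics import mean, median

honest_ratings = [4, 4, 5, 4, 5]           # ratings on a 1..5 scale
attacked = honest_ratings + [1, 1, 1]      # a colluding clique piles on minimum ratings

print(mean(honest_ratings), median(honest_ratings))  # 4.4  4
print(mean(attacked), median(attacked))              # 3.125  4.0

# The mean drops from 4.4 to 3.125, while the median stays at 4:
# a handful of strategic ratings shifts the average far more easily
# than it shifts the median.
```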


4 Reputation System Model for CoPs

We propose a Reputation System Model for CoPs which considers the possible interaction types used in virtual environments, the estimation strategies applicable to those interaction types, and the adoption of immunization mechanisms against possible unfair ratings (Fig. 1 depicts the model).

Fig. 1. Reputation System for CoP Model (Interaction Types; Estimation Strategy: Ranking, Aggregated Ratings, Social Networks; Immunization Mechanisms: Anonymity, Median, Frequency Analysis)

This model is being developed in the context of ActivUFRJ (Collaborative Environment for Integrated and Virtual Work of UFRJ), which supports the formation and maintenance of CoPs at a public federal university in Brazil [7].

ActivUFRJ implements three main entities: User, Community, and Artifact, in which:

− “User” uniquely represents each person inside the system through his profile page;

− “Community” represents the space for meeting and sharing artifacts among members; and,

− “Artifact” represents any type of material indicated by a community member for consultation and evaluation. This material can be, for example, text files, databases, software, media, or sites about interesting subjects and projects.

The user profile page stores personal data, message history, published artifacts, and the list of communities in which he/she participates (Fig. 2).

The community page is the space where users can record events and acknowledgments, access other members' profiles, and consult pages published for the community (Fig. 3).

The artifact page bears the following information: artifact name, name of the person who published the page, publication date, update description, access or download links, and a rating form on which members can give ratings and write comments about the artifact (Fig. 4).
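The entities and page contents described above can be summarized in a small data-model sketch; the class and field names below are our own illustration and do not reflect ActivUFRJ's actual implementation.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class Rating:
    author: "User"
    grade: int           # 1 (Very bad) .. 5 (Excellent), as in Table 1
    comment: str = ""

@dataclass
class Artifact:
    name: str
    publisher: "User"
    published_on: date
    description: str                     # update description
    link: str                            # access or download link
    ratings: List[Rating] = field(default_factory=list)

@dataclass
class Community:
    name: str
    members: List["User"] = field(default_factory=list)
    events: List[str] = field(default_factory=list)
    artifacts: List[Artifact] = field(default_factory=list)

@dataclass
class User:
    name: str                            # uniquely identifies the person in the system
    messages: List[str] = field(default_factory=list)
    artifacts: List[Artifact] = field(default_factory=list)
    communities: List[Community] = field(default_factory=list)
```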


Fig. 2. User Page in ActivUFRJ

Fig. 3. Community Page in ActivUFRJ

The ActivUFRJ proposal is to draw on users' ratings in order to recommend artifacts to the community. To motivate quality participation, a punctuation strategy is being implemented, in which users gain points for good participation and contributions, and lose points for bad contributions (Table 1; a scoring sketch follows the table).


Fig. 4. Artifact Page in ActivUFRJ

Table 1. Punctuation Strategy in ActivUFRJ

Participation                      Punctuation for Participation
User rates artifact                +1
User qualifies review              +1

Punctuation in grade scale for rated contributions
Grade received                     1 (Very bad)   2 (Bad)   3 (Good)   4 (Very good)   5 (Excellent)
User has his artifact rated        -2             -1        +1         +2              +3
User has his review qualified      -2             -1        +1         +2              +3
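Table 1 can be read as a simple scoring function. The sketch below transcribes it directly; the function names are illustrative, and the real ActivUFRJ code may organize this differently.

```python
# Points the author of a contribution gains or loses according to the grade
# it receives (Table 1): 1 Very bad, 2 Bad, 3 Good, 4 Very good, 5 Excellent.
GRADE_POINTS = {1: -2, 2: -1, 3: +1, 4: +2, 5: +3}

def points_for_rating_artifact() -> int:
    """User rates an artifact: +1 participation point for the rater."""
    return 1

def points_for_qualifying_review() -> int:
    """User qualifies (rates) a review: +1 participation point for the rater."""
    return 1

def points_for_rated_contribution(grade: int) -> int:
    """User has his/her artifact rated or review qualified: points depend on
    the grade received, from -2 (Very bad) up to +3 (Excellent)."""
    return GRADE_POINTS[grade]

# Example: a member rates two artifacts (+2), one of her artifacts is rated
# "Very good" (+2), and a review she wrote is qualified as "Bad" (-1).
total = (2 * points_for_rating_artifact()
         + points_for_rated_contribution(4)
         + points_for_rated_contribution(2))
print(total)  # 3
```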

However, in order to establish trust in the recommendations among users, it is necessary that they identify specialists in particular subjects, or, at least, people with similar interests.

Fig. 5 represents the Reputation System Model extended for the main types of interaction within ActivUFRJ.

4.1 Functionality

The following functionality addresses the issues discussed above.

− Recommendation by Specialists: When visiting a community page, users can find a ranking of the members who have achieved the best participation ratings. The ranking serves as a mechanism for finding specialists in the community's areas of interest.


Fig. 5. Main Interaction Types in ActivUFRJ (diagram relating User, Community, Profile, Users' Ranking, General Punctuation, Ranking Position, Trust Networks, Artifact Amount, Review Amount, Artifact, and Review Ranking)

− Users' Reputation: Other reputation information, such as general punctuation, position in the ranking, and the number of published artifacts and written reviews, assists users in deciding whether or not to trust a member. It is also possible, through the reviews, to find specialists or people with similar interests. On the artifact page, users can find the ranking of reviews as rated by the community and access the authors' profile pages.

− Trust Networks: In a CoP, a member may wish to know people with a high reputation who are developing projects on similar subjects. Thus, when visiting a member's profile, users can add him/her to their own trust network and visualize artifacts evaluated by people who are part of that network (see the sketch below).
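A minimal sketch of the trust-network functionality described in the last item follows; the dictionaries used to hold trust relations and artifact evaluators are hypothetical stand-ins for ActivUFRJ's storage.

```python
from typing import Dict, List, Set

# trust_network[user] is the set of members that user has added to his/her
# own trust network; artifact_raters[artifact] is the set of members who
# have evaluated that artifact. Both structures are assumptions of this sketch.
trust_network: Dict[str, Set[str]] = {}
artifact_raters: Dict[str, Set[str]] = {}

def add_to_trust_network(user: str, member: str) -> None:
    """When visiting a member's profile, the user adds him/her to the trust network."""
    trust_network.setdefault(user, set()).add(member)

def artifacts_from_trusted_members(user: str) -> List[str]:
    """Artifacts evaluated by at least one member of the user's trust network."""
    trusted = trust_network.get(user, set())
    return [a for a, raters in artifact_raters.items() if raters & trusted]
```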

The use of the median for reputation calculation and of anonymity when making artifacts or reviews available is being evaluated. Currently, the system is being extended to store individuals' reputation histories, to enable future analyses of unfair rating occurrences.

5 Conclusions and Future Work

In this paper, we have addressed the benefits that Reputation Systems can bring to CoPs, such as: recognizing individuals who hold knowledge on specific subjects of community interest; making information and recommendations validated by them more trustworthy; facilitating the localization of specialists in emergency situations through distinction and social networks; and assisting new participants in identifying specialists as well as in building reliable relationships.

We also highlight the importance of developing safe and robust reputation systems, using immunization mechanisms to counter the actions of possibly dishonest raters.


The model proposed has been implemented in the ActivUFRJ environment, in which tests and case studies will be carried out in order to identify potential solutions to existing problems within CoPs.

We strongly believe that appropriate strategies can influence the desired behavior of an organization and its community. For example, specialists can rate items, making them reliable and generating greater credibility among members, since members know that they have a “safe source” provided by people knowledgeable on the subject.

The next steps for this work are implementing functionality to fulfill the requirement of preventing unfair ratings, and continuing to use the environment and perform case studies to obtain greater feedback on Reputation Systems applied to CoPs.

References

1. Chen, M., Singh, J.P.: Computing and Using Reputation for Internet Ratings. In: EC 2001, Tampa, Florida, USA, October 14–17 (2001)

2. Cruz, C.C.P., Motta, C.L.R.: Um Modelo de Sistema de Reputação para Comunidades Virtuais. In: XVII Simpósio Brasileiro de Informática na Educação, Brasília-DF, Brazil, November 8–10 (2006)

3. Davenport, T.H.: Thinking for a Living: How to Get Better Performances And Results from Knowledge Workers. Harvard Business School Press (2005)

4. Dellarocas, C.S.: Building Trust Online: The Design of Robust Reputation Reporting Mechanisms for Online Trading Communities. In: Building Trust Online, ch. VII (2004)

5. Gouvêa, M.T.A.: Loyalty Model for Communities of Practice. Master Dissertation, NCE/UFRJ, Rio de Janeiro, Brazil (2005) (in Portuguese)

6. Gouvêa, M.T.A., Motta, C.L.R., Santoro, F.M.: Recommendation as a Mechanism to Induce Participation in Communities of Practice. In: Shen, W.-m., Chao, K.-M., Lin, Z., Barthès, J.-P.A., James, A. (eds.) CSCWD 2005. LNCS, vol. 3865, pp. 92–101. Springer, Heidelberg (2006)

7. Hildenbrand, B.A.: ActivUFRJ: Ambiente Colaborativo de Trabalho Integrado e Virtual. Projeto Final de Curso (Bacharelado em Ciência da Computação). Universidade Federal do Rio de Janeiro, Rio de Janeiro-RJ (2005)

8. Jensen, C., Davis, J., Farnham, S.: Finding Others Online: Reputation Systems for Social Online Spaces. In: Proceedings of the SIGCHI conference on Human factors in computing systems: Changing our world, changing ourselves, Minneapolis, Minnesota, USA (2002)

9. Jøsang, A., Ismail, R., Boyd, C.: A Survey of Trust and Reputation Systems for Online Service Provision. Distributed Systems Technology Centre and Information Security Research Centre, Queensland University of Technology, Brisbane, Australia (2006)

10. Lesser, E., Prusak, L.: Communities of practice, social capital and organizational knowledge. White paper, IBM Institute for Knowledge Management, Cambridge (1999)

11. Lopes, A.C.F.: Um método para a geração de estimativas de reputação mais precisas perante a oscilação de comportamento das entidades avaliadas. Dissertação de Mestrado (Programa de Pós-Graduação em Computação). Universidade Federal Fluminense, Niterói-RJ (2006)

12. Mui, L., Halberstadt, A., Mohtashemi, M.: Notions of Reputation in Multi-Agent Systems: A Review. In: AAMAS 2002, Bologna, Italy (2002)


13. O'Donovan, J., Smyth, B.: Trust in Recommender Systems. In: Proceedings of the 10th International Conference on Intelligent User Interfaces (IUI 2005), San Diego, California, USA (2005)

14. Resnick, P., Varian, H.R.: Recommender Systems. Communications of the ACM 40(3), 56–58 (1997)

15. Resnick, P., Zeckhauser, R., Friedman, E., Kuwabara, K.: Reputation Systems. Communications of the ACM 43(12), 45–48 (2000)

16. Senge, P.M.: The Fifth Discipline: The Art and Practice of the Learning Organization. Doubleday, New York (1990)

17. Wenger, E.C., Snyder, W.M.: Communities of Practice - The Organizational Frontier. Harvard Business Review, 139–145 (2000)

18. Wenger, E.C., Snyder, W.M., McDermott, R.: Cultivating Communities of Practice - A Guide to Managing Knowledge. Harvard Business School Press, Cambridge (2002)