
Information Management Best Practices - Volume 1

“Regardless of the kind of information you need to manage, this book will make your projects better.”
Bob Boiko, University of Washington

“The TIMAF organization proves its dedication to raising the bar for information management practitioners everywhere. It has assembled the best thought leaders in the field to share insights from case studies which are both actionable and universally relevant. Even the most experienced IM professionals will learn something new with each turn of the page.”
Scott Liewehr, President, CM Professionals

“It is, quite frankly, the best collection of case studies and solutions I’ve run across and will be an invaluable resource for our readers. The reports are solid, practical and real-world examples of how to do things right.”
Hugh McKellar, Editor in Chief, KMWorld Magazine

Colophon

Title: TIMAF Information Management Best Practices – Volume 1
Editors: Bob Boiko (USA), Erik M. Hartman (NL)
Copy-editors: Jonah Bull (USA), Jenny Collins (USA), Elishema Fishman (USA)
Publisher: Erik Hartman Communicatie (NL)
Edition: Volume 1, 1st impression, November 2010
ISBN: 978-94-90164-03-4
Design & Layout: Nevel Karaali (NL)
Print: Wöhrmann Print Service (NL)

© 2010, TIMAF

All rights reserved. No part of this publication may be reproduced in any form by print, photo print, microfilm or any other means without written permission from the publisher. Although this publication has been composed with much care, neither author, nor editor, nor publisher can accept any liability for damage caused by possible errors and/or incompleteness in this publication.

Trademark Notice

TIMAF® is a Registered Trade Mark and Registered Community Trade Mark of the Office of Government Commerce, and is Registered in the U.S. Patent and Trademark Office.

Please contact the editors for ideas, suggestions and improvements at [email protected].


About the TIMAF Library

This ‘Information Management Best Practices’ book is a publication in the TIMAF Library. The publications in the TIMAF Library are aimed at promoting Information Management and are published on behalf of TIMAF. TIMAF, the Information Management Foundation, is an initiative of information management practitioners to provide a strong and clear foundation for information management.

TIMAF encourages authors from around the world, who are experts in their particular information management sub-discipline, to contribute to the development of the TIMAF publications. Are you interested in sharing your ideas and experiences online with the TIMAF Community? Visit www.timaf.org and join the discourse. Have you experienced the merits of a specific approach to information management strategies, methodologies, technologies or tools? Submit a proposal according to the requirements listed in the ‘Call for Best Practices’ at www.timaf.org.

The following publications are available in the TIMAF Library:

Introduction books: Information Management Framework (paper edition - release: September 2011)

Best Practices books: Information Management Best Practices 2009 Sneak Preview (online edition - www.timaf.org); Information Management Best Practices – Volume 1 (paper edition - ISBN 978-94-90164-03-4)

Pocket guides: Information Management Framework – A Pocket Guide (paper edition - release: November 2011)

Social Networks: Information Management Framework Wiki (www.timaf.org/wiki)

We will publish new books and other resources on a regular basis. For further enquiries about the TIMAF Library, please visit www.timaf.org or send an email to [email protected].


Foreword by Bob Boiko

Every profession has to start somewhere. Before aerospace engineers called themselves that, they were still doing aerospace engineering. They simply did not differentiate themselves from the mechanical, electrical, and other sorts of engineers that were around. What happened at the point where they became a profession? A few of them got together and said “aren’t we more like each other than we are like them?” And a new profession was born.

Who are “we” and are we also more like each other than like “them?” A few of us think so. We think “they” are the managers, programmers, network professionals, writers, architects, designers and other professionals we used to be. “We” are the people who believe that whatever else we do, we are really managing information.

Just as the aerospace engineer is still an engineer, we are still the professionals we once were. But we have decided that the creation, storage, distribution, and use of information drives and guides the rest of what we do.

So we have decided to try to be information managers. But what does an information manager do? How does she do her work? Where, when and with whom does she work? If we really are more like us than like them, we need to answer these questions. That is what this book is about.

What does an information manager do? Read on and you will find out. How does she do it? There are step-by-step instructions; read on. What we have compiled is not exhaustive, but it is a start. It covers a wide enough range of “our” activities to show that different projects are all part of the same profession.

Along with the practice of a profession come its principles. We have tried to provide just enough of the principles of information management to tie our practices together. However, this is not the place for abstract discussions. This book is focused on showing who we are by describing exactly what we do.

If you are or want to be an information manager, you will find projects in this book that are like the ones you do or want to do. If you are “one of us” you will see far more similarity than difference in the projects we present. And, if you are like us, as you read, you will also be reaching for what it means to be an information manager and will join us in figuring that out.

Bob Boiko
Seattle, USA, 2010


Foreword by Erik M. Hartman

“Man has always been, and will always be, in search of information. It’s a primary objective. Because people want to know.” (G.C. Nielen, The purpose of information, 1976)

Personally, I have always been fascinated by the phenomenon of information. It’s the main reason why I studied Information at university. It’s the reason why I made information my profession. It’s the reason why I’m always keen on meeting other people who share the same fascination, be they IT or information professionals, artists, writers, or philosophers. I have learned so much from them and I hope I will continue learning from them.

The Information Management Foundation (TIMAF) is an initiative that tries to connect people who share a fascination for information and to help them share their insights, experiences and vision, so we can all learn.

The outcome of this sharing will hopefully be a better understanding of information, information management, the use of information, and how to add more value to information for people and organizations. I think this is important because, living as we do in the Information Age, we spend a lot of time, money and energy on managing information.

I can only agree when, in his best practice article, Alan Pelz-Sharpe says: “Information management technology often impacts a much broader array of processes and people within an organization than other types of technology.” As information management professionals, we see the reality of this claim every day.

This TIMAF publication presents nineteen information management best practices that we can all learn from. It’s a first but important step. It took us two years to get this book out, but we succeeded. I hope you will enjoy the reading as much as we enjoyed the writing.

Erik M. Hartman
Utrecht, Netherlands, 2010


Contents

07 Foreword
13 Acknowledgements
15 The Team
17 Introduction
19 Generic Principles of Information
23 Authors
33 Call for Contributions
35 Best Practices

35 Creating a Desired IM State - Understand, rationalize and take control of the state of information management in your organization. By: Bob Boiko & Mary Lee Kennedy

57 Building a Distributed Metadata Standards Organization - Technical and Organizational Lessons in Information Management from the Dublin Core Metadata Initiative. By: Michael Crandall

83 Building and Maintaining a Knowledge Transfer Toolkit - Transferring detailed, technical knowledge from one part of an organization to a non-technical audience. By: Keith De La Rue

93 Do Standards in Content Technologies Matter? - How to choose when and if to use standards. By: Apoorv Durga

103 Coordinating Enterprise Information Management Investments - Battling Organizational Entropy with Flexible Information Management Standards. By: Joe Gollner

131 Creating a Product Information Strategy for your Corporate Website - Managing Product Information in an International and Decentralized Web Environment. By: Volker Grünauer & Drazen Djukic

139 Enhancing Unity and Creativity With an Internal Social Media Platform - How Using Social Media Can Help Strengthen Your Organization. By: Peter Haan

147 Creating a Web Analytics Strategy for Your Organization - How to Build a Web Analytics Plan and Roadmap. By: Phil Kemelor

159 Managing Media Creation to Meet Business Requirements - Using Videos to Enhance a Software Product. By: Jim Larkin & Bill Birney

171 Integrating Taxonomy and Content Management - Dynamic Content Through Taxonomy. By: Stephanie Lemieux, Seth Earley & Charlie Gray

191 Revolutionizing Product Content - Using Social Content and Media to Revitalize Product Documentation. By: Kelley Myers

203 Creating Clear Governance - How to enable good decision making in complex organizations. By: Graham Oakes

221 Auditing Your Information Management Practices - How to Make Information Auditing Part of Your Repertoire for Strategic Information Management. By: Liz Orna & Jane Jennings

241 How to Successfully Procure Information Management Technology - And Not End Up With a Lemon. By: Alan Pelz-Sharpe

253 Streamlining Your Path to Metadata - Develop a Metadata Strategy in Eight Steps. By: Stacey Swart & Charlotte Robidoux

271 Managing Content - An Intelligent Content Strategy. By: Ann Rockley

281 Managing Information: From Reluctance to Acceptance - How to get users to use an information management system. By: Ganesh Vednere

297 Gathering Information Requirements - Using semi-structured ethnographic interviews to determine user information requirements. By: Martin White

307 Governing Data Management - Creating a Constitutional Democracy for Data Management. By: Bill Yock


Acknowledgements

This first volume of the TIMAF ‘Information Management Best Practices’ is the result of the mutual effort of subject experts from around the world. We are happy to have found people from various sub-disciplines ready to share their knowledge and experience for such a ‘holistic’ initiative as a framework for information management. This enthusiasm confirmed for us that we chose the right approach. Having already received some proposals for the second volume, we are confident the information management market will be provided with a steadily growing body of knowledge around information management best practices.

Twenty-six authors contributed to this first volume. They sent in a proposal and were quickly confronted with a critical reply from the editorial board, quite a strict template and challenging deadlines. We are very grateful for the efforts they made in balancing this time-consuming process with their already busy lives. We feel we became travelling companions on a quest we all believed in. Thank you for your confidence, your hard work and your endless patience: Mary Lee, Michael, Keith, Apoorv, Joe, Volker, Drazen, Peter, Phil, Jim, Bill, Stephanie, Seth, Charlie, Kelley, Graham, Liz, Jane, Alan, Charlotte, Stacey, Ann, Ganesh, Martin, and Bill.

Three people spent a lot of time getting the editing process done, and they cannot be praised enough for devoting so much energy to it. Thank you Jonah, Jenny and Elishema for doing a great job!

For the editing environment, Lon and Laurens were generous enough to let us use their XML editor Xopus and to support us with the installation and configuration. Thanks!

Without Nevel Karaali this first volume would never have been possible. Thank you for the book design and for helping get everything to the printer in perfect shape.

Last but not least, we would like to thank all TIMAF community members for their contributions, their inspiring enthusiasm, their critical comments, the great discussions and their willingness to share their time, knowledge and experiences with us. It was a long road with a lot to learn, but we are sure everybody involved is as happy with the outcome of this work as we are.

TIMAF Editorial Board

Bob Boiko & Erik M. Hartman


The Team

An enthusiastic group of editors from across the world worked hard on this TIMAF ‘Information Management Best Practices – Volume 1’ edition.

Editorial Board

The Editorial Board consists of two people who strongly believe in the goals of TIMAF and the value of building a strong and clear foundation for information management. They were responsible for preparing this book, commenting on the proposals, and coaching the authors.

BOB BOIKO (United States) is founder and president of Metatorial Services Inc., and is a faculty member of the University of Washington Information School. In 2004, Bob sparked the creation of CM Professionals, the first and only content management organization for practitioners. Recognized worldwide as a leader in the field of content management, he has almost 20 years of experience designing and building Web, hypertext and multimedia systems and tools for some of the world’s top technology corporations (including Microsoft, Motorola, and Boeing). Bob has sat on many advisory boards and is the recipient of many awards, including the 2005 EContent 100 Award for leadership in the content management industry. He is author of two editions of The Content Management Bible and Laughing at the CIO: Leading Information Management in the Information Age. Bob is internationally known for his lectures and workshops.

ERIK M. HARTMAN (the Netherlands) is owner of the independent consultancy firm Erik Hartman Communicatie, and founding member and former President of CM Professionals and the eGovernment Community. He is author of several content management publications (both paper and online) and a frequent speaker at international events. Erik is an AIIM certified trainer in Business Process Management (BPM) and Information Organization and Access (IOA). As an independent consultant in the field of information architecture, content management strategy and communications strategy, Erik is involved in several information management projects for the top 500 organizations in the Benelux. Erik lives on a houseboat and has his office in the historic centre of Utrecht (NL). His favorite role is being a coach and discovering with other practitioners how to find a better way of dealing with information challenges.


Copy-editors

The Copy-editors assisted the Authors and the Editorial Board with editing the proposals, the drafts and the final edits.

JONAH BULL (United States) is a student in the MLIS program at the University of Washington. His interests include: information architecture, metadata design, typography and web development.

JENNY COLLINS (United States) has more than a decade of technical writing and editing experience. She has worked on documentation for many different types of software, including content management, business process management, health information exchange, and speech recognition software.

ELISHEMA FISHMAN (United States) is a research assistant at the Institute of Translational Health Sciences at the University of Washington, where she conducts user research and assists with project management. Elishema graduated with a BA in English from UCLA in 2005 and is currently working on an MS in Information Management at the University of Washington’s Information School.

Publisher

This Best Practices edition is a TIMAF publication, published by Erik Hartman Communicatie (Netherlands).


Introduction

Information? Manage!

Information is the term we use to stand for all forms of preserved communication that organizations care to produce, store and distribute. If we communicate it and record it, it is information. So, for us, information is anything from sales figures in a database to a video on philosophy viewed on a mobile phone.

We define information management as the organized collection, storage and use of information for the benefit of an enterprise.

Our definitions are intentionally wide enough to cover content, document, asset, data, records and all the other ‘information managements’ that organizations do. We believe that while each of these “sub-disciplines” has its own tools and types of information, there is much more that unites them than divides them. Our definitions are intentionally quite practical. For us, information management simply means moving pieces of recorded communication from creation to consumption to retirement. Our definitions are crafted to carve out a niche for the information manager. Information managers make sure that recorded communication can be amassed and distributed in a way that benefits their organization. Finally, our definitions are crafted to be a simple guiding principle. Any person working on any information project can use this definition to remain focused on the ultimate aim of their particular kind of work.

Information Management? TIMAF!

The field of information management is currently fractured and incoherent. Each sub-discipline (content, document, asset, data and records management, to name just a few) has its own practitioners, applications and professional communities. We believe that behind the seeming differences between these ‘managements’ there is a deeper unity that will eventually define a strong and clear foundation for all of them.

We do not believe that all these managements will or should merge, but rather that, just as business underlies a variety of business practices including accounting and finance, there is a common foundation for the various forms of information management.

The Information Management Foundation (TIMAF) tries to provide this foundation by publishing these information management best practices. In addition, TIMAF develops and maintains an information management framework that brings the commonalities between sub-disciplines to light and helps to organize the best practices that we publish.


Best Start? Best Practice!

Just as business is practiced within a more specific context, information management is also practiced in context. Thus, we believe that the best way to illustrate the concepts and practices of information management is within the context of one or more sub-disciplines. So, this best practices book tries to show global principles of information management in the context of projects in one or more of the sub-disciplines.

This is the first volume of ‘Information Management Best Practices.’ In future publications we will provide an ongoing compilation of high-quality best practice guidance, written for and by experts in the information management field from around the world. These best practices are designed to help professionals overcome their information management challenges. They bring complex models down to earth, with practical guidance on tough problems.

In this volume, practitioners describe nineteen projects that you can learn from. In return, we ask that you let us learn from you! Please let us know what your experiences are with these or other projects at www.timaf.org.


Generic Principles of Information

Whether your project manages records, content, pages, assets or anything else, it manages information. To provide a context in which to view the projects described in this edition, we present this brief overview of our perspective on information and information management. It is not intended to be the last word on the subject. Rather, we intend it to begin a very long and productive conversation on the nature of what we do. With the generic principles below we can more easily discuss the commonalities between all the sub-disciplines and from there focus on the added value of an information management framework.

Information Is Communication

We begin with the idea that before and behind information there is communication. Our information management systems are simply our latest attempt to talk with each other. The channels and volumes have changed, but the basic desire to inform, educate, provoke and persuade by communicating is the same for us as it has been for humans across all time. The theme of communication is echoed again and again in these best practices:

• Joe Gollner (‘Coordinating Enterprise Information Management Investments’) makes it clear that communication is the only way to “overcome the inherent individualism (often bordering on isolationism)” within the organization.

• Peter Haan (‘Enhancing Unity and Creativity With an Internal Social Media Platform’) describes how his organization facilitated communication and the sharing of information to increase social cohesion and productivity.

• Alan Pelz-Sharpe (‘How to Successfully Procure Information Management Technology’) states that communicating, or “conducting different conversations at technical, commercial, and business process levels,” is the most important thing in procurement projects. Information management technology impacts a much broader array of processes and people within an organization than other types of technology, so communication is vital.

Information Has Value

The reason we care to amass and deliver information at all is because it is valuable. While it is notoriously difficult to measure the value of information, as our authors show, it is nonetheless critical to establish a basis for assessing the worth of the information you provide:

• Keith De La Rue (‘Building and Maintaining a Knowledge Transfer Toolkit’) shows that you must focus on the value of information if you want to establish knowledge transfer between groups from different backgrounds.

• Apoorv Durga (‘Do Standards in Content Technologies Matter?’) is our guide in the jungle of standards. He examines the value to be gained from using one standard over another.

• Phil Kemelor (‘Creating a Web Analytics Strategy for Your Organization’) shows that Web analytics is all about proving value. Phil describes how to set up an analytics structure that helps measure which kinds of information have value and which do not.

• Jim Larkin and Bill Birney (‘Managing Media Creation to Meet Business Requirements’) clearly articulate the relationship between media and business value.

• Graham Oakes (‘Creating Clear Governance’) states that well-defined governance allows people to focus their energy on the value of the decision, not on the decision-making process itself.


Information Has Audiences

Information, like communication, is a transaction. The organization, as we have said, must derive some value from providing the information. And, as our authors point out, the consumer (or audience) of the information must also derive value.

• Michael Crandall (‘Building a Distributed Metadata Standards Organization’) makes it clear how important it is to focus on the motivations of your stakeholders if you want your information strategy to be successful. No matter how relevant your goals, if your stakeholders are ignored your chance of success is low.

• Ganesh Vednere (‘Managing Information: From Reluctance to Acceptance’) describes this phenomenon as well. He offers a set of steps to prevent your information management system from going unused due to lack of user adoption.

• Kelley Myers (‘Revolutionizing Product Content’) shows how a focus on “the voice of the audience” can create a unique model for acquiring and publishing content.

• Liz Orna and Jane Jennings (‘Auditing Your Information Management Practices’) dive deeper into audiences. They show that the essential focus lies in how human beings really think, feel, and behave in relation to information in their organizations. They chart the effects of this relationship on the success of an organization in meeting its goals.

• Martin White (‘Gathering Information Requirements’) takes user requirements to a new level. He uses semi-structured ethnographic interviews to determine user information requirements. His ethnographic approach focuses not on what users say, but on how they behave.

Information Has a Lifecycle

Whether your information comes in records, files, documents, or any other form, it goes (or, more to the point, should go) through a well-defined and targeted series of phases between creation and eventual retirement. Our authors spend considerable time elucidating this lifecycle.

• Ann Rockley (‘Managing Content - An Intelligent Content Strategy’) shows the importance of focusing not only on the creation of content, but also on the tagging, conversion, publication and other processes that support the lifecycle of information.

• Drazen Djukic and Volker Grünauer (‘Creating a Product Information Strategy for your Corporate Website’) show how they created lifecycles for their product information that vary from country to country.

• Bill Yock (‘Governing Data Management’) introduces a unique data management approach that democratizes information management throughout its lifecycle. His article shows how to create a repeatable and sustainable decision-making process.

Information Has Structure

We are used to thinking of database records as highly structured information. But in fact, all forms of information have (or, again, should have) their own well-controlled structure. And, as our authors point out, defining the structure of your information is a key part of any information management project.

• Bob Boiko and Mary Lee Kennedy (‘Creating a Desired IM State’) show that structuring information is the key to getting ahead of information projects and finding a place at the information technology table.

• Stephanie Lemieux, Seth Earley and Charlie Gray (‘Integrating Taxonomy and Content Management’) show that a taxonomy is not only a way to organize information but also a foundation for increasing levels of content management functionality and maturity in an organization.

• Stacey Swart and Charlotte Robidoux (‘Streamlining Your Path to Metadata’) provide a successful approach to metadata strategy development, which can help an organization untangle its information.

Humans will always record their communications with each other to inform, provoke, and persuade. Today’s powerful and ubiquitous technologies make communication more complex but also much more powerful. But to obtain the value of recorded communication you must be clear and disciplined about its value to the providers and consumers. You must create information transactions that bring value to both sides. You must also define a consistent and enforceable information structure and lifecycle process that assures a steady supply of valuable information to all your audiences. This, in our view, is information management. And this is what the best practices in this book will help you achieve.


Authors

The following authors each wrote a best practice on information management for this Volume 1 edition.

Birney, Bill

Bill Birney (United States) is a technical writer, audio engineer, and audio and video producer. Before joining Resources Online, he spent five years on the Windows Media team at Microsoft Corporation, and has written, produced, and done post-production on numerous projects throughout his career. He currently leads video production at Resources Online.

Boiko, Bob

Bob Boiko (United States) is founder and president of Metatorial Services Inc., and is a faculty member of the University of Washington Information School. In 2004, Bob sparked the creation of CM Professionals, the first and only content management organization for practitioners.

Recognized worldwide as a leader in the field of content management, he has almost 20 years of experience designing and building Web, hypertext and multimedia systems and tools for some of the world’s top technology corporations (including Microsoft, Motorola, and Boeing). Bob has sat on many advisory boards and is the recipient of many awards, including the 2005 EContent 100 Award for leadership in the content management industry. He is author of two editions of The Content Management Bible and Laughing at the CIO: Leading Information Management in the Information Age. Bob is internationally known for his lectures and workshops.

Bob is an extremely skilled analyst, facilitator, teacher, designer, and architect and has extensive expertise in content and knowledge management, authoring, multimedia design, Web publishing and tool construction. He has undergraduate degrees in physics and oceanography and a graduate degree in human communication.

University of Washington’s Information School is a group dedicated to helping shape the emerging electronic information discipline. At the iSchool, Bob has trained hundreds of information professionals and designed programs in content management. Metadata, business analysis, information initiative planning, information architecture, information system design, and public access to information are some of the skills Bob brings to the table.

Before Metatorial Services, Bob co-founded Chase Bobko Inc. and built it into a leading content management service provider. Within Chase Bobko, Bob served as President, Head of Development, Head of Sales and Marketing, and Head of Operations. Bob and his two partners grew Chase Bobko to a staff of over 100.

Crandall, Michael

Michael Crandall (United States) is Chair of the Master of Science in Information Management program and a Senior Lecturer in the Information School of the University of Washington. Prior to joining the Information School in January of 2005, he spent four years as Technology Manager for the Bill & Melinda Gates Foundation’s Libraries and Public Access to Information Program, with responsibility for software development, technical support and network deployment for over 40,000 public access computers in more than 11,000 libraries across the United States. As part of this project, Michael also initiated and managed the program grant for development of WebJunction, an international public access computing portal.

Before this, Michael was Manager of the Knowledge Architecture Group in Microsoft Information Services, responsible for design and management of the technical infrastructure and search services for Microsoft’s intranet portal site, MSWeb (2.2 million requests and 31,000 users per month), and for the design, development, and management of a corporate taxonomy project to support content management and retrieval throughout the company. Prior to Microsoft, he worked at the Boeing Company on multiple projects related to information management and information architecture, including an internal real-time newsfeed, the intranet search engine and portal subject access tools, and the company library’s Website. He was a member of the Boeing Information Management Standards Board, the Web Advisory Board, the Knowledge Management team, and the Structured Information Objects (metadata schemas) Technical Working Group.

Michael is a frequent speaker at industry and academic conferences (including KMWorld, Intranets, Internet World Forum, Asian Digital Libraries Conference, European Business Information Conference), and has participated in roundtables and symposia sponsored by the Montague Institute, Urban Libraries Council, Intel, and TFPL. He has provided consulting services for the National Agricultural Library, Harvard Business School, the Council on Library and Information Resources, Caterpillar Corporation, CDM, and others. He is a founding member of the Dublin Core Metadata Initiative Board of Trustees, a member of the Washington State Access to Technology Justice Strategy Board, and active in the American Society for Information Science and Technology and other professional organizations.

De La Rue, Keith

Keith De La Rue (Australia) works as an independent consultant in knowledge management, communication and learning at AcKnowledge Consulting in Melbourne, Australia. He previously spent eight years working in Knowledge Management at Telstra, Australia, where he led a team engaged in the transfer of product and service knowledge to Telstra’s enterprise and business sales force.

Djukic, Drazen

Drazen Djukic (Austria) is responsible for the content management strategy and development of Wienerberger’s websites. He managed the roll-out of the company’s websites and online product catalogs to more than 25 countries and participated in building up a corporate intranet. Drazen holds a Bachelor’s degree in Information and Communication Systems and Services from Austria’s Technical University for Applied Sciences.

Durga, Apoorv

Apoorv Durga (India) is an analyst with The Real Story Group, where he covers Portals and Web Content Management. Before joining this company, Apoorv was a Practice Head for the Portals and Content Management Practice of Wipro Technologies, where he helped clients with their web strategies. Apoorv has a Bachelor’s degree in Engineering from the Institute of Technology, Varanasi (India) and a Master’s degree in Management from the Indian Institute of Management, Calcutta (India). Apoorv is based in New Delhi, India.

Earley, Seth

Seth Earley (United States) got his start in consulting in graduate school, working for the SBA-sponsored Small Business Development Center. He continued to work with a variety of businesses in the technology, real estate development and hospitality sectors. He founded Earley & Associates, an information management consulting firm, in 1994. Some of his initial clients included analyst and consulting organizations for whom he developed some of the first document and content management applications based on Lotus technology. He performed research projects on success factors for knowledge management and collaboration tools, which included surveys of Fortune 500 companies embarking on early forays into content technologies that preceded widespread intranet and web technology adoption.

His company performed work for the IBM Office of the CIO after the company acquired Lotus, helping to analyze their global deployments of Notes technology. Seth was also author of courses and workshops on knowledge management that applied KM principles to marketing, selling, architecting and delivering KM projects using IBM’s array of technologies, including instant messaging and awareness-based tools, workflow systems, document/content management applications, collaborative workspaces, portal tools and search systems. He personally delivered these workshops to IBM business partners and customers throughout Asia-Pacific, Latin America, North America and Europe.

Recognizing that taxonomy and metadata were a unifying principle throughout the various projects he worked on, and sensing a need in the marketplace for deep expertise in this area, Seth began specializing in the field, carving out a niche for his organization as a world-renowned expert. Since then he has led and delivered projects in content strategy, metadata design and taxonomy integration for IBM, Textron, Aetna, AIG, Parametric Technologies, Progress Software, State Street Bank, Motorola, Hartford Insurance and many others. He has also founded communities of experts in KM, taxonomy, search, and most recently in SharePoint technologies.

Gollner, Joe

Joe Gollner (Canada) is the Director of Gnostyx Research (www.gnostyx.com), an initiative focused on advancing open content standards and on leveraging the intelligent content technologies that these standards have made possible. In order to help organizations make the most of these important trends, Gnostyx Research evaluates emergent standards and technologies, performs program reviews, authors case studies, prepares business cases and plans, conducts educational programs, develops enterprise content strategies, prototypes possible technology solutions, oversees acquisition efforts, and guides implementation investments. Connecting these research activities has been the growing recognition that open content standards offer organizations the opportunity to address long-standing challenges in how technology investments are managed.

In addition to his activities with Gnostyx, Joe is also the Chief Solutions Architect (Intelligent Content Technologies) for Stilo International (www.stilo.com), where he works with leading organizations around the world on the design, development and deployment of content management and publishing systems that demand high levels of precision and performance. Previously, he was the president of XIA Systems Corporation, a highly profitable and widely respected XML solution integrator that he founded in 1998 and sold to Stilo International in 2004. As can probably be guessed, he was the independent consultant engaged by the DND CALS Office in 1994, serving as a core team member of the CALS initiative with specific responsibility for overseeing the development, deployment and management of the information management standard and the shared enterprise solution.

The lion’s share of Joe’s career has been dedicated to helping large-scale organizations manage and publish content more effectively. During the last twenty years, he has guided the implementation of dozens of content management and publishing solutions in such industries as aerospace, defense, education, healthcare, transportation, engineering, energy, legal, publishing, software, telecommunications and government.

A former Artillery Officer in the Canadian military, Joe was educated in a variety of subjects at Queen’s University (Bachelor of Arts, Mathematics and Literature) and the University of Oxford (Master of Philosophy), and he has also completed graduate programs in project management, business analysis and knowledge management. He is a popular speaker at industry events that focus on content management and publishing, and for several years he chaired the XML World series of conferences.

Gray, Charlie

Charlie Gray (United States) has 24 years of industry experience in everything from robotic systems to sales and marketing. Charlie has a BSEET from DeVry and an MBA from Lake Forest Graduate School of Management. His career with Motorola spans over 15 years, and over the last 6 years Charlie has focused on content management strategy, successfully deploying a single content management system that is used today across all business units within Motorola for all their public websites.

Currently, Charlie is managing the day-to-day operations of www.motorola.com in roughly 45 countries and 25 languages, along with the content management strategy for Motorola. Charlie’s next major focus within this role is to drive the web towards a mobile experience that includes personal and relevant content for the user.

Grünauer, Volker

Volker Grünauer (Austria) is responsible for the marketing strategy of all 37 websites in 25 countries at Wienerberger, the world’s largest manufacturer of bricks, clay roof tiles and clay pavers. In this function he also developed a new brand and domain management strategy. Together with the IT department he managed the rollout of the CMS into new Wienerberger markets.

Previously, Volker worked for several years as art director at elements.at, a web-design firm located in Salzburg, where he created interactive experiences for customers in different business sectors. In this position he was also responsible for Search Engine Marketing and Search Engine Optimization for some large-scale websites.

Volker holds a Master’s degree in E-Business from the University of Applied Sciences of Upper Austria. In his spare time Volker works as a skiing instructor in the mountains of Salzburg.

Haan, Peter

Peter Haan (Netherlands) is Communications Manager Online Media for KPN and Getronics at KPN’s Corporate Communications. His main focus is on the application of new and online communication tools and techniques for both internal and external communications. Peter developed the concept for the social media platform at KPN and was responsible for the realization and implementation of this platform, including communications and governance. He previously held various communications positions within the Ministry of Transport, Public Works and Water Management.

Jennings, Jane

Jane Jennings (United Kingdom) started work for Essex County Council in 1966 as a library assistant. With an ambition to qualify as a librarian, she set out to acquire the qualifications required to gain a place at university. In 1979 she began a degree course in History at Essex University and graduated in 1982. She took her postgraduate diploma in librarianship at the Polytechnic of North London in 1984/85, and gained chartered status in 1987. Her most recent foray into the world of qualifications has been to gain an MSc in Information and Knowledge Management from London Metropolitan University (2004).

Following the first information audit, Jane has gone on to conduct three other significant audits for Essex County Council. She has also used the approach in other information management activities which require finding out how information is used and managed. Jane is currently secretary for the Network of Information and Knowledge Exchange (NetIKX), a national group of people who are interested in putting information and knowledge management into practice, and seeing their organisations benefit from the results.

Kemelor, Phil

Phil Kemelor (United States) is Vice President, Strategic Analytics for Semphonic, and he manages the company’s government and non-profit practice. In his role with Semphonic, Phil leads web analytics, strategic planning and governance projects, and web metrics development and vendor selection engagements for organizations such as the National Cancer Institute, National Heart Lung and Blood Institute, Consumers Union, AARP and National Geographic. Through this project work, Phil has refined Semphonic’s strategic planning framework to encompass analysis of organizations’ relationships with web analytics based on management commitment, communication, organization, metrics, process and solution assessment.

Phil also serves as the lead analyst for The Real Story Group Web Analytics Report, an annual evaluation of web analytics software vendors. Throughout Phil’s 13-year career in web analytics he developed one of the world’s first web analytics program offices for Bell Atlantic (currently Verizon), composed The Executive’s Guide to Web Site Measurement and Testing, and served as a Principal Consultant for NetGenesis, a first-generation web analytics software company.

Larkin, Jim

Jim Larkin (United States) is founder and CEO of Resources Online, a Seattle-based content and production company established in 1995. As CEO, Larkin maintains strategic direction over all projects the company undertakes. The company currently employs 45 people, and engages in content architecture and execution projects, content management consulting, and content creation that includes thousands of instructional and marketing videos, books, commercial products, computer applications, web sites, interactive titles, documents, and other deliverables.

Prior to founding the company, he worked as a technical writer and editor. In the 1980s Jim worked on the research staff at the Institute for Research and Information in Scholarship, Brown University, which, among other activities, built and deployed an early forerunner of the World Wide Web.

Lemieux, Stephanie

Stephanie Lemieux (Canada) is passionate about taxonomy - she helps organizations get the most out of their content by making it findable and usable. A graduate of the Masters in Library and Information Studies (MLIS) program at McGill University, Stephanie was one of the first through their knowledge management (KM) specialization. She started her career as a KM consultant working for an electric utility company, helping with various programs aimed at capturing and managing knowledge from a large population of specialized workers facing retirement. In this role, she helped implement and explore the use of social software for knowledge sharing such as wikis and blogs, and developed information architecture and taxonomy for specialized intranet and portal sites.

After a brief stint as a freelance consultant working for a technology firm specializing in entity extraction and auto-categorization, Stephanie joined Earley & Associates to focus on her passion: taxonomies and content management. Among her first clients was Motorola, with whom she had the pleasure of working for over 3 years, developing and implementing a global taxonomy and governance program (partly detailed in this book). She has since worked with many organizations, including American Greetings, AstraZeneca and the Ford Foundation, helping to develop and integrate taxonomies and governance processes and to improve search tools. Stephanie has recently worked more heavily with SharePoint information architecture and taxonomy for digital asset management.

Over the years, Stephanie has had the opportunity to speak at conferences worldwide on a broad range of topics, from search strategy to taxonomy/folksonomy hybrids. She has specialized in taxonomy for faceted search/guided navigation. She continues to help manage the Taxonomy Community of Practice.

Currently, Stephanie is the Director of Search & Content at the Yellow Pages Group in Canada. She and her team are responsible for ensuring a relevant and intuitive search experience on www.yellowpages.ca, accomplished through many avenues such as search engine tuning, taxonomy optimization, SEO and content management. As a second job, Stephanie is learning how to be a step-mom to two wonderful girls, and in her mythical spare time enjoys crafts from days past such as knitting and making jam.

Myers, Kelley

Kelley Myers (United States) has worked over the past ten years to acquire content through creative channels without spending a dime. She began her career as a writer and associate publisher for a small magazine in Northern Colorado. In 1999 she switched from print media to online content development when she launched a web-based journal dedicated to military and defense issues in modeling and simulation. Her experiences in Washington, D.C. helped shape a thrifty approach to acquiring content for business purposes, often with no budget and limited licensing options. Over the course of six years, Kelley continued to hone her online publishing practices using a limited budget, spotty technical support, and sheer determination. In 2005 she began working for Microsoft Corporation as an online content manager for the Work Essentials Website, formerly part of Office Online. In this role, Kelley acquired content that provided occupation-specific advice and tips, written by industry experts. Kelley measured the impact of published content, and she implemented methods to improve content and streamline processes.

Today Kelley manages content development and online publishing for the Microsoft Advertising Website. In her current role, Kelley manages the strategy for how to aggregate a wide variety of content, integrate social media elements into web sites, and manage daily operations for online publishing. Kelley lives in Bellevue, WA with her two children, her husband, and an aging overweight Weimaraner named Zoe.


Kennedy, Mary Lee

Mary Lee Kennedy (United States) is Executive Director of Knowledge and Library Services. The group is responsible for enabling the exchange of ideas, expertise and information in support of research and teaching at Harvard Business School. She oversees four practice areas: Research and Course Support, Information Management, Knowledge Dissemination, and Content Sourcing. Prior to Harvard, Mary Lee led global knowledge management teams at Microsoft Corp. and Digital Equipment.

Oakes, Graham

Graham Oakes (United Kingdom) helps people untangle complex technology, relationships, processes and governance. As an independent consultant, he helps organizations such as Sony Computer Entertainment, The Open University, the Council of Europe and the Port of Dover to define strategy, initiate projects, and hence run those projects effectively. Prior to going independent, Graham held positions including Director of Technology at Sapient Limited, and Head of Project Management for Psygnosis Ltd (a subsidiary of Sony). His book Project Reviews, Assurance and Governance was published in October 2008.

Orna, Liz

Liz Orna (United Kingdom) began her career in technical-book and magazine journalism; she went on to set up and manage information services for organizations, and to edit their publications. Studying information science at University College London with B.C. Brookes in the early 1970s brought all these strands together and set the course of her future work.

She became an independent information consultant, writer, and lecturer in 1979. An assignment in 1981 gave her the opportunity of developing an information policy for an organization, and she became identified with the subject, and with information auditing, in the 1990s through Practical Information Policies (1990 and 1999) and Information Strategy in Practice (2004). Collaboration from the 1960s with information designer Graham Stevens led to her doctoral research on how organizations create their information products, and to Making Knowledge Visible (2005). The experience of working with first-time researchers, as a visiting teacher in the UK, Australia, and the US, resulted in Managing Information for Research (2009).

Pelz-Sharpe, Alan

Alan Pelz-Sharpe (United States) is a Principal Analyst and Partner at Real Story Group, an independent industry advisory firm that specializes in information management technology. A 20-year veteran of the information management industry, Alan has written extensively on document, web, and records management topics and delivered keynote presentations at major industry events around the world.

He began his information management career in 1989 working for Shell, controlling engineering documentation on an offshore oil project. After a number of years working in the Oil and Gas industry, latterly as an independent consultant focused on document and records management for clients such as Shell, BP, Saga, and Halliburton, he moved into more general IT business consulting, before joining European research firm Ovum in 1999.

While there, he led Ovum’s research into Integrated Document Management and Workflow, and published Ovum Evaluates Integrated Document Management in late 1999, Web Content Management: Strategies, Technologies and Markets in 2000, and E-process Workflow for the E-business in 2001.

In 2002 he moved to Boston to become Vice President of Ovum’s North American operations, overseeing all Ovum’s research and consulting activities in the region. In 2004 he moved to Wipro, the Indian outsourcing firm, to lead their global ECM Consulting practice, working primarily with Fortune 1000 companies. In 2006, he joined Real Story Group as a Principal and Partner of the firm.

Robidoux, Charlotte

Charlotte Robidoux (United States) is a Content Strategy Manager at Hewlett-Packard Company (HP) and has over 17 years of experience in technical communication. At HP, she oversees the single sourcing strategy and implementation for the StorageWorks Division. Charlotte holds a Ph.D. in rhetoric and technical communication from the Catholic University of America. She is the author of Rhetorically Structured Content: Developing a Collaborative Single-Sourcing Curriculum, published in Technical Communication Quarterly. She is co-editor of Collaborative Writing in Virtual Workplaces: Computer-Mediated Communication Technologies and Tools.

Rockley, Ann

Ann Rockley (Canada) is President of The Rockley Group, Inc. She has an international reputation for developing intelligent content management strategies and their underlying information architecture. Rockley is a frequent contributor to trade and industry publications, and a keynote speaker at numerous conferences in North America and Europe. She has been referred to as a “luminary” in the content management industry.

Ann has been instrumental in establishing the field in online documentation, single sourcing (content reuse), unified content strategies, and content management best practices. Rockley led Content Management Professionals (a member organization that fosters the sharing of content management information, practices, and strategies) to a prestigious eContent 100 award in 2005. She currently co-chairs the OASIS DITA for Enterprise Business Documents subcommittee.

Rockley is a Fellow of the Society for Technical Communication and has a Master of Information Science from the University of Toronto. She is the primary author of the new book DITA 101: Fundamentals of DITA for Authors and Managers and of the best-selling book Managing Enterprise Content: A Unified Content Strategy.

Swart, Stacey

Stacey Swart (United States) is the Content Management System Administrator and Strategist at the StorageWorks Division of Hewlett-Packard Company (HP). She has over 16 years of experience in the tech industry in areas ranging from technical support to technical communication, and is certified by HP as a Lean Sigma Green Belt. Stacey holds a B.S. in Education and English from the University of Kansas.

Vednere, Ganesh

Ganesh Vednere (United States) has been working in the content and records management space for a number of years. He has been involved in all aspects of content and records management programs, including strategy, policy, compliance, records retention research, and technology implementation.

Ganesh is an active participant within the ECM/ERM communities, and is a member of several industry committees including AIIM Emerging Technologies Advisory Group and other records management working groups.

He serves on the board of AIIM Garden State Chapter, and is a certified electronic records management master from AIIM International.


Ganesh has a Bachelor’s degree in Computer Science.

White, Martin

Martin White (United Kingdom) is Managing Director of Intranet Focus Ltd. and consults on the design and management of intranets, the development of enterprise information management strategies, and the implementation of content management and enterprise search software. He has been a Visiting Professor at the Department of Information Studies, University of Sheffield since 2002 and is also a member of the Publications Board of the Royal Society of Chemistry. Martin is a Fellow of the Royal Society of Chemistry and an Honorary Fellow of the Chartered Institute of Library and Information Professionals. Martin is the author of the Eureka column on enterprise search issues for EContent magazine, for which he is a Contributing Editor. He was Chairman of the Online Information Conference from 1999-2006 and is the author of The Content Management Handbook and Making Search Work, and (with Stephen Arnold) Successful Enterprise Search Management. He has worked for clients in the USA, UK, Europe and the Middle East.

Yock, Bill
Bill Yock (United States) is an Associate Vice Provost in the Office of Information Management at the University of Washington. He leads the Enterprise Information Services division, which provides services for information intelligence, delivery, design, integration, and definitions. Bill is a member of the UW Data Management Committee and the Privacy Assurance and Systems Security council. He is currently the chair of the Kuali Foundation’s Technology Roadmap Committee (www.kuali.org) and a board member of the Kuali Rice Project.


Call for Contributions

Send in a Best Practice
Have you experienced the merits of a specific approach to information management strategy, methodology, technology or tools? If so, here is an opportunity to share your knowledge with a global audience in ‘Information Management Best Practices’.

To contribute to the book, simply submit a proposal, according to the requirements listed in the ‘Call for Best Practices’ at www.timaf.org.

We will accept proposals on an ongoing basis. Email your proposal to [email protected].

Sponsor the TIMAF Best Practices
TIMAF and its publications are a volunteer-based initiative. Your financial support is more than welcome. As a sponsor, your name and company logo will be published on the TIMAF website and on the back cover of the ‘Information Management Best Practices’ edition. All sponsors receive a free copy of the paper edition and free access to the online edition.

For more details about sponsoring, please visit www.timaf.org/sponsoring or send an email to [email protected].

Join the TIMAF Community
Interested in sharing your ideas and experiences online with the TIMAF Community? Visit www.timaf.org and join the discourse.

Buy Our Publications
You can now order publications from the TIMAF Library at www.timaf.org/library. With every book you buy, you support the TIMAF initiative.



Understand, rationalize and take control of the state of information management in your organization

Bob Boiko and Mary Lee Kennedy

BOB BOIKO ([email protected]) is founder and president of Metatorial Services Inc., and is a faculty member of the University of Washington Information School. In 2004, Bob sparked the creation of CM Professionals, the first and only content management organization for practitioners. Recognized worldwide as a leader in the field of content management, he has almost 20 years of experience designing and building Web, hypertext and multimedia systems and tools for some of the world’s top technology corporations (including Microsoft, Motorola, and Boeing). Bob has sat on many advisory boards and is the recipient of many awards, including the 2005 EContent 100 Award for leadership in the content management industry. He is author of two editions of “The Content Management Bible” and of “Laughing at the CIO: Leading Information Management in the Information Age”. Bob is internationally known for his lectures and workshops.

MARY LEE KENNEDY ([email protected]) is Executive Director of Knowledge and Library Services. The group is responsible for enabling the exchange of ideas, expertise and information in support of research and teaching at Harvard Business School. She oversees four practice areas: Research and Course Support, Information Management, Knowledge Dissemination, and Content Sourcing. Prior to Harvard, Mary Lee led global knowledge management teams at Microsoft Corp. and Digital Equipment.

Creating a Desired IM State


Abstract


The Knowledge and Library Services (KLS) team was tasked with overseeing information at Harvard Business School (HBS). Specifically, KLS needed to build an information management (IM) practice that could integrate scholarly assets (information produced by faculty) from across HBS, so that they could be optimally available to a variety of users for a variety of uses. For the foreseeable future, the assets will remain in separate management systems. However, KLS must unify access to assets, create efficiency gains and increase access by important users.

To meet this challenge, a current IM state analysis was conducted, followed by a desired state analysis of scholarly asset management at HBS. The current state analysis included: 1) an assessment of information creators and consumers with their business and information needs; 2) current tagging schemas and storage locations for information; 3) outputs, which are publications and other uses of the information.

The analysis provided our team with a definitive source of knowledge about how IM was practiced at HBS. We used the current state as a springboard to a desired state where important information types are well structured, well created, well managed and well delivered. The desired state included a full model of the information types and organizational structures required to meet user needs as well as the management infrastructure needed to implement the model.

We used the current/desired state analysis as the basis of our IM practice in these ways: 1) it provided a solid, agreed-to information and business value model behind IM at HBS; 2) it formed the basis of how we interact with IT projects; and 3) it allowed us to specify a set of specific and achievable recommendations to improve IM practices at a pace that worked for the organization.


Background

The project described here was part of a wider project that determined the governance and technology infrastructure for HBS information management. In addition, the project was done in the context of the following guidelines and constraints:

Asset management approach. Because the items we are managing are mostly files, we chose to treat information items as an asset management system would. In other words, we control metadata but not item content. Our items are a mix of media files (video, sound and image), database records, HTML pages, paper and office files (word processing, PDF, spreadsheets and so on). Our metadata defines and organizes items but does not contain them.

Informational, not transactional, use. Often IM projects will draw a division between transactional and informational data. They will say either “We are a transactional system” or “We are an informational system.” In our system, we instead distinguish between informational and transactional data use. In other words, it does not matter what the purpose of the data was when it was collected; we are interested in any information that can be discovered and used (non-transactionally) by our most important constituents.

We used an XML Schema and instance to define and display the current and desired state of IM at HBS.

The layouts and about elements are not part of the model. The other elements will be described in more detail below. We created XML transforms to display the information we gathered on a web site we titled the Harvard Business School Information Management Framework.

Figure 1: The top level elements that show our model of the current and desired state of IM at HBS
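To make this concrete, the sketch below shows what a top-level framework instance along these lines might look like. All element names here are illustrative assumptions; the actual HBS schema is the one shown in Figure 1.

    <!-- Hypothetical sketch of an IM framework instance; element
         names are assumptions, not the actual HBS schema. -->
    <imFramework>
      <repositories/>       <!-- the systems that hold information -->
      <users/>              <!-- the people who create and consume it -->
      <outputs/>            <!-- the publications and sites it feeds -->
      <informationTypes/>   <!-- the kinds of information managed -->
      <accessStructures/>   <!-- the schemes used to find it -->
      <interactions/>       <!-- who creates or consumes what, and why -->
      <rules/>              <!-- the governance that applies -->
    </imFramework>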


The global navigation of the web site reflects the major elements of our model. The pages of the site were designed both for the architects to use as a reference and for the stakeholders to use to understand their roles in IM. The sections of the site are detailed below.

The Skills Needed to Perform This Best Practice

To complete the procedure detailed in this work, you will need staff who possess the following skills:

• Data modeler: Someone with strong relational database skills will need to lead the effort to collect, understand and work with the database schemas and other inputs that form the information landscape.
• Information modeler: Someone will have to lead the construction of the desired state model. This person can be the same as the data modeler, but this is not just a data modeling task. In addition to relational database or XML modeling skills, the information modeler must understand user needs and be able to translate them into a logical design. There are more details on this subject later in the article.
• Business/user analyst: Someone will have to collect and analyze user input, including developing use cases and confirming business requirements.
• Business liaison: Someone will need to work with the repository owners and operators to negotiate agreements, communicate the benefits and risks to them and others for cross-organizational work, and manage relationships with them as individuals and as a cross-organizational team.

Figure 2: The web site produced by the XML system shows the current and desired IM state at HBS


Step 1: Construct a Current IM State

Our first major task was to build a solid understanding and visualization of the current state of scholarly information management at HBS. Without an ability to see and understand the current state, we would not have been able to construct a realistic picture of the state we desired. In addition, no other group had a comprehensive view of how information was originated and distributed. With such a view, our group would be in a position to have well-informed and credible opinions on the current systems.

By the end of the current state analysis, we had a deep understanding of the way scholarly information was originated, stored and used at HBS. This understanding was fundamentally different from the understanding our Information Technology Team peers had. IT pursues a system-centered approach, where the focus is on the hardware and software needed to house and deliver information. We pursue an information-centered approach, where the focus is on the creation and use of information. The two approaches are of course highly complementary. With the development of our information-focused approach, we were able to bring new expertise and advice to the table that was previously missing. In addition, we could provide a high-level comprehensive view of information management that highlights issues for groups as diverse as marketing and our office of research.

Model Repositories

We began our process with the notion of a repository. A repository is a self-contained information system with limited ability to share information with other systems. In essence, our job was to find the repositories that were most important to scholarly information and integrate them into wider systems that unify their data. Operationally, a repository has:

• Sources of information. People can author information directly into the repository and/or the repository can acquire information from existing sources.
• Storage. Information is stored in the repository as any combination of text files, binary files and relational or hierarchical database records.
• Users. The system serves a set of users who depend on its outputs.
• Outputs. The system can provide information to users through a variety of channels, including web pages, printed pages, client forms and email.

Information exists in many places at HBS. It was neither realistic nor useful for us to chart all of those places. Instead, we focused on a smaller set of obviously significant systems. Within that list, we used these criteria to choose the repositories we needed to pay attention to:

• The importance of the information contained in the repository.
• The importance of the users of the information.
• The degree to which we could impact the repository information management practices and technological infrastructure. Repositories that will soon be revised and where we can be part of the revision process are higher priority.
• The amount of effort (time, money, skills) to acquire and process information from the repository. Repositories with easy connectivity and a minimum of data cleanup and additional tagging are higher priority, as they are early wins.

Once we had a list of important repositories we collected enough information about them to understand their current use and the possibilities for integration.


We gathered these kinds of information about each repository:

• A unique ID and title for the repository.
• The managers (if any) who oversee the collection and management of information in the repository.
• Where the repository is located (at HBS, at Harvard or outside Harvard).
• A description for the repository including:
  - Data schemas in the form of relational database diagrams or XML Schemas,
  - Directory structures for the major file storage of the repository,
  - The locations of the files where item content is stored,
  - How information is added to the system.
• Acquisition sources for the repository. These are the places (i.e., locations and other applications) where repositories get information in bulk. For each source we collected:
  - A name for the source,
  - The types of information they supply (and the approximate rate at which they supply it),
  - The metadata standards they follow,
  - The way information is transferred from the source to the repository and the owner of the source.
• The table and field structure of the repository and how it maps to the information types we designed.

In addition, we related repositories to other parts of our IM model including:

• The information types stored in the repository.
• The metadata stored for each information type.
• The users who contribute to and consume from the repository.
• Outputs (publications, reports, decisions and so on) created or augmented by the repository.

Figure 3: Schema showing the information we collected for repositories
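As an illustration, a single repository entry in such a framework might look like the sketch below. The element names and the example repository and source are hypothetical assumptions; the actual elements are those shown in Figure 3.

    <!-- Hypothetical repository entry; names and values are illustrative. -->
    <repository id="repo-faculty-db">
      <title>Faculty Database</title>
      <manager>Office of Faculty Administration</manager>
      <location>HBS</location>
      <description>Relational database of faculty records</description>
      <acquisitionSource owner="Central IT">
        <name>HR data feed</name>
        <suppliedTypes rate="weekly">Personal Profile</suppliedTypes>
        <metadataStandard>internal</metadataStandard>
        <transferMethod>batch export</transferMethod>
      </acquisitionSource>
      <storesType ref="personal-profile"/>
    </repository>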


Model People

To understand the current state of IM, we needed to first understand the people who interact with our information. In a series of interviews we collected information about the key users and uses of scholarly information. These interviews provided the following results:

• A set of use cases that specified the key information-related activities of users. For HBS the users were individuals who used scholarly information to make decisions or create publications that others consumed.
• Lists of the kinds of information created, consumed and repurposed by users.
• Problems with the current state and needs that the current state could not meet.
• Desires for new information types and functionality.
• Identification of key entities in the current state (authors, users and outputs, for example).

We used the analyses to identify these needs and information interactions:

• User Needs: At this stage, in the current state analysis, we simply noted and logged the needs that our users expressed. Later, in the desired state, we categorized them as either business needs or information needs and linked them to the information and organizational structures that could meet those needs.
• Information Creation: At HBS people create (that is, key in or otherwise originate) information into repository databases. They may also create information directly into publications. We discovered and logged both types of interactions.
• Information Consumption: At HBS people consume information from repository databases or from publications. We discovered and logged both types of interactions.

Figure 4: A web page showing detail of one repository


In the later desired state analysis we augmented this model to link consumers to the information types they consume and the needs that consumption meets. The following figure shows a sample of the information creation profile of our “Analyst-Evaluator” role.

Like most organizations, people at HBS naturally divide themselves by their job title and department. However, for the purpose of defining desired IM practices, people also divide into wider classes based on their interactions with information. Thus, while we captured job title and department for our users, we also categorized them on a more abstract level of “role.” The roles allowed us to group user types by the way they interact with information at HBS. For example, we defined an Agent role for users who work on behalf of a faculty or staff member to enter data, search for information and perform other administrative tasks. User types from across organizational divisions play the Agent role.

The role model included:

• An ID, title and description of the role.
• An image file containing a sample face shot for use in presenting the role.
• The list of staff positions that are encompassed by the role. One position type (Admin), for example, could be part of multiple roles.
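A role entry might therefore be recorded as in the sketch below. The Agent role and the Admin position come from the text above; the element names and the second position are illustrative assumptions.

    <!-- Hypothetical role entry; names are illustrative. -->
    <role id="role-agent">
      <title>Agent</title>
      <description>Works on behalf of a faculty or staff member to
        enter data, search for information and perform other
        administrative tasks.</description>
      <image src="agent.jpg"/>
      <staffPositions>
        <position>Admin</position>
        <position>Faculty Assistant</position> <!-- assumed example -->
      </staffPositions>
    </role>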

Roles helped our architects group users based on their needs and interactions. However, they were abstract and not widely known. People outside the team only think in terms of staff positions. In addition, not all the staff positions within a role have the exact same interactions.

Figure 5: Partial detail of one user type


Thus, we linked user types, not roles, to the other entities in our model. Still, the roles provided us with a clear way to group users functionally rather than by job position.

Model Outputs

Outputs are the web sites, reports and other publications, as well as the activities and decisions, that are supported by HBS information systems. For each output type we collected this basic information:

• An ID, name, brief description and optional URL and screen shot for the output.
• The type of the output (web site, email, report, etc.).
• The group or groups that manage the output.
• The production methods and cycle for the output.
• Later we related the outputs to the repositories, users and info types they interact with.
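An output entry might then look like the following sketch; every element name and value here is an illustrative assumption rather than an actual HBS output.

    <!-- Hypothetical output entry; names and values are illustrative. -->
    <output id="out-faculty-pubs" type="web site">
      <name>Faculty Publications Site</name>
      <description>Lists recent faculty publications</description>
      <url>http://www.example.org/publications</url>
      <managedBy>Marketing</managedBy>
      <productionCycle>weekly</productionCycle>
      <relatedRepository ref="repo-faculty-db"/>
    </output>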

Construct the Current State Model

The current state comprises a tremendous amount of detailed information. Each repository with all its interactions had to be diagrammed, and the current state as a whole had to be diagrammed. A computer screen was simply not the place to view such large and detailed diagrams. Instead, we chose to devote a large wall in our office to the task. We organized our information and its display as follows. We created the following paper outputs:

• Role descriptions for each contributor and consumer.
• A description for each data source repository that information is acquired from.
• A description for each output (web site, print pub, etc.).
• A diagram for the internal data structure of each repository (schema, folder structure, data dictionary and so on).

We laid out all the paper printouts on a large roll of paper, adhering the printouts with removable gum so that the diagram could be modified over time. We used string to indicate where repositories are connected and exchange data.

This method, while “retro” and physically bound to one location, provided huge advantages over viewing the model on the computer screen. Using the wall, we were able to:

• Zoom in and out effortlessly over the entire model.
• Collaborate easily and continuously on the model.
• Remember and internalize the model we had created. When presented as a physical landscape, the model engaged tactile and spatial memory. The position on the wall and the memory of placing objects helped us remember and discuss the model even when it was not in front of us.
• Present the model to others in an effective and compelling way. We had many great conversations in front of the wall with executives, staff and peers.

Figure 6: The top level elements of our model of an output

Each repository has its own way of describing the information it contains. With the whole system laid out in one place, we were able both to learn the vocabulary of each repository and to contrast it with related terms in other repositories. The fully described current state led quickly and naturally to deep discussions and decisions about the desired state.

In addition to the physical model, we logged all the same information in our XML Framework. The framework system helped us make the wall more consistent and complete, and also allowed us to produce custom views on the current state. Later, we added our desired state analysis to the system so that our complete IM Framework could be maintained and presented in any way we needed.

Step 2: Construct a Desired Information State

Conceptually, our desired state is one in which information is cleanly divided into information types based on the value of the information to HBS. In the desired state each information type has a metadata model that is broad enough to fully represent the type but simple enough to be consistently applied to all items of the type. Each type has items that are consistently tagged and well organized to enable our management needs and our users’ access needs.

More specifically, in the desired state each item of information is:

• Strictly typed.
• Originated only one time by qualified authors.
• Fully tagged.
• Distributed in the channels best suited to each of its users.
• Organized to be quickly and consistently managed by us and discovered by our users.
• Stored authoritatively and handled via known information lifecycle management practices.

Of course, there is a long way between the concept of a desired state and the reality of creating one. In the short term, we had little ability to change the input, structure or processes of a repository. We could only suggest changes to repository owners and participate in system maintenance and upgrade projects. However, in addition to impacting continuing repository development, we could also define and propagate a unified information model and user experience behind cross-repository information finding and use at HBS.

To move toward both a set of change recommendations and a unified finding and use specification, we combined the output of our current state analysis with the highest priority information needs of HBS to create a desired information state model. This model described the most important information needs, the information types that meet those needs and how items of each type were to be organized for access.

Figure 7: The entire current state model on a wall in our office

By the time we arrived at a desired state model, a lot of time had passed since our original user interviews. During that time, we made a lot of assumptions about and extensions to what we heard. To validate our assumptions and assure that our model could really meet the needs of our intended users, we conducted a set of follow-up interviews where we got feedback on our model, presented the changes we would propose and revised our model accordingly.

Model Information

Our framework model represented these information-related entities and relationships:

• Information types: The named kinds of information that we intended to manage.
• Elements: The parts (like title, date and ID) that each information type has. We named the elements and logged the repositories where each element is stored.
• Access structures: The organizational structures (indexes, hierarchies, cross-references and sequences) that allow users to find information.
• Controlled vocabularies: The lists users will choose from to tag information when they create it and to specify search criteria when they are looking for information.

Define Information Types

Looked at from the perspective of the current state, information is just fields in database tables. Faculty member profiles, for example, are simply the collection of all the fields in all the tables of all the repository databases where anything about a faculty member is stored. In a desired state, though, the situation could be much improved. For example, based on user needs, we can define a “Personal Profile” information type that has all the elements needed to describe each faculty member in a way that is most useful and findable to users. Regardless of where in the current state Personal Profile elements are (or are not) currently stored, users need them. Thus, our desired state model included the currently stored as well as the currently ignored elements of all information types.

Most information types, like Personal Profile, suggest themselves naturally from the purpose and structure of the current systems and the known needs of our users. Most of the remaining information types were suggested by our user analysis. For example, we identified a need for a type called Popular Press, which contains editorials, articles about faculty and other items that appear in non-academic publications. Few of the elements of this information type are in the current state; it is mostly a new type.

For each information type, we collected this information:

• An ID, description and name for the type (for example, Personal Profile).
• The group or groups that manage the information type.
• The metadata elements the type should have. For example, a Personal Profile should have a title, date, abstract, an optional video of the presentation and other elements.
• Information flow diagrams that show the current and proposed origination, storage and distribution of items of the type.
• The outputs that contain items of the type.
• The users that want or need items of the type and the business purposes to which they put items of the type were tracked in information interaction elements.
• We tracked how the information types were stored in the current-state repositories in the repository element.

Figure 8: The top level elements of our information type model
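Putting the pieces together, an information type entry might look like the sketch below. The Personal Profile type and its example elements come from the text; the tag names themselves are illustrative assumptions rather than the schema in Figure 8.

    <!-- Hypothetical information type entry; tag names are illustrative. -->
    <informationType id="personal-profile">
      <name>Personal Profile</name>
      <description>Describes a faculty member in a way that is most
        useful and findable to users</description>
      <managedBy>Knowledge and Library Services</managedBy>
      <elements>
        <element name="title"/>
        <element name="date"/>
        <element name="abstract"/>
        <element name="presentationVideo" optional="true"/>
      </elements>
      <storedIn repository="repo-faculty-db"/>
    </informationType>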

To begin to unify information across repositories, we defined a base metadata representation that spans all types. In the current state, almost all repositories represent an item’s unique identifier and title in some form. Unfortunately, there is little additional commonality in the current state that we could depend on. Still, we defined these other base fields that each item must have in the desired state.

In addition to the base metadata that all items, regardless of their information type, must have, we defined extended metadata representations that are particular to each type. Much, but not all, of the extended metadata was already captured in one or another repository.

In the end, we arrived at a reasonable set of information types. Each type shared the base tags and had its own set of extended tags. Most of the base and extended tags already existed in some repository of the current state. The tags that did not already exist would, of course, eventually have to be added, either in the existing repositories or in some other over-arching system.
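The split between base and extended metadata could be expressed as in the following sketch. Only the unique identifier and title are named as base fields in the text; the remaining field names are assumptions added for illustration.

    <!-- Hypothetical base/extended metadata split; only id and title
         are confirmed base fields, the rest are assumed examples. -->
    <baseMetadata>
      <field name="id" required="true"/>
      <field name="title" required="true"/>
      <field name="infoType" required="true"/>  <!-- assumed -->
      <field name="owner"/>                     <!-- assumed -->
    </baseMetadata>
    <extendedMetadata type="Personal Profile">
      <field name="department"/>                <!-- assumed -->
      <field name="researchInterests"/>         <!-- assumed -->
    </extendedMetadata>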

Define Access Structures

Information types define the kinds of information we intend to deliver. Access structures define the schemes we intend to use to organize and present items of information, such as in web search or as part of a newsletter. Access structures fall into these basic categories:

• Hierarchies, where information items are organized into an outline.
• Indexes, where information items are organized into alphabetically and numerically sorted lists.
• Cross references, where items are linked directly to other items.
• Sequences, where items are listed in the order they are intended to be consumed.
• Full text search, which is an index of all words in the items indexed.

We captured this information about access structures:

• An ID, name and description for the access structure.
• A type (hierarchy, index, cross reference or sequence) for the structure.
• The status of the structure (current or desired).
• The uses of the access structure to organize information types by one of their elements.
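For example, an index spanning information types might be captured as follows; the element names and values are illustrative assumptions rather than the tags in Figure 10.

    <!-- Hypothetical access structure entry; names are illustrative. -->
    <accessStructure id="as-keyword-index" type="index" status="desired">
      <name>Comprehensive keyword index</name>
      <description>Alphabetical keyword index spanning all
        information types</description>
      <organizes infoType="personal-profile" byElement="keyword"/>
      <organizes infoType="popular-press" byElement="keyword"/>
    </accessStructure>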

Figure 9: A partial list of our information types

Figure 10: The top level tags of our access structure model

Access structures can organize items of a single information type (as in an index of Faculty Presentation titles) or multiple types (as in an index to all faculty information by keyword). To decide what access structures to define for which information types, we used these criteria:

• What structures are already in use? For example, there is already an index of faculty members by name.
• What structures have users asked for? For example, users overwhelmingly asked for a comprehensive keyword index that spans information types.
• What structures can we assume users want even if they have not explicitly asked for them? For example, users will undoubtedly benefit from cross references between related presentations.

Model Information Interactions

The major premise of our work is that the business needs of HBS can be better met with well-managed information. If we can create and manage the right set of information types and provide the right access to them, users’ immediate needs and the wider business needs of HBS will be met. We began with a simple model of business needs.

We defined a business need as follows:

• An ID, title and description to identify it.
• References from one need to another. The references allowed us to represent a hierarchy of needs rather than a simple list.

Business needs are furthered by the creation and consumption of information. We called these acts “information interactions.” We considered that business needs are supported by a variety of information interactions, where one or more types of users create or consume one or more types of information.

Figure 11: A partial list of the access structures we arrived at

Figure 12: The basic definition of a need

Figure 13: The top level elements of our model of an information interaction


An information interaction had the following parts:

• An ID, title and description.
• A type (information consumption or creation).
• A status (current or desired).
• Mandatory links to one or more users and information types. Every interaction was assumed to include at least one user consuming or creating at least one type of information.
• Optional repository and output links. If the user is creating or consuming information from a repository or output, these elements are filled in.
• A reason that the interaction is important. Interactions are important either because they directly support a business need or because they support other interactions that, in turn, support business needs.
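A need and one of the interactions that supports it might therefore be logged as in this sketch; the identifiers and element names are illustrative assumptions.

    <!-- Hypothetical need and interaction entries; names are illustrative. -->
    <need id="need-promote-research">
      <title>Promote faculty research</title>
      <description>Make faculty scholarship visible to key
        audiences</description>
    </need>
    <interaction id="int-update-profile" type="creation" status="desired">
      <title>Agent updates a faculty profile</title>
      <userType ref="faculty-assistant"/>
      <infoType ref="personal-profile"/>
      <repository ref="repo-faculty-db"/>  <!-- optional link -->
      <supports need="need-promote-research"/>
    </interaction>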

Model Governance

The desired IM state has well-developed, documented and enforceable lifecycle management rules for each information type, backed by policies that require participation. To model this governance context, we defined the concept of a governance rule. Rules are:

• Policies: the requirements or “must” rules. For example, users must store assets at a named, stable, accessible location.
• Standards: the benchmarks or “measurable” rules. For example, users need to supply a complete copy of materials in a specific format with specific metadata to a specific place when an item of information is purged.
• Guidelines: the procedures or “how-to” rules. For example, users need to follow specific steps to secure copyrights and inform the organization of usage rights.

We created one rule model that could handle all three types.

Figure 14: One interaction and its relations to other interactions and to business needs


In addition to an ID, title and description, rules have:

• A type: The type is either policy, standard or guideline.
• A lifecycle stage: This element specifies where in the information lifecycle the rule applies (origination, tagging, storage, etc.).
• A status: Statuses state whether the rule applies in the current or desired state.
• Info type references: These state which info types the rule applies to.
• Rule references: References link one rule to another.
• Applications: These describe the details of how the rule applies to a particular group or other context.

Figure 15: The schema for our concept of a governance rule.
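A single rule instance under this schema might look like the sketch below, using the definitive storage policy described in the next section. The element names are illustrative assumptions rather than the schema in Figure 15.

    <!-- Hypothetical governance rule entry; names are illustrative. -->
    <rule id="rule-definitive-storage" type="policy" status="desired">
      <title>Definitive storage</title>
      <description>One authoritative copy of each item must be kept
        in a known and stable location with appropriate security
        and backup.</description>
      <lifecycleStage>storage</lifecycleStage>
      <infoTypeRef ref="personal-profile"/>
      <ruleRef ref="rule-definitive-origination"/>
      <application group="Knowledge and Library Services">Authoritative
        copies of profiles are kept in the faculty database.</application>
    </rule>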


Define Information Policy

In our model, policies are requirements for participation in or activities within information management. Our overall information policy statement sets the context for more specific policies:

“There are types of information that HBS produces that are extremely important to our success as an institution. These types of information must be created and retained in a consistent and formal manner so they can be available for the wide range of uses that contribute to our success.”

This statement defines information types and mandates that important ones be taken seriously. It creates the environment in which we could further specify stewardship requirements in a clear, concise and comprehensive way.

In addition to the IM policies we created, Harvard University and HBS have existing policies on information assurance (security), faculty approval of information and legal counsel approval of information.

Our main policies aim to establish stewardship for each information type throughout its lifecycle. In the current state, many of the types are not authoritatively defined or stored. For example, course materials were managed in a very decentralized way.

Different parts of course materials were spread out in a variety of repositories, handled in several systems with no overall coordination of the pieces and no overall oversight of the documentation produced in the context of a course.

We considered authority to be the foundation on which standards and guidelines are built. We divided authority into these three parts of the information lifecycle:

• Definitive origination. Items of each type need to be created in a consistent and controlled way. Each item must be tagged with the same metadata using the same tagging guidelines.
• Definitive storage. One authoritative copy of an item must be established. Both the metadata about an item and the item itself must be in a known and stable location with appropriate security and backup.
• Definitive retirement. The end of the active life of each type must be defined. What happens to items that reach the end of active life must also be defined. Procedures must be put in place that identify and move items into retirement.

Define Information Standards and Guidelines

Standards specify targets for users to meet, and guidelines are the procedures users should follow. To unearth the variety of standards and procedures we would need, we created information flow diagrams for each info type. Our standards and guidelines followed naturally from discussion of the flow diagrams and the policies we had established for definitive origination, storage and retirement.

Figure 16: A partial list of the rules we defined.

• Origination guidelines. How are items expected to be authored and acquired? What quantities are expected and how will quality be assured? Guidelines for establishing the base metadata values for each item, including how to create definitive IDs and useful titles. Specific guidelines for each metadata element that has a controlled vocabulary, including how taggers will know what values to apply and what automation is needed to help them.
• Storage guidelines. What database will house the definitive copy of items and where will non-definitive copies be stored? Will the definitive representation be in a single database (preferred) or spread amongst two or more databases? How will changes be propagated from the definitive to non-definitive locations?
• Retirement guidelines. What happens to items that are retired? Are they archived, deleted or retagged and allowed to remain in their original system? What triggers retirement? How are links to retired items from active items handled?

Define Current and Desired Information Flow

We created two flowchart-like diagrams showing the current and desired flow of each type of information. The diagrams were instrumental in helping us to:

• Understand the current lifecycle for each type.
• Decide how to enhance lifecycle management.
• Communicate our ideas to the organization.

Step 3: Implement the Desired IM State

With our desired state IM model in place defining what we would like to do, we were ready to put it to use. We used the desired state in two ways:

• Producing a set of recommendations that could be implemented in the short term by managers of repositories and information types.
• Defining a unified experience for people finding and using information across repositories.

Roll Out Recommendations

We identified these primary stakeholder types to whom we provided recommendations:

• Repository managers: These groups have responsibility for one or more repositories in the current IM state. Our goal is to ensure that we model their repository correctly and that they understand and accept the changes we foresee for their inputs, data models and outputs.
• Information type managers: These groups have responsibility for one or more types of information in our information model. Our goal is that they understand and accept our model of their types and the governance rules we have created for their types.

We began by identifying problems in the current state that could be addressed and opportunities for enhancing the current state so that it could do more. We then determined how best to approach the various groups to account for the easiest changes and the dependencies between changes. Finally, we met with each stakeholder group to present our recommendations and a plan for their implementation.


Identify Deficits

If items of a given information type were currently being created, tagged and delivered in a way consistent with the desired state, there was no reason to recommend any changes to the current repository or process. If, on the other hand, a repository had any of the following deficits, we noted them:

• Multiple conflicting sources, where two or more sources for the same information are redundant or provide the information in different ways to different repositories.
• Tagging deficits, where more (or sometimes less) metadata is required to completely describe a type.
• Inconsistent tagging, where some items of an information type are tagged differently than others.
• Lack of control over the vocabulary used in a metadata element.
• No or multiple master data sources for an information type.

Likewise, we noted the enhancements to governance processes that we came up with in our desired IM state, including:

• Lack of understanding or training in the people who create items.
• Insufficient quality control over the values applied to a metadata element.
• No or poorly defined procedures for retiring an item that is no longer active.
• Storage of items that are not significant and are unlikely to be needed by users.

We divided these recommendations by stakeholder group, so that a particular group need only see relevant recommendations.

Identify Opportunities

In addition to finding areas of concern in the current state, we also identified opportunities for the current state to better meet user needs. During our user interviews, we identified a comprehensive set of user information needs. Some of the needs were being well met by current systems; some were being met, but only with a lot of extra effort on the part of the users. Others were not being met at all. Combining this user analysis with our understanding of the current IM state, we formulated this set of opportunities for improvement:

• Changes to repository outputs that would decrease the amount of additional work needed to use the output.
• Giving users more direct connections to data sources so they could query for just the information they needed.
• Integrations where significant efficiency gains could be achieved by eliminating duplicative storage.

Create an Approach Plan

Recognizing that not all repository teams could or would implement all of our suggestions immediately, we formulated a plan for how to stage the changes across time and teams. The plan considered these factors:

• The impact a change has on user needs. The more positive benefit a change brought, the sooner we tried to make it happen.
• Each repository has its own schedule for maintenance and revision. We tried as much as possible to fit our suggestions into these existing channels for change.
• Each repository team has more or less ability to make changes. Many repositories rely entirely on our central IT team for technical resources and so must share resources with many others. We worked with both the repository team and central IT to decide which changes could be scheduled first.
• Metadata changes required more or less review and revision to existing items. As it was our team who would make these changes, we took our own capacity into account.
• Some metadata changes could not be easily accomplished in the existing systems but rather would be much easier in an over-arching data warehouse. For these changes we weighed their immediate importance against the increase in difficulty in implementing them in the current state.

Work with Stakeholder Groups

Our success depended in large part on the understanding and confidence we could inspire in stakeholder groups. So, we began by providing a simple but comprehensive list of services we could provide them. This list included the following categories:

• We help you find information by:
  - Identifying the information you need,
  - Defining how you can best find that information,
  - Making sure it is the most up-to-date and authoritative version.
• We help you manage information by:
  - Creating information in a way that allows it to be maximally useful,
  - Storing information in a way that assures it can be found and retained,
  - Distributing information through the channels that will benefit you most,
  - Retiring information in a way that will allow it to be used as long into the future as necessary.
• We help you account for information in your projects by:
  - Making sure the next version of your system better serves user needs,
  - Drawing upon existing sources of information, rather than creating new ones,
  - Making improvements or efficiencies in the tagging and storage of the information in your system.

We met with each stakeholder group first to discuss our process, then to present recommendations and an implementation plan, and then as needed to complete the plan. Our framework web site system included a page for each group that detailed their interaction with our model.

For the recommendation meetings we provided the following to the stakeholders:

• Our service list with an emphasis on the ones appropriate to each group.
• Discussion of how the group fit into the IM landscape we had produced (what info types and repositories they managed, what rules applied to them, what needs we believed them to have, and so on).
• A list of actions and projects we recommended.
• Proposed timelines for implementing the recommendations with deliverables, time and resource estimates.

Following the recommendations meeting, we worked with the groups to form what we called an “Information Stewardship Agreement.” The agreement specified and expressed mutual agreement to:

• The info types and repositories that the group is responsible for.
• Point people and escalation process in our group and theirs for issues that arise.
• Governance rules that apply to the group.
• Changes to the creation and storage of information.
• Assumptions about resource allocations and skills needed to complete the changes.

The stewardship agreements formed the basis of a number of projects that we moved forward on with the groups.


Specify an Integrated User Experience

In addition to helping us formulate current state recommendations, our desired state also served as the starting place for specifying a way to integrate information creation and access across repositories. More specifically, we detailed:

• A comprehensive metadata schema that encompassed the information model in our desired state.
• Definition of the processes and abilities we want to have in place for information creation.
• Definition of a unified search and results experience that can appear on any web page to give users access to assets across repositories.

We remained agnostic as to whether this unified experience could take place within the current state or would need a new data warehouse to function. Our job was not to specify the implementation of the experience but the information model and user experience in front of the implementation.

Set Goals and Constraints

To begin, we needed to frame the driving factors behind the application in a way that could help the development team keep an eye on key success factors. We began with the original goals of our initiative:

• Unify access to assets.
• Create efficiency gains.
• Increase access by target users.

Our user analysis, added to the work we did on authors, sources and outputs, also helped us formulate more specific success criteria as follows:

• Our user analysis created the user types that were of concern.
• Our information needs analysis told us:
  - What information was the most important for which users,
  - How the search experience should be structured.
• Our metadata model told us:
  - What sort of functions would be needed for authoring and acquiring information,
  - What information should appear in any results or details display.
• Our info flow diagrams told us what workflows to expect.

Specify the Creation Experience

With the goals defined, we described the function of the major subsystems of any application that could meet these goals. These subsystems included:

• Authoring: Our desired state information model told us what metadata had to be created for each information type. From this we defined the data entry, workflows and quality control features that were needed to support metadata creation.
• Storage: Our desired information state naturally yielded a logical model for metadata storage. In addition, we specified the metadata management tools and workflow support that needed to be built into any system.
• Administration: Based on the other features we specified, and also upon standard system features, we detailed our needs for reporting, user management, import/export, versioning and backup/recovery.

Specify the Finding Experience

Based on our user analysis, we expected that users would benefit most if business owners who need to expose assets on a web page could simply provide a search feature on their pages. We specified a minimal interface to support the following features:


• Federated full-text search across our metadata as well as across as much of the text in documents and other files as possible.
• Simple metadata search with one text box.
• Advanced metadata search with multiple controls that customize depending on the information types being searched.
• Browse navigation through information types based on the access structures we stored for each.
• A results list, compressed (title only) and elaborated (title, metadata and file preview), with click-through to download or view the full source file or further query the source system where possible.

As appendices to the specification we also provided our full desired state information model as well as complete sets of user needs and IM policies, standards and guidelines.

Summary and Conclusion

Our mandate was to get the most value out of the most important information assets at Harvard Business School. To determine which information types were most important, we conducted a series of interviews with key users. Then, we created a survey of the current state of information management at HBS to determine how our key information types were currently being managed. This current state analysis taught us an enormous amount and gave us a unique and powerful perspective to add to any conversation about information management at HBS.

From the current state and the user information needs we had gathered, we constructed a desired state model that described the way our key information types should be originated, tagged and stored so that they could be best delivered to the people who need them. The desired state, in turn, led to both near-term recommendations for changes to information management in the current systems and a specification for a longer-term unified creation and finding experience where data is aggregated from all systems and accessible from any web page.

The project has consolidated our position as valued experts on information collection and use and has given us a central position in the projects that create or revise information management systems at HBS. We are currently engaged with a range of repository teams to explain and advocate for our recommendations and with the HBS IT team to move forward on the long-term unification.



MiCHAEL CRAnDALL ([email protected]) is a Senior Lecturer at the University of Washington Information School, and current chair of the Masters of Science in Information Management program. Prior to coming to the University of Washington, he was technology manager for the US Library program of the Bill & Melinda Gates Foundation. Before moving to the foundation, Mike worked on search technology and knowledge organization for the intranet at Microsoft (MSWeb), and on information architecture and online library services at Boeing. He has served on the Dublin Core Metadata Board of Trustees (now the Oversight Committee) since its inception in 2001, and is active in the American Society of Information Science and Technology. His current interests include public access computing, ICT in developing countries, metadata and knowledge organization, social dimensions of knowledge transfer, and large scale information systems.

Technical and Organizational Lessons in Information Management from the Dublin Core Metadata Initiative

Building a Distributed Metadata Standards Organization

Michael Crandall



Abstract


Information management through a widely distributed network is becoming the norm in modern organizations. Methods and processes for dealing with both the technical and organizational issues associated with this reality are in their infancy, and there are few good examples of how standards organizations have developed over time to deal with some of the challenges posed by this fact of life.

Over the past 15 years, the Dublin Core Metadata Initiative (DCMI) has created an organizational infrastructure and substantive work products that allow its members to effectively collaborate on metadata standards. Operating with a very light infrastructure in combination with open collaborative tools for community dialog, DCMI has been able to achieve a remarkable international consensus on some very difficult problems. Other organizations may find the processes and methods instructive for their own efforts in developing common information management practices.


Background


One of the major challenges that organizations face in the modern world is finding ways to manage and provide access to the information that they need to accomplish their objectives. Success in this arena involves a complicated interplay between technical solutions and the organizational structures necessary to support those solutions. A well-crafted technology is of no use if users refuse to adopt it, or if resources cannot be found to support it through the lifecycle of its implementation (1, page 149-165).

The Dublin Core Metadata Initiative (DCMI) provides a remarkably open and available example of how an internationally distributed network of individuals approached the difficult problems associated with resource discovery of materials on the open Web, both from a technical and an organizational perspective. With a 15-year history to examine, the organization and its products provide a rich example of an approach that has worked to provide an increasingly rigorous technical solution, as well as an effective supporting organization.

DCMI arose early in the development of the World Wide Web as a result of frustration with the difficulty of discovering resources in the rapidly growing hyperlinked information environment. The intent was to build a simple metadata schema that could be used for machine discovery of related resources, through standardization of metadata elements and their expression in machine-readable language. Over the years, the evolving complexity of this task has led to the growth of an organization to support ongoing development work, several national and international standards, and a host of technical artifacts needed for the effective use and adoption of the standards in information systems.

While the future for DCMI (or any organization dealing with the complex problems of information management in a distributed, open environment) is still unfolding, a look back at the practices and systems that have evolved over the last 15 years provides some useful insights to anyone working in this area. These lessons may help with some of the tougher problems many of us have faced or will face in implementing metadata initiatives in our own organizations, and provide at least one roadmap for a successful approach to building consensus around difficult information management issues.


The Skills Needed to Perform This Best Practice

This type of initiative requires two distinct skill sets: one focused on leadership and management, the other on technical innovation and development. Successfully integrating the skills in these sets is a challenge, especially when you are working in a distributed, mostly volunteer, environment. Since these skills are not likely to be embodied in any one individual or role (and in fact may cross over between the two areas—for example, a technical expert might need to exhibit leadership skills at some point), I have listed them under the two categories, without assigning attributes to a specific job title. It is also important to realize that not all of these skills are needed at the same time. However, having resources available that you can draw upon when needed is a critical component of success.

The skill sets are as follows:

• Management Skills
  - Flexibility: In any organization that evolves over time, the ability to adapt to changing conditions in a way that preserves the organization's core values while responding appropriately to changes is a major asset. This is particularly true in the digital world, where advances in thinking and technology make rapid change a fact of life.
  - Leadership: Without someone to provide vision, guidance, and the necessary connections and visibility to support an initiative, it is difficult to get started or to keep going. These intangible qualities make it possible to inspire people to gather together for a cause, and are critical in creating and maintaining momentum in the project.
  - Communication: At all levels, keeping the dialog going is essential for success. Whether with key stakeholders, competitors, funders, users, or within the organization itself, effective conversations are the life-blood of the initiative. This quality is important not only within management, but also in the technical arena. Being able to describe technical advances and difficulties in a way that resonates with multiple audiences is important in an initiative like this, because the abstract nature of the work makes it hard to see the value without effective communication vehicles.
  - Marketing: Marketing is closely allied with communication, but is specifically targeted at critical partners such as funding agencies, users, and any group that is crucial to the ongoing operation and success of the initiative. Being able to describe the value of the product and the work in terms that a particular audience finds useful is the key to ongoing support and recognition.
  - Process management: The best idea in the world will go nowhere if you cannot effectively manage the work process that grows the project over time. Being able to document, integrate, plan, and allocate resources to all of the initiative's needs makes or breaks the work that is done.
  - Negotiation: Although not often called out as an essential skill, the ability to deal with multiple voices and develop acceptable compromises is a major component of a successful effort. There are always competing demands for resources and attention, and having people who are able to forge a consensus and move ahead without alienating factions or key individuals is a major asset for an organization.
  - Patience: The ability to take the long view and proceed step by step toward your goal is important, particularly in a complex, distributed, abstract effort like developing a standard approach to information management. Changes in the technical and political landscape, shifting user needs, funder demands, and a host of other variables will inevitably slow progress and introduce unanticipated complexities. Keeping your original goals in mind helps to maintain focus on the long-term results, and managing the interim steps is smoother and more productive with patience as an underlying quality.

• Technical Skills

- Balance: Making the right tradeoff between practical solutions and innovation is a particular challenge in the technical realm. Knowing when and where to develop a practical solution (even if it is not elegant), and when to drive into new territory and expand the technical possibilities, is one of the hardest choices to make on an ongoing basis. Keeping both going in the technical realm is a rare skill, and requires both technical insight and a good sense of what's possible (and needed) at the moment.

- Semantic theory: A deep knowledge of the use of language to describe objects is essential to working in the area of metadata and information organization. Background in library and information science, linguistics, graph theory, and modeling languages is necessary. Much of this work is exploratory in nature, so it is likely that someone who has these skills will have a strong network of colleagues whom they use for validation and extension of their own work.

- Abstract thinking: The ability to think in abstract terms about large information spaces, and to turn those abstractions into reality, is another essential quality for development of solutions that can scale and be extended over time. But there also has to be rigor in this thinking, because the ultimate goal of the work is to translate the abstract ideas into concrete representations that are consistent and usable in a mechanical way. A background in structured programming languages, data modeling, and logical theory is important here.

- Structured markup languages: The ability to write well-crafted, machine-readable code in one of the semantic markup languages associated with web technology (for example, RDFS, XML, XHTML, and all their associated languages) is a critical component in turning abstractions into concrete reality. Understanding the limitations and possibilities within these languages drives further development and progress.

- Web technologies: Being familiar and competent with the tools that support web services is important. Facility in the use of web development tools, authoring platforms, and emerging web technologies is integral in supporting and advancing the work.

- Software development methodology: As in any development project, good management of the development process itself is critical. Use of version control software, structured development cycles, documentation best practices, and so on, is critical in maintaining a record of the code and the evolution of the results over time.

Step 1: Identify Problems and Build Initial Solutions

In the first stages of any information management initiative, several things need to happen for your plan to become a reality. These early efforts include:

• Defining the problem.
• Identifying and developing solutions.
• Testing those solutions through application and use.

The Dublin Core Metadata Initiative (DCMI) provides a good example of these activities. Conceived as an international effort to help improve resource discovery on the World Wide Web, DCMI was one of the first attempts to provide a standard metadata schema for discovery of web content in a global context.

Define the Problem

The first item of business for any new initiative is defining the problem to address. Without a clear statement of what you are trying to accomplish, you will not get anywhere. But it is also important to make sure that you realize the full scope of the problem, and do not focus solely on one piece of it. DCMI recognized right from the start that problems have multiple dimensions, and made those dimensions explicit.

A hallway conversation at the 2nd International World Wide Web Conference in October 1994 provided the first problem statement for what eventually became DCMI: the sheer size of the World Wide Web (only 500,000 objects at the time!) was making it harder and harder to find resources (2). From this problem, the idea of a simple way to discover the web's resources by providing standard, agreed-upon, metadata elements for all of its content was born.

This conversation resulted in the National Center for Supercomputing Applications (NCSA) and OCLC Online Computer Library Center, Inc. (an international member-owned library organization) sponsoring a joint workshop to discuss these issues in Dublin, Ohio, in March 1995. About 50 people (self-described as “geeks, freaks, and the people in sensible shoes”) met to formulate the problem, to come up with an initial solution, and to develop principles that would guide further exploration. The goals of the workshop included “(a) fostering a common understanding of the needs, strengths, shortcomings, and solutions of the stakeholders; and (b) reaching consensus on a core set of metadata elements to describe networked resources” (3).

These dual goals reflect the two-pronged nature of the DCMI effort. These two perspectives—organizational and technical—allowed DCMI to grow and develop over time, rather than disappearing as so many similar initiatives have. Throughout this article we'll alternate between the two perspectives, as each is critical for success.

When you define your problem, think about the organization as well as the technology, focusing on both the human and the technological sides of the project. DCMI's early emphasis on the dual importance of technology and people has served it well throughout its history, and has provided a rich network of problem-solvers at all levels of the organization, from technical gurus to those who have to figure out how to raise money. When you are thinking about the problem you are trying to solve in your organization, remember that it's not just the technical problem that you need to address, but also the organizational and managerial structure necessary to support the technical solution over the life of the initiative. Too often, the technical problem is what drives the approach, and the supporting network that can ensure the adoption and support of the technology is left off the table.

Find Something You Can Accomplish Quickly

In any complex problem there is a danger of trying to solve too much too soon. If, on the other hand, you can find a quick win and develop something that people can use immediately, you will keep them engaged and interested in your work, and can then move on to more complex solutions. Future work can be based on the experience gained from deployment and use of the organization's first technical product.

To satisfy their second goal, “[creating] a core set of metadata elements to describe networked resources”, the first workshop participants agreed upon a set of 13 metadata elements that could be used for resource discovery across all objects currently found on the World Wide Web. The participants also drafted initial guidelines for further extension and use of these elements (3). As described in the initial report, these elements were intended to provide a bridge between the highly structured metadata representations used in the pre-web world by information providers (libraries, for example), and the automatically generated indexes created by the newly developed web retrieval tools such as Lycos and WebCrawler.

After much discussion and deliberation, this simple solution became the basis for the 15 elements (listed in Table 1) that were formalized into the Dublin Core Metadata Element Set (4) (5) (6). This element set has become the signature of Dublin Core, and we will come back to it in later stages to revisit how this simple first solution proved both a catalyst and a barrier to the evolution of the initiative over time.

Title: A name given to the resource.
Creator: An entity primarily responsible for making the content of the resource.
Subject: The topic of the content of the resource.
Description: An account of the content of the resource.
Publisher: An entity responsible for making the resource available.
Contributor: An entity responsible for making contributions to the content of the resource.
Date: A date associated with an event in the life cycle of the resource.
Type: The nature or genre of the content of the resource.
Format: The physical or digital manifestation of the resource.
Identifier: An unambiguous reference to the resource within a given context.
Source: A reference to a resource from which the present resource is derived.
Language: A language of the intellectual content of the resource.
Relation: A reference to a related resource.
Coverage: The extent or scope of the content of the resource.
Rights: Information about rights held in and over the resource.

Table 1: The DCMI Metadata Element Set (7, pages 25-35)
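To make the element set concrete, the sketch below shows how a record using a few of these elements might be emitted as HTML meta tags, the style of encoding the initiative started with. It is a minimal illustration in Python (standard library only); the record values are invented, and the exact "DC." naming convention varied across early guideline versions.

    from html import escape

    # A hypothetical record using a few of the 15 elements; all values invented.
    record = {
        "title": "Exploring Dublin Core",
        "creator": "A. Author",
        "date": "2010-11-01",
        "language": "en",
    }

    # Emit HTML <meta> tags in the "DC." prefix style used by early
    # embedded-metadata guidelines.
    for element, value in record.items():
        print(f'<meta name="DC.{element}" content="{escape(value)}">')

Running this prints one tag per element, for example <meta name="DC.title" content="Exploring Dublin Core">.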


As with most ambitious projects, the initial work focused on what was attainable—and found that even the simple solution involved compromises and unfinished business. By putting an early success on the books, however, the initiative made immediate progress, gained recognition for the effort, and maintained the interest and engagement of the people involved over time. This initial excitement carried forward into solving later problems, and the outside world had something to use that was a concrete product of the organization, giving DCMI recognition as the organization working on the problems associated with defining metadata standards and practices for the open web.

Iterate Based on Practice

It is important to realize that the first answer is only the beginning, however. By getting something done, you provide a seed for further work—where the real contributions will come. DCMI was able to use its initial momentum to explore the areas that the first solution did not address. By creating something that was a reference point, developers had a solid start to build from and to work against. Subsequent products and ideas could always be tied back to the early products, and used to extend and grow the project's reach and impact. Because the initial product was always kept in mind, the early adopters were not left behind, since later work always fit in with the earlier legacy. Again, because the organization had been given equal weight with the technical solution, the infrastructure was in place to continue work on the loose ends, and to move forward with further iterations.

For DCMI, the evidence of this approach is shown in the first workshop report's lengthy exploration of the underlying principles, unresolved issues, and steps for the development needed to move the solution forward. This foresight was one of the key components that led to the ongoing evolution of DCMI in the years ahead. That early report foreshadowed many of the downstream steps that played out over the subsequent years.

The recognition that the solution needed to be flexible enough to accommodate unforeseen needs and new developments in technology was codified in the principles of simplicity, semantic interoperability, international consensus, interdisciplinarity, extensibility, and modularity. As time passed, each of these principles was tested and expanded, and provided useful boundary markers for further developments.

Over the next several years, we developed principles for extending and refining the elements, defined rules for encoding the values to be used in the elements (or adopted and adapted rules from other standards), and created an encoding vocabulary for the Type element. The technical efforts in this stage of development were focused on creating a common vocabulary for shared metadata, and establishing the rules and procedures for extending and using that vocabulary.

Although the focus of this first workshop (and much of the subsequent work) was on developing agreement on the vocabulary and definitions of the core metadata elements, we also had to show that people could use these elements to manage information. This meant producing examples of machine-readable encodings for the elements. In keeping with the principle of syntax independence, we employed multiple encoding methods. The first iterations were simple HTML, but as more sophisticated encoding syntaxes were developed for the web, DCMI followed. Over the years, our practice has been modified to keep up with changes in the environment, and new encoding guidelines have been developed for XHTML, XML, and RDF (8).

All of this work was the result of continued dialog and discussion throughout the late 1990s. When tackling your own project, remember that the first technical solution is not a one-time effort, but needs continuous refinement and adjustment as the external environment changes, and as lessons learned from practice are incorporated into the overall design.
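As a rough illustration of that evolution in encoding syntaxes, the same kind of record can be re-expressed as RDF statements. The sketch below prints hand-rolled N-Triples against the Dublin Core element namespace; the namespace URI is real, but the subject URI and values are invented, and production encodings followed DCMI's published guidelines rather than ad hoc code like this.

    # Hand-rolled N-Triples for an invented record, using the legacy
    # Dublin Core element namespace.
    DC = "http://purl.org/dc/elements/1.1/"
    SUBJECT = "<http://example.org/docs/report-42>"  # invented subject URI

    record = [
        ("title", "Exploring Dublin Core"),
        ("creator", "A. Author"),
        ("language", "en"),
    ]

    # Each statement pairs the subject with a DC element URI and a literal value.
    for element, value in record:
        print(f'{SUBJECT} <{DC}{element}> "{value}" .')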

Step 2: Build an Organizational Structure

Along with the technical work goes something more intangible—the development of an organization to support that work, and ensure that the necessary resources for keeping it going can be provided. Building a massive organizational infrastructure at this stage is usually counterproductive; keeping things small and flexible allows the organization to test various models before settling into a more formal structure. DCMI's early organization was a loosely coupled communal structure, driven by technical needs associated with the development of the schema and the need to provide a continued forum to support this work.

Organize the Stakeholders

Fledgling initiatives are often fragile, and need help from larger organizations to provide shelter and support until they can grow strong enough to fend for themselves. It is important to find and engage with others working in the same field, and to encourage their participation in activities so that ideas and developments can be shared. Engaging parallel organizations and their members early on helps to build recognition as well as cooperation, and can provide people and ideas that become key players in an initiative's efforts. This cooperative work with others has been a constant throughout DCMI's history, and has been an effective way to leverage scarce resources, using the work of other bodies to further DCMI's work (and vice versa).

Remember the initial goal of that very first workshop—“fostering a common understanding of the needs, strengths, shortcomings, and solutions of the stakeholders”. Recognizing the importance of the parties involved in the creation and use of the solutions has been a key to success throughout the evolution of DCMI, and continues to provide a driving force in decisions made by the initiative. In the first years, the stakeholders were usually technical developers, who wrestled with the hard problems of defining and developing the infrastructure necessary to make the technical solutions work. The fact that this effort occurred within a distributed, rapidly shifting landscape made it all the more important that there was regular and open communication among the principals involved.

DCMI's series of workshops (9) was a key component of this ongoing involvement. Regular meetings allowed key stakeholders to meet face to face and share ideas; they also provided a way to make the effort visible within the broader standards community, and to forge connections with others working on similar or complementary efforts. The list of early implementers shows a wide reach across industry and academia. This widespread usage was a critical component in the adoption of the solution, and one that DCMI has been conscious of throughout the organization's lifespan.

Finding a fiscal and organizational sponsor for the fledgling initiative was key to our success, and the Online Computer Library Center (OCLC—a nonprofit, membership, computer library service and research organization) stepped up to this role. Although the bulk of the effort came from the voluntary contributions of those who were involved in the development and growth of the standard, a minimal level of financial and organizational support was essential in keeping the basic work going. Having a strong sponsor early in the game was a major component of DCMI's success.

In any large-scale information management effort, paying attention to sponsors and stakeholders pays off in the end. The sponsors and stakeholders determine the ultimate success or failure of your work, and can make or break an initiative. Cultivating relationships with them, paying attention to their needs, communicating clearly and often with them, and bringing their strengths into the picture early all help to ensure long-term success.

Instill a Vision and Find a Leader

As with most nascent efforts, one of the key elements for success is finding visible leaders who can provide excitement, keep the conversation going, and serve as lightning rods for publicity and controversy. Finding people who can listen well, be firm when necessary—keeping activities focused and on track—and attract the attention of potential stakeholders goes a long way toward giving an initiative credibility and capital.

DCMI was fortunate to have Stu Weibel, from OCLC's Office of Research, as its Director. Early in the life of the initiative, Stu emerged as a visible spokesperson. In his address at the Fifth International World Wide Web Conference (10), he described the web as “a library with the roof torn off, books piled on the floor and no way to find anything”. This controversial, thought-provoking, and engaging metaphor aroused interest in the efforts of DCMI and other similar initiatives.

As Associate Director, Eric Miller (also with OCLC) provided the technical vision and leadership that ultimately led to the creation of both the Resource Description Framework (RDF) and the framework for DCMI's own abstract model. See Steps Three and Five for further discussion of these frameworks.

Throughout the early years of DCMI's development, these two leaders provided guidance for many decisions about the initiative's scope and direction. Weibel's occasional “over my dead body” was an effective voice in keeping the initiative on a focused track, and helped to keep to a direction that made sense. The vision that they articulated in meetings and workshops allowed the initiative to move forward, while the loosely structured nature of the organization ensured that new ideas and other voices were encouraged and brought to the table to help flesh out the vision.

This leadership was a critical factor in the early days of the initiative; keep the need for strong leadership in mind in any similar initiative. It takes someone with passion, vision, and engagement to move a complex, distributed initiative forward. The flip side of this is to allow the people with ideas to work on and develop those ideas within a loose framework. This was the key to success in the early years of DCMI—the combination of strong leaders with a dynamic, loosely coupled organization that allowed innovation and ideas to develop quickly.

Every new initiative needs an identifiable leader who can carry the message out to those unfamiliar with the territory, and serve as a focal point for those working on the problem. The role of the leader evolves over time, however. See Steps Four and Six for more detail.

Provide Ongoing Communication Tools

You must address two communication streams throughout any initiative's life—external and internal. It is important to facilitate the free flow of information and ideas to instill confidence, build trust, and reach the necessary human scale to achieve ambitious goals. But it is not enough to have the conversations; the conversations need to be available for reference. DCMI was fortunate in having people who understood this, and who provided the methods necessary to both capture discussions and provide organized access to them for future work. These early ingredients were very important in allowing DCMI to grow and prosper, and extend beyond a one-time effort into the independent organization it is becoming today.

For external communications, the DCMI website (11) was an obvious place to start. Other venues included presentations at conferences, published papers, and working documents that were captured and made available through the website. This central repository of information about the initiative's work remains one of the core communication vehicles, and is used by implementers and developers throughout the community.

Early on, we recognized that internal communications were equally important. This led us to establish formally recognized Working Groups (12), set up to facilitate discussion among people interested in specific issues related to the initiative. Over the years, these working groups have been one of the most effective methods of organizing the discussions related to DCMI, and of making progress available to others interested in the areas of work. A combination of listservs, websites, wikis, teleconferences, and physical meetings served to keep communication flowing and visible. By providing a focused set of conversations, we could manage important threads separately, but everyone could see what was happening across the board.

The frequent workshops also provided face-to-face opportunities to bring these virtual communities together for cross-pollination and information dissemination. The workshops served as a way for the different groups to report on activities, share current thinking, and interact with others working in areas that might have important consequences for their own work. Building these activities into the workshops provided checkpoints in the discussions and enabled participants to document their progress against previously set goals. The visible presence of DCMI's technical and organizational leaders in these events and activities also kept conversations flowing and provided a face to the initiative.

Being conscious of your audiences and staying in touch with them is a critical factor in success. But remember that the people you talk to every day are not the only ones who need to hear about you—you must work to inform those who are unaware of your organization's existence. Being aware of and developing communication channels that reach not only your supporters, but also those you want to be your supporters, is an important component in keeping your initiative alive and thriving. Do not forget to provide multiple ways (face-to-face, virtual, and any other way you can think of) to allow your most important members—the people working on and using your products—to stay in touch with each other and with you.

Step 3: Stabilize Technical Activities

Once the early excitement and enthusiasm associated with a new project wear off, you may need a more process-oriented stage to keep things moving and provide the structure to support longer-term activities. This is true on the technical side and on the organizational side; both aspects of the project are equally important. After the flurry of development and innovation at DCMI in its first five years, it became apparent that we needed additional structure, both in the organization itself and in the technical work underpinning the development. We made a series of evolutionary changes to stabilize the progress to date, and to provide a solid foundation for future work.

Stabilize the Foundations

Whatever context you are working in, be sure to find the people who will put a stamp of approval on your efforts, and take advantage of their blessing to provide visible evidence of your success. Building something that works well is one thing; getting others to believe that it works well is another. Having a respected outside voice say that you know what you are doing can be a major asset for the future development of an initiative.


A number of national and international standards bodies formalized the 15 “Simple Dublin Core” elements that were the outcome of the initiative's early years of work. The first step in this process was submission to the IETF Request for Comments process, with the publication of RFC 2413 in 1998 (13) (subsequently updated in RFC 5013 (14)). This was followed by approval by the American National Standards Institute/National Information Standards Organization in 2001 (ANSI/NISO Z39.85-2001 (15), superseded in 2007 by ANSI/NISO Z39.85-2007 (16)), and by the International Organization for Standardization in 2003 (ISO 15836:2003 (17), superseded by ISO 15836:2009 (18)).

These international standards are now the basis for national standards in many countries, including Australia, Denmark, Finland, the Netherlands, and the UK. This codification of the early elements into a standard format provided legitimacy and stability that allowed further work to proceed, and gave users a powerful statement that this was an endorsed approach to the problem of resource discovery on the web. The standardization process also forced agreement on contentious issues and provided a clear product for the organization that could be used in discussions around next steps.

Recognizing the importance of external validation early on, and getting your early work recognized outside your immediate circle, are important steps toward providing a stable base for any effort that hopes to gain buy-in from multiple audiences. Since most information management work is built on consensus, having a way to show that you have achieved that level of agreement goes a long way toward building credibility and support for your efforts.

Extend the Technical Framework

Making it possible for others to build upon your work increases the chances of adoption, and brings in even more good minds to solve problems (and to provide feedback on what is and is not working). Once people start using your products, you become an indispensable part of the picture, and this builds your capital and provides a stronger platform for growth and expansion. This is particularly important in a global, language-based initiative that has to reach native speakers in many different parts of the world.

At this stage of development, it was becoming increasingly clear to DCMI that the 15 elements defined in the standard were not enough to satisfy all needs. As early as the second workshop in Warwick, England, the concept of a container architecture was proposed (19) as a way to logically extend the Dublin Core elements to specific applications needing other metadata. This concept grew into the idea of an “application profile,” which allowed a metadata creator to mix and match metadata elements from different sources (20), and moved toward implementation through the development of the W3C Resource Description Framework (RDF) standard (21). Because of this early emphasis on extensibility and interoperability, DCMI was well positioned to follow both these threads, making continued exploration of the technical implications possible.

At the same time, the international focus of the initiative meant that we needed translations of the English-language elements and associated documentation. We accomplished this through volunteer effort in non-English-speaking countries. To provide access to these translations, we developed a metadata registry that held the formal definitions and attributes of the original elements and refinements, as well as their translations. The current registry is an open source project hosted at the University of Tsukuba (22).

Building on (and encouraging others to build on) early technical successes allows your initial work to expand and grow. The trick is to keep a clear focus on the activities that are essential for your technical framework to grow and develop, and to make sure the work you directly support advances your initial goals. Encouraging innovation and experimentation is essential, but nurturing only those directions that provide direct support of your main reason for being keeps the initiative on track. Using partners to take up enhancement activities that are not central to your core mission lets your organization use resources efficiently, while still bringing new ideas into the mix.

Develop Support Mechanisms

The technical landscape changes rapidly these days, and documenting your responses to these changes is important in building your users' trust. Providing a way to support legacy applications, tracking changes in your documentation and products over time, and making all of this visible to your users goes a long way toward keeping people engaged and interested in your work over time. With the codification of the 15 elements as a standard, it was clear to DCMI that we needed a mechanism for managing continued changes and updates.

To address this issue from a technical perspective, we formed a Usage Board to provide a process for tracking and acting upon changes to the elements and their associated properties. Thanks largely to another leader who emerged in this arena (Thomas Baker, now the Chief Information Officer of DCMI), the Usage Board emerged as a key driver in helping the standard to mature over time. Terms submitted to and approved by this board were moved into one of the officially maintained DCMI namespaces and assigned a permanent URI, ensuring that others would be able to access and use the terms in their metadata efforts. This codification of process and provision of a stable resource allowed the initiative to build upon a secure base in the future.

In addition to changes in process management, as the standards evolved it became clear that prior versions of the standard needed to be supported to avoid breaking existing applications. This required careful attention to incorporating the requirements of those legacy implementations as the new changes were made. In some cases, this meant freezing and maintaining old namespaces. In others, it was a matter of incorporating flexibility in the logical design of new versions, allowing older versions to work in concert with new applications—though usually at reduced functionality. Again, the original principles of simplicity, semantic interoperability, international consensus, interdisciplinarity, extensibility, and modularity were invaluable in thinking through these changes and providing the necessary support mechanisms to keep existing applications stable while advances were made in thinking.

Leaving your users behind is not a good idea in any situation, particularly when they are depending upon you as their primary provider for the technology underpinning their production applications. By providing clear legacy support and documentation of migration paths for those who are ready, you'll go a long way toward keeping your users and stakeholders satisfied with your work.
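The namespace mechanics behind this stability are easy to sketch: approved terms live at permanent URIs in officially maintained namespaces, and a superseded namespace is frozen but kept resolvable so that legacy metadata keeps working. The two namespace URIs below are DCMI's actual ones; the helper function is illustrative only.

    # DCMI's two core namespaces: the original 15-element namespace (frozen
    # but still maintained) and the later, richer /terms/ namespace.
    LEGACY_ELEMENTS = "http://purl.org/dc/elements/1.1/"
    DCMI_TERMS = "http://purl.org/dc/terms/"

    def term_uri(term: str, legacy: bool = False) -> str:
        """Return the permanent URI for a term (illustrative helper)."""
        return (LEGACY_ELEMENTS if legacy else DCMI_TERMS) + term

    print(term_uri("title"))               # http://purl.org/dc/terms/title
    print(term_uri("title", legacy=True))  # http://purl.org/dc/elements/1.1/title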

Step 4: Stabilize Organizational Structure

Along with advances in technical development, the organization itself needs to make a transition from a working coalition to a formally structured body at some point. In order to support more complex technical activities, it is necessary to provide a clear assignment of responsibilities and develop a governance structure that is transparent to both members and users.

Provide a Technical Forum

When the first thrill of discovery wears off, it is tempting to move on to the next exciting challenge. If you are interested in building an organization that will last, however, you need to move through the changes that accompany the shift from initiation to sustainability. Providing mechanisms that allow the increasingly complex technical work to proceed, while still helping new users get up to speed, requires adjustments in how your group works and communicates. Being sensitive to these changes and reacting appropriately is an important component of long-term success. If your original work is of value, use your changes to celebrate that work, while including a broader audience.

For DCMI, many of these organizational changes came to a head at the 2001 Conference in Tokyo, Japan. A number of formal papers were presented at the previous workshop, and it was clear that there was a need to expand this aspect of what had been a more informal working gathering. As a result, DCMI instituted a broader and more organized meeting on an annual basis to move the initiative into its next stages. The Tokyo conference was the first of these more formal meetings, with tutorials, formal paper sessions, and workshops to address the many audiences that formed around the initiative as deployment increased. The scope of the conference also expanded to look more broadly at other metadata initiatives and standards, as well as applications using them.

This transition from a working group to a formal convening body was driven by the recognition that the audiences served by DCMI had shifted from the technical people who developed the standard to the users of the products that the initiative had created. The need to provide more context and education for users, to give implementers a chance to share their experiences and learning, and to provide formal recognition of the organization’s maturity, all played a part in the decision to move to the conference format.

All of this activity provided yet another external communication opportunity for DCMI, and raised the organization's visibility to a new level. With this added exposure, we rolled out a new website containing articles and papers by key players in the organization. This kept our message in front of the public. We worked out a formal structure for our conference, providing the opportunity for those most deeply involved in the technical and organizational side of the initiative to hold separate meetings, while using the conference as a way to reach the audience of users and developers who were key to the organization's growth.

Recognizing the need for change and acting upon this need are essential for organizational success. Assuming that existing management and supporting activities will work well without attention inevitably ends in disaster. Responding to growth, changing external conditions, and internal developments in a timely way—using effective mechanisms for communicating with users and stakeholders about those changes, and involving them in the change process—is as important to long-term success as attention to technical development and support.

Differentiate Business from Technical Structure

In the initial stages of an organization, it is easy to dismiss “management” as unnecessary overhead—in a small and enthusiastic group, communication happens spontaneously, and the excitement of innovation drives the work. But at some point, the reality of managing different people with different tasks sets in. Finding a lightweight way to handle this challenge can make or break an initiative. Starting small, building on already existing organic processes, and knowing where not to interfere are all important considerations in building a management structure. To keep the organization healthy and happy, pay attention to these considerations over time, and adjust them as the external and internal environments change.

During the year preceding the Tokyo conference, DCMI moved to build a separate structure for the business side of the organization, in tandem with making changes in the technical side of the organization, such as the formation of the Usage Board. The process of building out this structure included hiring Makx Dekkers as a paid project manager. This move was both a major change in resource allocation, and a recognition that the initiative needed a formal management structure at this stage of its evolution.

The organizational structure envisioned in the 1999 proposal (23) shows that DCMI's members (DC General and the Working Groups) are a strong voice in the organization, but that an executive group and advisory committee have been added to provide overall direction and to manage the initiative. See Figure 1, taken from Iannella and Heery (23). These first steps toward a formal business model for the organization grew rapidly, with the formation of a Board of Trustees shortly after the conference, and a formalization of the structure in bylaws and voting procedures.

To make it through the transition from startup to stability, it is important not to be afraid of administrative structure, but essential to use this structure carefully. Communicating with both users and stakeholders is critical here. Making sure that both groups' voices are heard in the structure helps to gain buy-in and support for the changes. It is never easy to impose structure on a loose arrangement, but by engaging those involved in the process, you are more likely to succeed. DCMI's model—making the broader community an integral part of the organizational structure, while still adding needed governance elements—is a good one to follow as your initiative matures.

Bring Key Stakeholders in as Owners

At some point, you will have key stakeholders who have a vested interest in your success. You must bring these people into the ownership of the organization in a way that contributes to the initiative's long-term financial and political success. Having the people who care most about what you are producing involved in your organization—both technical people and end users—goes a long way toward building a sustainable model for your work. You must balance their self-interest with some independent eyes that can bring in fresh perspectives and ideas, however, and help to mediate the inevitable conflicts that arise as different needs compete for attention.

One of the Board of Trustees’ first actions was to explore revenue generation models for the fledgling organization. It was clear that the financial sponsorship of OCLC would not continue indefinitely; the group had to find alternate mechanisms for supporting the growing operations of the initiative. Although we explored a number of proposals during those first years, the one that fit best with the current model of strong member participation was the Affiliate program, which started in 2003 with the first Affiliate member, Finland.

Figure 1: Dublin Core Organizational Structure

Engaging the standard's users as owners of the business was consistent with the initiative's early values, and provided a direct linkage between the activities of the organization and the ultimate consumers of its products. Ownership brought both the responsibility for the organization's future direction, and the ability to influence where the resources would be used. As additional affiliates were added, their role in the Board of Trustees grew, although key stakeholders with expertise in areas that were critical to the organization's functioning were also included on the Board to ensure diversity of input.

Step 5: Expand the Technical Model

Once the organization establishes a stable technical and administrative base, it can extend the depth and breadth of the activities that it supports. On the technical side, this may mean broadening your outlook to dive deeper into the conceptual underpinnings of the early solutions that you developed, and finding ways to engage with parallel efforts that extend and broaden the reach of your work.

Extend the Technical Solution to a Conceptual Model

The original vision is rarely achieved in the first attempt, but sometimes an early success can blind you to what you actually started out to do. If you go back to your early goals frequently and reassess your progress against those goals, you may find that the goals you started out with are not enough to accomplish what you want. Keeping that original vision in your mind and nurturing the longer-term work that is necessary to make it a reality are not easy things to do, but they are important for real success. This may mean evolving past what you originally thought was the solution to a problem, explaining to your users why it is not the final solution, and convincing them that the solution you are offering now is important for them. The new solution is undoubtedly going to be more complex than what you started with, so a good picture of why it is important is critical.

In the early stages of development at DCMI, the emphasis was on creating a common vocabulary and standard for metadata elements. But as early as 1997, we began thinking about a more systematic way to model the logical underpinnings of our work. This led to the formation of the Data Model working group. Much of the activity in this area ran in parallel with efforts in the W3C community to develop a mature model for representing the entities and relationships of the DCMI standard in a web environment, using the W3C Resource Description Framework and its associated activities.

The effort to underpin the initial Dublin Core element set with a logical model marked another major turning point in DCMI. While critical to further development of the standard, this modeling became increasingly abstract and difficult for many of the interested parties to follow. It is one thing to see a list of metadata elements and use those in any arbitrary system, but quite another to be asked to do logical modeling before you can build a system around them.

This work proceeded through a number of forums, moving into the Architecture Forum in 2006. This change underscored how central the modeling was to the initiative’s evolution. In spite of the complexity, this work provided a direct link to the W3C activities, and to other associated modeling work that proved essential for true interoperability in an open environment.

While not all users of the Dublin Core standard cared about this (since applications working in closed environments or with limited reach did not need to worry about interoperability except within their own boundaries), future extensions in this area depended upon a clear model for representation of the characteristics of the metadata in a logical framework, and the ambiguous verbal representations were not adequate for this purpose.

Again, the initiative was faced with the task of moving ahead with new work while still maintaining the functionality of the early implementations, to make sure that those who adopted early were not left behind. Over time, as the Abstract Model (24) was developed and formalized, care was taken to preserve the functionality of existing services, while developing the new capabilities and richness provided by a more formal specification of the properties associated with the elements.

As this process occurred, it became clear that the Dublin Core Metadata Initiative was more than the vocabulary that had been developed in its early stages; it also provided a model for using that vocabulary to make metadata independent of the particular implementation, and usable in many contexts. This realization allowed a move from focusing on terms, as in the early days, to accommodating any terms created in conformance with the Abstract Model so that they could interact with the Dublin Core. See Figure 2, taken from Crandall (25).
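The shift can be hinted at in code: instead of a flat list of named elements, a description becomes a set of statements about a resource, each pairing a property URI with a value that may itself be a URI or a literal. The sketch below is a loose paraphrase of that idea, not DCMI's formal Abstract Model; the URIs under example.org are invented.

    from dataclasses import dataclass

    # A loose paraphrase of the Abstract Model's core idea: a description is
    # a set of property/value statements about a single resource.
    @dataclass
    class Statement:
        property_uri: str  # a term URI, e.g. from the DCMI terms namespace
        value: str         # a literal, or the URI of another resource

    @dataclass
    class Description:
        resource_uri: str
        statements: list

    description = Description(
        resource_uri="http://example.org/docs/report-42",  # invented
        statements=[
            Statement("http://purl.org/dc/terms/title", "Exploring Dublin Core"),
            Statement("http://purl.org/dc/terms/creator", "http://example.org/people/42"),
        ],
    )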

The opportunity to return to your original ideas and expand them can be a double-edged sword. On the one hand, it opens up much more interesting territory for those who are intimately familiar with the initiative. On the other hand, the increasing complexity becomes somewhat overwhelming to audiences who are just discovering your work. While it is important to dive deeper into the issues that underpin your initial activities, do not forget your early adopters. Make sure that they are able to understand why you made the choices that you did, and what difference these choices can make to them in their work.

Provide Documentation and Examples

One of the best ways to help your audience understand your work is to develop clear and complete documentation. Good documentation makes it possible for users to grasp the intricacies of your work, and to see how they can apply it to their problems. Examples, documentation, guidelines, and other resources that help users understand and implement your offerings become more and more important as time goes on and you release multiple products and versions. Sometimes it takes a few tries to get the right level of explanation for different audiences, so you may have to build several different versions of your documentation and examples before you get it right, or provide different materials for different audiences. For DCMI, the additional complexity introduced by the Abstract Model required substantial documentation. We made our documentation available on the DCMI website and in numerous technical papers and articles.

The change in terminology from earlier descriptions of the grammatical principles of Dublin Core will take time to get used to, and introduces a potential source of confusion among implementers. This is yet another challenge for the organization to overcome, and one which is being addressed through the same interactive process that worked so well for the first stages of its development. Documentation is a key part of this conversation, and it is important to make it public so that discussion can point out areas for improvement or further explanation.

Figure 2: Dublin Core Metadata Element Set plus Abstract Model

Examples allow us to demonstrate the power of the Abstract Model by showing how it can be used to fully specify a metadata implementation via incorporation of administrative and technical elements. This is an evolution of the application profile concept developed in Step Three, and an important part of DCMI's work. Comparing Figures 3-5, taken from Powell (26), with the simple list of elements in Table 1 makes it clear how far DCMI has moved in the evolution of its thinking about metadata and its representation, a conceptual leap that would be hard to make without clear documentation.

As your organization develops more complex products (and their associated documentation and guidelines), it may become dependent on many parallel efforts that are working on similar, yet different, activities. Good descriptions of your work make sure that these other efforts are aware of what you are doing and can take advantage of your efforts, and vice versa. Recording and explaining what you are doing has multiple purposes, and becomes even more essential as your work becomes more intricate and varied.

Figure 3: DCMI Resource Model

Figure 4: DCMI Vocabulary Model

Figure 5: DCMI Description Set Model

Step 6: Establish Organizational Independence

Eventually, your initiative must become self-supporting. Although you may have started out under the wing of a larger organization or department, if you are successful, you will outgrow that home and either begin looking for another one or decide to strike out on your own. Preparing for that transition early on makes any changes much easier. Recognizing that independence is a logical step in your organization's growth, and treating that step with the same care and respect that you give your technical work, are important for a successful transition.

Move Toward Independent Status as an Organization

It is not enough to react to change, although reacting well is important—you must also plan for change. Change will come, and if you are not ready, you may be at a loss to deal with it. By anticipating changes in leadership, organizational directions, and funding sources, you have a much better chance of coming through those changes stronger and ready for the next ones. DCMI has been able to grow and evolve in part because scenario planning and option exploration are an ongoing part of its governance process, providing clear pathways when events transpire.

Initially, OCLC supported the DCMI initiative. After 2001, as DCMI moved from project to organization (and OCLC moved away from primary sponsorship), the initiative had to develop new funding sources, administrative structures, and supporting communication tools for its new environment. As the technical product became more structured, so did the organization. In July of 2003, Stu Weibel stepped down as the Director of DCMI and Makx Dekkers, the Managing Director, took his place. This change in leadership marked the transition from a fledgling organization to one that had reached maturity and needed to find a long-term model for existence.

One requirement for long-term viability was a steady source of funding. As noted previously, one of the first activities the Board of Trustees took on was identifying potential revenue sources. This task became more critical as OCLC set a deadline of 2008 for the long-overdue termination of its sponsorship. The solution to the funding problem was twofold: DCMI continued to grow its Affiliate Program, and developed a Partner Program which brought in funding from organizations that wanted to be recognized as Dublin Core supporters in their products or services. These changes in funding did not happen in a vacuum—they were the result of years of planning, as the organization explored possible scenarios for supporting its future growth.

Developing a formal identity was even more important than securing funding, since DCMI would no longer be a project of OCLC. This required exploring alternate governance strategies, and deciding where and how to make these changes happen. Engaging Affiliates as owners of the organization turned out to be critical here, as the Affiliates who sat on the Board of Trustees concluded that the only viable strategy was to assume ownership of the initiative themselves. The National Library Board of Singapore became the host member under the new arrangement (with the National Library of Finland as the second founding member) and began the process of restructuring the organization as an independent non-profit entity under Singapore law.

Planning ahead, engaging stakeholders in the coming changes, and exploring your options before you need to move, are all critical elements in making a successful transition to independence. There will be bumps along the way, but if you have thought ahead and have your key partners lined up to take responsibility for changes, you are much more likely to be successful in moving to a new home and a new way of doing business.


Build Self-Sustaining Administrative Structures and Processes

The organization is not the only thing that needs thought and care in the transition to independence; you must also consider the infrastructure that provides the communication and management of the resources that you use in daily activities. Making sure that you have a good plan for moving critical supporting systems to reliable homes can be as challenging as redesigning the administrative processes. Users will notice quickly if you are not maintaining your website or other communication vehicles, and poor response times or sporadic updates can do major damage to your reputation.

With DCMI's move to independence, both the organization and its entire supporting infrastructure had to find a new home. Again, the decision to involve Affiliates as owners proved to be a wise one. The National Library of Korea took responsibility for hosting and maintaining the website, and gathered all of the conference papers into a single repository which they will maintain over time (26).

With the incorporation in Singapore, the website and conference proceedings in Korea, the registry and supporting functions in Japan, and owners in Finland, New Zealand, Singapore, Korea, and the UK, DCMI has an international infrastructure for the next stage in its evolution. Crucially, the infrastructure is formalized in its bylaws and supported by its members. This distributed structure reflects the nature of DCMI itself, and provides the best chance for future growth of the initiative given the environment that it lives in.

The early emphasis on international collaboration, the engagement of stakeholders as owners, and the visibility of the initiative through conferences, the website, numerous publications, and involvement with parallel initiatives are clearly the reasons that this transition is even possible. DCMI identified its major principles early in the game, and consistently used them as the foundation of its work. These principles are displayed prominently on the DCMI website as the Principles of Operation (27):

• Open consensus building.
• International scope and participation.
• Neutrality of business purposes and business models.
• Neutrality of technology.
• Cross-disciplinary focus.

Knowing what you are good at, and using those things as the founding principles of your organization, helps to ensure that you will have the support you need to achieve your goals. Trying to be something you are not, or saying you are doing something that you cannot do, is a quick way to weaken your support and derail your activities. The lesson here is that starting principles can be the key to the success of an organization. Thinking through these principles early in the building process can pay off in major ways as time passes.

Step 7: Integrate Technology and Process

Once you have worked through the evolution of your technical solution and established a solid base for its use, you may find that the processes that wrap around the technology become a focus of attention. This is another indication of a mature technology, and it provides an opportunity to establish a more robust scenario for your audiences to use in applying your work in their own settings. Moving beyond the technology to the administrative functions needed to support its use within an organization provides another way to increase the value of your product to your users, and helps them understand how best to use it to accomplish their goals.


Map the Processes Necessary for Technical Success

Tying your products together into a coherent offering helps your initiative to keep a sense of identity, and gives your users a context to understand how each part is important. It is hard to do this in a fast-moving environment with many external dependencies, but making the attempt provides another starting point for refinements. It also provides a convenient self-check on whether what you are doing still makes sense and fits what you started out to accomplish, after all the side roads that you have explored. The exercise allows you to separate productive changes from useless ones, to clarify purpose and mission, and to help users understand how everything fits together.

As DCMI moves into a new phase of its life, the contributions it makes to the technical world are expanding to include process aspects of metadata management. This confluence is another indicator of the maturity of the organization and its products, and provides a glimpse into a future that is promising in its scope and impact.

The Singapore Framework (see Figure 6, taken from Nilsson (28)), unveiled at DC2007, provides a multi-layer model for developing and maintaining an application profile. This profile consists of functional requirements, a domain model, a description set profile, and optional usage and encoding syntax guidelines. As described in the documentation, “An application profile describes the set of guidelines, description rules, and constraints used in creating a specific set of metadata records. As semantic interoperability is provided by a correct use of terms defined in one or more vocabularies, application profiles are about providing high-level syntactic or structural interoperability in addition to the semantic interoperability.”

This framework brings together the multiple strands of technical development over the past 15 years into a single, elegant application based upon the core standards of the Semantic Web. The integration is a major step forward in understanding how to move from vocabulary-level interoperability, through machine-level interoperability, to a sustainable model for building metadata systems incorporating both. By bringing in the external environment surrounding a metadata model and incorporating both the domain and local standards into the framework, documentation and application merge into a single entity.

Figure 6: The Singapore Framework
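Read as a checklist, the framework's layers suggest what a completely documented application profile must contain. The sketch below paraphrases those layers as a simple data structure; the field names and sample content are illustrative, not DCMI's formal schema.

    from dataclasses import dataclass
    from typing import Optional

    # Illustrative paraphrase of the Singapore Framework's layers for a
    # documented application profile.
    @dataclass
    class ApplicationProfile:
        functional_requirements: str               # what the metadata must support
        domain_model: str                          # entities and relationships described
        description_set_profile: str               # structural constraints on records
        usage_guidelines: Optional[str] = None     # optional layer
        encoding_syntax_guidelines: Optional[str] = None  # optional layer

    profile = ApplicationProfile(
        functional_requirements="Support discovery of reports by title and sponsor.",
        domain_model="A Report created by an Agent and funded by a Sponsor.",
        description_set_profile="Each record: exactly one title; one or more creators.",
    )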

The Singapore Framework is young, so only time will tell where this new development will head. Yet even now, the framework illustrates the power of an ongoing organization that has kept rigorous documentation of its technical work, and shows how that work can be built upon and extended over time to create a much richer product.

Figure 6: The Singapore Framework

Build Tools to Support the Process, and Document Both

Although some might argue that metadata will never be easy, having tools that help implementers turn ideas into reality helps to make common standards digestible. Even with a model such as the Singapore Framework in hand, someone still has to turn out the code that makes things work. Providing toolsets and examples that facilitate code creation increases the chances of widespread adoption, which supports further activity and development. Again, paying attention to the people who are the ultimate consumers of your product will help focus your efforts and keep you “honest.”

Some work has already been done to provide examples of how the Singapore Framework can be used, for example the work described in a recent article on the Scholarly Works Application Profile (29). Documentation supporting Application Profile (30) and Description Set Profile (31) development is in working draft form, which allows developers to explore these new models and take the initiative further down the road. Discussion of these new tools is already in full swing, so there will be more changes and developments in the future (32). One of the advantages of exposing these early iterations of documentation is that this strategy allows users to generate rapid feedback for improvement.

In addition, we developed a formal review process for any changes to the DCMI standards. These standards are documented clearly, so that anyone wishing to develop their own application profile can work against them (33). DCMI also recently published a document for developers that describes the multiple options for interoperability that have developed over the course of the initiative's lifespan (34). These tools for using and integrating the various components of the DCMI model reduce confusion and allow greater consistency in their application.

It is remarkable to reflect on how far the technical standards within Dublin Core have come from the first 13 elements defined in that workshop back in 1995. The pathway is one that many information management applications take during their growth and development, but DCMI's growth has played out on a worldwide scale, with some of the world's best minds behind it. Watching the growth in understanding, the continual refinement and integration of early ideas into workable tools, and DCMI's increasing ability to handle more and more of the complexity surrounding the use of interoperable metadata, provides an inspiring and informative glimpse into how much effort and thought this kind of initiative requires. While most of us will never use all of what DCMI has developed, we can all learn from its insights and take advantage of its efforts.

Step 8: Expand Organizational Reach and Impact

With success come changes—some welcome, and some unwelcome. As we have seen in the preceding steps, being prepared to deal with those changes is key to your organization's success. Flexibility, preparation, and attention to organizational needs are all prerequisites for transitioning from a start-up to a more stable, ongoing organization. As complexity and size increase, the opportunity to do more significant and lasting work also increases, but you must also pay attention to the entity supporting that work. Getting big does not mean you have to stop being innovative, but it does mean you have to work harder at getting the basics right.

Formalize Independence

Rather than panicking when major changes come to your organization, use the changes as opportunities to find the strengths you have, focus on those strengths, and build on them as you move ahead. There will undoubtedly be lean times and stressful occasions, but if you can stay focused on your mission, prioritize activities that will have the most impact, and build strategies to move ahead when you have the chance, your core can stay strong and provide an anchor.


DCMI must meet both technical and organizational challenges in the coming years. Incorporating the organization as a stand-alone entity at the end of 2008 marked another new transition. The future depends on DCMI's success in forging ahead in a new, more formal environment. As DCMI acclimates itself to its new role as an independent entity with responsibility for generating revenue and managing its own future, there are bound to be organizational challenges, in addition to the technical ones described in the preceding sections. One of the first tasks faced by the newly independent organization was constructing a new governance model that fit the legal requirements for incorporation. As shown in Figure 7, taken from DCMI (35), DCMI is a much more complex entity than the first formal organization mapped out in 1999.

That complexity brings additional organizational opportunities, just as it brought new opportunities in the technical realm. DCMI has been well served by the investment of a variety of stakeholders, by the continued involvement of those stakeholders in both the evolution of the organization and the evolution of the standards that the organization produces, and by the fact that the stakeholders feel a sense of ownership towards the initiative. Only time will tell if they will continue to do so in the future.

Survive and Thrive

It should be evident that many of these steps are iterative, and you will come back to them again and again no matter what you do. Particularly in the modern world, counting on something staying the same for more than a few moments is a sure recipe for disaster. If you are conscious of both your technical and organizational focus and can keep returning to the reasons for your existence as changes occur, however, you should be able to move ahead just as DCMI has done for the past 15 years.

No organization lives in a vacuum, of course, and DCMI is no exception. Our continued growth and success depend upon maintaining sources of revenue in a tight global economy, cooperating with other standards developing organizations that have become more closely aligned with our efforts over the years, and effectively communicating the initiative’s goals and objectives to the implementers who use its products.

None of these are easy challenges, but meeting them will determine the next stage in the organization's evolution. Success will bring continued contributions to a complex area that underpins much of our modern work; failure will mean a new start by some other organization. But the legacy of DCMI will remain no matter what, providing the foundation for advances to the next level of accomplishment in organizing information for better use and interoperability in the growing digital infrastructure.

Figure 7: Current DCMI Organizational Structure


Summary and Conclusion

In some ways, the Dublin Core Metadata Initiative has been too successful. The initiative's early emphasis on defining a simple metadata schema for interoperability on the web attained rapid recognition and adoption, and established the organization's reputation as the leader in this area. But as time passed and the organization realized the depth of the task it had undertaken, the complexity of the work has sometimes confused and baffled users. Today's challenge is to help users catch up with the thinking that produced the Singapore Framework and its supporting infrastructure.

Finding financial sponsors who are willing to fund this kind of complex infrastructure activity is a challenge, and will be more so in the current fragile global economy. Yet in many ways, this is the most fruitful investment that any organization can make in the future of its information systems. The ability to create interoperable, open, international mechanisms for information exchange is the underpinning of any modern information system. DCMI has led the way in exploring these foundations, defining what it means for machines to interpret our information so that we can derive the maximum benefit from the results.

The gradual build-up of technical infrastructure, from the original simple set of metadata elements to today's sophisticated process model and underlying syntax descriptions for constructing interoperable metadata, has been a long journey, and the trip is certainly not complete. But this path should be of interest to anyone working with information that needs to be shared among distributed users through the digital pathways that we have created. DCMI's organizational and technical journey provides insights into workable models for user and stakeholder engagement, financial stability, and communication among multiple types of users and consumers.

Learning from the experience of others can be a powerful shortcut to quick results. Add to that the myriad products and insights that DCMI has offered to the world, and the last 15 years have been a remarkable journey for all of those engaged in the information universe. Thanks to this organization for giving us a model for an organizational structure that can support its efforts, for providing an open and transparent record of its progress, and for moving us ahead on the path to truly interoperable information.


References

1. Millerand, Florence and Geoffrey Bowker. Metadata Standards: Trajectories and Enactment in the Life of an Ontology. Standards and Their Stories: How Quantifying, Classifying, and Formalizing Practices Shape Everyday Life. Ed. Lampland, Martha and Susan Leigh Star. Ithaca: Cornell Press, 2009.

2. Dublin Core Metadata Initiative. History of the Dublin Core Metadata Initiative. 2009. Retrieved on March 20, 2009. http://dublincore.org/about/history.

3. Weibel, Stuart, Jean Godby, Eric Miller and Ron Daniel. OCLC/NCSA Metadata Workshop Report. 1995. Retrieved on March 20, 2009. http://www.ifla.org/documents/libraries/cataloging/oclcmeta.htm.

4. Dublin Core Metadata Initiative. Dublin Core Element Set, Version 1.0 - Reference Description. 1998. Retrieved on March 20, 2009. http://dublincore.org/documents/1998/09/dces/.

5. Dublin Core Metadata Initiative. Dublin Core Element Set, Version 1.1 - Reference Description. 1999. Retrieved on March 20, 2009. http://dublincore.org/documents/1999/07/02/dces/.

6. Dublin Core Metadata Initiative. Dublin Core Metadata Element Set, Version 1.1. 2008. Retrieved on March 20, 2009. http://dublincore.org/documents/dces/.

7. Sugimoto, Shigeo, Thomas Baker and Stuart Weibel. Dublin Core: Process and Principles. Digital Libraries: People, Knowledge, and Technology. Ed. Lim, Ee-Peng, et al. Berlin: Springer, 2002. Retrieved on April 7, 2009. http://www.springerlink.com/content/9lx5fx94743269pp/.

8. Dublin Core Metadata Initiative. DCMI Encoding Guidelines. 2009. Retrieved on March 20, 2009. http://dublincore.org/resources/expressions/.

9. Greenberg, Jane. Tutorial 1: Dublin Core History and Basics. Berlin: 2008. Retrieved on March 20, 2009. http://dc2008.de/wp-content/uploads/2008/09/tutorial-1.pdf.

10. Weibel, Stuart. Mending Our Net: Gathering, Describing, and Preserving Information in the Digital World. Fifth International World Wide Web Conference. 1996. Retrieved on April 7, 2009. http://www.w3.org/conferences/WWW5/fich_html/invited-speakers.html.

11. Dublin Core Metadata Initiative. Homepage. 2009. Retrieved on March 20, 2009. http://dublincore.org/.

12. Dublin Core Metadata Initiative. DCMI Work Structure. 2009. Retrieved on March 20, 2009. http://dublincore.org/groups.

13. Weibel, Stuart, John Kunze, Carl Lagoze and M. Wolf. Network Working Group Request for Comments: 2413. 1998. Retrieved on March 20, 2009. http://www.ietf.org/rfc/rfc2413.txt.

14. Kunze, John and Thomas Baker. Network Working Group Request for Comments: 5013. 2007. Retrieved on March 20, 2009. http://www.ietf.org/rfc/rfc5013.txt.

15. Dublin Core Metadata Initiative. ANSI/NISO Z39.85 - The Dublin Core Metadata Element Set. NISO Standards - National Information Standards Organization. 2001. Retrieved on March 20, 2009. http://www.niso.org/kst/reports/standards?step=2&gid%3Austring%3Aiso-8859-1=&project_key%3Austring%3Aiso-8859-1=354a3ff3bbaf74d3279272636fb6c36783dfad50.

16. Dublin Core Metadata Initiative. ANSI/NISO Z39.85 - The Dublin Core Metadata Element Set. NISO Standards - National Information Standards Organization. 2007. Retrieved on March 20, 2009. http://www.niso.org/kst/reports/standards?step=2&gid=&project_key=9b7bffcd2daeca6198b4ee5a848f9beec2f600e5.

17. Dublin Core Metadata Initiative. ISO 15836:2003 - Information and Documentation - The Dublin Core Metadata Element Set. International Organization for Standardization. 2003. Retrieved on March 20, 2009. http://www.iso.org/iso/iso_catalogue/catalogue_ics/catalogue_detail_ics.htm?csnumber=37629.

18. Dublin Core Metadata Initiative. ISO 15836:2009 - Information and Documentation - The Dublin Core Metadata Element Set. International Organization for Standardization. 2009. Retrieved on March 20, 2009. http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=52142.

19. Lagoze, Carl. The Warwick Framework: A Container Architecture for Diverse Sets of Metadata. D-Lib Magazine. (2(7)) 1996. Retrieved on March 20, 2009. http://www.dlib.org/dlib/july96/lagoze/07lagoze.html.

20. Heery, Rachel, and Manjula Patel. Application Profiles: Mixing and Matching Metadata Schemas. Ariadne. (25) 2000. Retrieved on March 20, 2009. http://www.ariadne.ac.uk/issue25/app-profiles/intro.html.

21. World Wide Web Consortium. Resource Description Framework (RDF) / W3C Semantic Web Activity. 2004. Retrieved on March 20, 2009. http://www.w3.org/RDF/.

22. Dublin Core Metadata Initiative. The Open Metadata Registry. Dublin Core Metadata Registry. 2008. Retrieved on March 20, 2009. http://dcmi.kc.tsukuba.ac.jp/dcregistry/.

23. Iannella, Renato, and Rachel Heery. Structure and Operation. Dublin Core Metadata Initiative. 1999. Retrieved on March 20, 2009. http://dublincore.org/documents/1999/05/31/dcmi-structure/.

24. Powell, Andy, et al. DCMI Abstract Model. Dublin Core Metadata Initiative. 2007. Retrieved on March 20, 2009. http://dublincore.org/documents/abstract-model/.

25. Crandall, Mike. DC Education: Product, Process, and People. International Conference on Dublin Core and Metadata Applications. Colima, Mexico: 2006. Retrieved on October 6, 2006. http://www.w3.org/conferences/WWW5/fich_html/invited-speakers.html.

26. Dublin Core Metadata Initiative. DCMI Conference Papers. 2009. Retrieved on March 20, 2009. http://dcpapers.dublincore.org/ojs/pubs/index.

27. Dublin Core Metadata Initiative. About DCMI. 2009. Retrieved on March 20, 2009. http://dublincore.org/about/.

28. Nilsson, Mikael, Thomas Baker and Pete Johnston. The Singapore Framework for Dublin Core Application Profiles. Dublin Core Metadata Initiative. 2008. Retrieved on March 20, 2009. http://dublincore.org/documents/singapore-framework/.

29. Allinson, Julie, Pete Johnston and Andy Powell. A Dublin Core Application Profile for Scholarly Works. Ariadne. (50) 2007. Retrieved on April 6, 2009. http://www.ariadne.ac.uk/issue50/allinson-et-al/.

30. Coyle, Karen and Thomas Baker. Guidelines for Dublin Core Application Profiles (Working Draft). Dublin Core Metadata Initiative. 2009. Retrieved on April 6, 2009. http://dublincore.org/documents/profile-guidelines/index.shtml.

31. Nilsson, Mikael. Description Set Profiles: A Constraint Language for Dublin Core Application Profiles. Dublin Core Metadata Initiative. 2008. Retrieved from http://dublincore.org/documents/2008/03/31/dc-dsp/.

32. Chaudhri, Talat. Assessing FRBR in Dublin Core Application Profiles. Ariadne. (58) 2009. Retrieved on April 6, 2009. http://www.ariadne.ac.uk/issue58/chaudhri/.

33. DCMI Usage Board. Criteria for the Review of Application Profiles. Dublin Core Metadata Initiative. 2009. Retrieved on April 7, 2009. http://dublincore.org/documents/profile-review-criteria/.

34. Nilsson, Mikael, Thomas Baker and Pete Johnston. Interoperability Levels for Dublin Core Metadata. Dublin Core Metadata Initiative. 2009. Retrieved on May 1, 2009. http://dublincore.org/documents/2009/05/01/interoperability-levels/.

35. Dublin Core Metadata Initiative. DCMI By-laws. 2009. Retrieved on April 7, 2009. http://dublincore.org/about/bylaws/.


KEitH DE LA RuE ([email protected]) works as an independent consultant in knowledge management, communication and learning at AcKnowledge Consulting in Melbourne, Australia. He previously spent eight years working in Knowledge Management at Telstra, Australia, where he led a team engaged in the transfer of product and service knowledge to Telstra’s enterprise and business sales force.

Transferring detailed, technical knowledge from one part of an organization to a non-technical audience

Building and Maintaining a Knowledge Transfer Toolkit

Keith De La Rue



Abstract


Many large organizations have subject matter experts with a deep knowledge and understanding of business-critical information. This knowledge needs to be conveyed to a target audience in another area of the organization, mostly comprised of staff with a lower level of technical expertise. In traditional organizational structures, content management, intranet, communications and training are often located in separate silos. Yet all of these areas provide tools that assist in knowledge transfer – the desired end result is an informed audience. This splitting of functions can lead to inefficiency, duplication of effort, confused messages and errors. Other critical factors impacting effective knowledge transfer are maintenance of the currency and accuracy of content, as well as the problem of knowledge hoarding.

In this article, a strategy for building a complete knowledge transfer toolkit will be described. This toolkit includes a range of individual elements, comprising content management, communications, learning and multimedia elements, coordinated as a managed program. Approaches to maintaining the currency and accuracy of content, dealing with knowledge hoarding and the relevance of social media principles will also be addressed.


Background


The organization where this approach was developed was a large, technical organization, where a problem had been identified in the effective knowledge transfer of product and service offerings from a technical product management group to a (mostly non-technical) business and enterprise sales force. Similar approaches have also been applied in other organizations.

The Skills Needed to Perform This Best Practice

The effective operation of the knowledge transfer toolkit described here ideally requires a team with a broad range of skills – and preferably multi-skilled individuals within the team. The team manager needs strong stakeholder management skills and a background in knowledge management. Skills required for the other team members include those listed below, but the actual mix and depth of these will vary, depending on the requirements of the specific business environment:

• Communication.
• Content management, database administration and some taxonomy skills.
• Training and eLearning.
• An understanding of the subject matter (not necessarily to a deep level).
• Understanding of the culture and behaviors of both the knowledge providers and the target audience.
• Change management.
• IT, multimedia and social media technology skills, or an ability to deal with the providers of the technology.

Step 1: Establish a Knowledge Transfer Team

In order to establish an effective knowledge transfer program, a core, multi-skilled team is critical. This team manages the day-to-day delivery of the knowledge program and oversees the maintenance of content. The team needs to embody a set of innovative skills and abilities, in order to embrace the entire spectrum of content, communication and learning tools used to build the knowledge transfer toolkit. The team also needs to have a strong understanding of the content matter and the ability to understand the motivations and culture of both the content providers and the target audience and to work well with them. (The skill-set is described above.)

The size of the team will vary, depending on the scope of the work involved and the size of the audience. The team composition will also vary, as some of the actual work such as video or audio production may be outsourced, but it has been found that actual content development is best handled by the subject matter experts, under guidance from the knowledge transfer team. The time taken to up-skill external staff in complex content has not been found to return any effective cost savings.

This team provides a critical function in building a sense of community among content providers and providing all required help and guidance to both the contributors and the target audience. The primary attributes team members should possess are a passion for delivering learning and knowledge to the audience and a strong desire to work with a diverse team.


Step 2: Build a Program Management Framework

The next critical step in establishing an effective program is to gain agreement with key stakeholders on a program management structure and a schedule for delivery of content. Stakeholders will include the management of the information source organizational business unit, the target audience and other support groups as necessary. In the case of transferring knowledge on product initiatives, this would include the Product Management and Sales teams and also the Marketing team.

By holding regular program management stakeholder meetings, agreement can be reached on the relative priority of initiatives and endorsement gained on the mix of media used for communicating and documenting each initiative within each cycle. These meetings are also a forum for reporting back on the results of previous initiatives and gaining continuing endorsement on the program.

The program management structure is useful for continually renewing the mandate of the knowledge transfer program, gaining corporate endorsement and budget for activities and ensuring that the program is meeting organizational objectives.

The frequency of meetings will depend on the subject matter – in the case of complex products and services, a ninety-day cycle has been found to be effective.

Step 3: Start Small; Think Big

It is helpful to start building a knowledge transfer toolkit in a contained environment – a “grass roots” or prototyping approach. This can be achieved by initially identifying a narrow scope for the content or starting with a subset of the audience. This approach will allow gradual development of the contributor community and adaptation and expansion of the knowledge transfer tools to meet emerging audience needs.

While management endorsement of the knowledge transfer toolkit is critical for long-term success, it has sometimes been found that a “big-bang,” top-down approach to implementation of similar initiatives can fail, due to resistance to the initiative from disengaged stakeholders. A more gradual introduction allows time to tailor the approach to suit organizational culture and provides better engagement of stakeholders. This will also allow “safe-fail” experiments in individual elements of the program at a low cost.

To ensure long-term success, any technologies used must be scalable, and continual management endorsement of the toolkit is critical to its long-term viability.

Although a program may commence on only a small scale, it is important to publicize and promote the program widely and to highlight all successes along the way. Further aspects of this will be discussed in the next step.

Step 4: Deal with Knowledge Hoarding

All content that is provided through the knowledge transfer toolkit should aim at meeting the understood and agreed upon organizational objectives. Content contributors need to be able to understand the importance and value of sharing their knowledge with others within the organization. This value should be explained to contributors in terms that will make sense to them.

It has been found that knowledge hoarders often complain about the number of times they are contacted by those needing access to their knowledge. Their email inboxes and voice mailboxes will frequently be overflowing. In many cases, merely explaining to these people that they can reduce these demands by explicitly sharing as much of their knowledge as possible will encourage better knowledge sharing.

A coherent rewards and recognition program is useful, provided that rewards are clearly linked to the objective value of the contributions made. Rewards may be linked to salary packaging or bonuses, but financial rewards are not the only relevant motivator. As much as possible, contributors should be given some sense of community, where recognition of key contributions can be made publicly through group communications or meetings.

A rewards and recognition program may also be linked with an internal branding campaign. The commencement of a new knowledge transfer program is a good opportunity to hold some formal launch activities on a fairly broad basis, to ensure that all stakeholders are aware of both the benefits and requirements of the program.

One approach to linking the rewards and recognition to clear deliverables is to use a “knowledge points” system, where each person's contributions to the knowledge transfer toolkit over a period of time can be aggregated. The total points earned by each contributor can then be translated into a component of his or her salary package. Rewards can also be non-monetary and may include a range of gifts, vouchers or branded merchandise. A community newsletter can also be used to announce achievements. The intent is to develop a strong community spirit. In one instance, a readiness to share knowledge continued even after organizational change and the end of a formal branding campaign, because a cultural pattern had been developed and embedded in the organization.
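As a rough illustration only, the sketch below shows how such a points scheme might be tallied; the point values, contribution types and log data are all hypothetical.

    from collections import Counter

    # Hypothetical point values per contribution type.
    POINTS = {"document": 10, "video_script": 15, "briefing_session": 20}

    # Contributions logged over one cycle: (contributor, type).
    log = [
        ("alice", "document"),
        ("alice", "briefing_session"),
        ("bob", "document"),
        ("bob", "document"),
    ]

    totals = Counter()
    for contributor, kind in log:
        totals[contributor] += POINTS[kind]

    print(totals)  # Counter({'alice': 30, 'bob': 20})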

Statistics on content use and other helpful information should also be provided to contributors. A clear expectation should be built from the outset that knowledge sharing is a fundamental part of each contributor's day-to-day job, and the business need of the audience to access this content should be stressed.

Access to all knowledge transfer tools should be made as easy as possible, to ensure that no barriers are placed in the way of knowledge sharing. Wherever possible, the provision of content should also become part of standard business processes; for example, the provision of information to a sales force should be a required step in product development prior to the launch of new products. This will require strong management support of the knowledge transfer toolkit.

Step 5: Provide a Central Knowledge Library

The central element of the knowledge transfer toolkit is an online knowledge library. This knowledge library provides a “pull” reference source for the target audience. In order to ensure that audience members have easy access to all needed information, a single, central repository that includes all required content is important. (In a sales environment particularly, where time is of the essence, it has been found repeatedly that having content accessible in a single location is vital.)

The library should be capable of storing audio and video, as well as conventional documents of any format, to ensure that all audience needs and learning styles can be catered to from a single source. The library should include content intended for the use of the target audience members personally, as well as collateral for presentation to customers. The content needs to be reliable, accurate and kept up to date (this will be further discussed in the step “Maintain Content Accuracy and Currency” below).


In order to provide information in a predictable format, standard templates can be provided in a defined range of content types. These templates will be structured into sections to assist the content authors to document technical content in a consistent way that will meet the needs of the target audience. Sufficient information on structure and formatting should be included in the templates, and additional documentation should be included in the form of guides, as required.

Templates can be provided not only for written documents, but also for audio or video presen-tation scripts and face-to-face or online training material.

A standard set of documents for product information may include the following items:

• Brief overview product information document;
• Detailed product sales guide, highlighting features, functions and benefits;
• A more detailed technical guide, if necessary, describing technology details;
• Pricing guide;
• List of key contact people;
• Customer presentation;
• Customer proposal supporting information;
• Application form, if required, or links to online application.
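To make the template idea concrete, here is a minimal sketch of a completeness check that could run when a document is submitted; the content types and section names are hypothetical.

    # Hypothetical required sections per content type.
    TEMPLATES = {
        "sales_guide": ["Overview", "Features and Benefits",
                        "Pricing", "Key Contacts"],
        "overview": ["Overview", "Key Contacts"],
    }

    def missing_sections(content_type, sections):
        """Return template sections absent from a submitted document."""
        return [s for s in TEMPLATES[content_type] if s not in sections]

    print(missing_sections("sales_guide", ["Overview", "Pricing"]))
    # ['Features and Benefits', 'Key Contacts']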

The primary information that a sales person will need on a product set is focused in two key areas:

• Business drivers: What are the customer needs that this product will address?
• Value proposition: What does this product offer (over competing products) that will be of interest to my customer?

The technology chosen for the library should provide the capability of indexing and searching content and building workflow rules for loading and maintaining content (described further below). Depending on the nature of the target audience, remote access to the library may also be important. If the workforce is mobile, access via handheld devices has been found to be highly valuable.

Step 6: Develop Content Access Methods and Taxonomy

Content should be accessible via a simple taxonomy addressed to the needs of the target audience. As the major purpose of the content is to inform the audience, the taxonomy structure needs to be aligned with the audience's view of the content, which may differ from the view of the information providers. It is important that the knowledge transfer team has the authority to structure content in the way that best suits the audience's needs.

A faceted taxonomy is useful for providing different methods of access for different needs within the target audience. Trees may be used for grouping content where required. For example, a sales team will want content to be grouped in a way that aligns with customer demands, even though products meeting those demands may be provided on different technology platforms that are provided by different product teams. Some key indices that may be used in this environment are:

• Product name (simple list, for quick access, where the product name is known);
• Business solution (a shallow tree, used for grouping products to meet known customer needs, where a specific product name is not known);
• Content type (simple list, as per the document type list above, plus others as required).

A comprehensive search function is also important.
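A faceted index of this kind can be illustrated with a minimal sketch: each entry carries one value per facet, and any combination of facets can be used as a filter. The entries and facet names below are hypothetical.

    # Hypothetical library entries, each tagged with one value per facet.
    entries = [
        {"title": "Broadband Sales Guide", "product": "Broadband",
         "solution": "Connectivity", "content_type": "sales_guide"},
        {"title": "Hosting Pricing Guide", "product": "Hosting",
         "solution": "Data Centre", "content_type": "pricing_guide"},
    ]

    def find(**facets):
        """Return entries matching every supplied facet value."""
        return [e for e in entries
                if all(e.get(k) == v for k, v in facets.items())]

    print([e["title"] for e in find(solution="Connectivity")])
    # ['Broadband Sales Guide']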


The needs of the target audience are paramount; a sales audience requires rapid access to content, in order to meet customer demands. Accordingly, the taxonomy and search functions must be simple and fast. Access should also be fast and transparent to the audience. If the security policies of the organization permit, then anonymous reader access is recommended.

The provision of alerts on new and updated content is also helpful to the audience. This should be provided on an opt-in basis and can use an email subscription service or RSS feed, depending on available technology. Audience members should be able to select content categories of interest by using the same audience-focused taxonomy to select topics for alerts.
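Re-using the same facets for alert subscriptions keeps the opt-in model simple; a minimal sketch, with hypothetical subscribers and entries:

    # Hypothetical subscriptions: subscriber -> facet values of interest.
    subscriptions = {
        "carol@example.com": {"solution": "Connectivity"},
        "dave@example.com": {"content_type": "pricing_guide"},
    }

    def recipients(entry):
        """Subscribers whose chosen facet values all match a new entry."""
        return [who for who, facets in subscriptions.items()
                if all(entry.get(k) == v for k, v in facets.items())]

    new_entry = {"title": "Broadband Pricing Update",
                 "solution": "Connectivity", "content_type": "pricing_guide"}
    print(recipients(new_entry))
    # ['carol@example.com', 'dave@example.com']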

Step 7: Provide Other Tools

To supplement the “pull” nature of the central repository, another important element of the toolkit is a simple, regular, “push” newsletter. This provides regular updates on new initiatives and activities that are relevant to the audience. This may be weekly and would be most appropriately delivered on a standard intranet site. A summary of each week's issue should be displayed on the front page, with links to click through for more detail on each article. Earlier issues should be accessible via an archive on the site, for later historical reference if required. The audience's attention can be drawn to each week's issue via a simple email or RSS feed focused on the main topics of the issue. This should provide sufficient information for audience members to determine if topics relevant to their role appear in the issue. If not, then they will have no need to click through to the site.

The newsletter is distinct from the subscription service on the central library; the subscription service only provides an alert that a new document has been loaded, whereas the newsletter provides high-level information on new initiatives in line with the priorities determined by the program management framework.

The newsletter site should focus only on overview and transitory information; content is written once and archived after use. It is not updated. All details and changes should be updated in reference material on the central library, which can be linked to from the newsletter. Other tools in the toolkit can vary from time to time, depending on audience needs, available technology, cost/benefit and other factors. Some suggested tools include:

• Regular combined web and audio conference briefing sessions, using scripted presentations by subject matter experts with accompanying slide presentations. Provision can be made for questions and answers to be included in the program. The sessions can be recorded and edited for later access (as audio-only or video files) via the central library. Used with the subscription service mentioned above, this can also provide a podcast or vodcast function.
• An online quiz, providing a high-level overview of a topic in a fun learning environment. This would be supported by summarized source materials, and provide prizes for speed and accuracy in answering a range of simple questions. This approach is effective for desk-based staff.
• Pre-recorded, scripted audio or video information sessions. These should be fairly brief and ideally professionally produced for optimal impact. They can be provided via the central library or by using other platforms to suit the audience. Compiled programs could be provided on audio CDs for field staff on the road or as audio podcasts for playing on mp3 devices.
• Fully-integrated e-learning modules can also be used, but these can be very resource-intensive to develop to an adequate standard.
• Face-to-face training sessions, with a brief, focused delivery of key content that includes some form of knowledge reinforcement activities, such as a role play, quiz or question and answer session.
• Mobile phone SMS broadcasts, for brief staff messages.
• Blogging by key subject matter experts.

Step 8: Maintain Content Accuracy and Currency

The most critical issue in ensuring content currency is ownership. Each entry in the central library should have the name of one owner clearly – and publicly – recorded. The accountabilities of ownership include content accuracy and timely updates. To this end, the owner of each entry is ideally a subject matter expert on the content of the entry. This may be the original author of the content or another expert as appropriate. All owners are also responsible for ensuring handover of their content to other relevant owners at times of staff movement.

To enable content owners to take full accountability, access to the content library should be made as easy and simple as possible. Owners should be able to access all features necessary for loading and updating their content, along with managing the status of their entries – uploading drafts, publishing and deleting at end of life. Relevant metadata should be entered at the time of entry creation and kept as simple and automated as possible.

A key requirement for content currency maintenance is an agreed upon window for review of entries. For consistency and ease of management, a single window for all content types is preferable. Aligning this with the schedule of the program management framework provides consistency. (In the example mentioned above, a 90-day window was used.)

Also critical is the ability to make workflow capabilities available so that automated reminders can be sent out to owners of entries that require updating at the end of the review window period. It is helpful to make a distinction between the database entries in the content library (which include the metadata) and the attached content stored in each entry in the form of a document or multimedia file. For effective management, each entry should contain only one attached file.

Each entry should be reviewed at least once in every review period, even if the attached content may not require updating. Owners should be given at least two reminders to review entries by the due date. After the review date, the attached content in unreviewed entries should be “hidden” or archived, so that it cannot be used in error.

Workflow tools can provide for multiple levels of checking – for example, if an attached file is unchanged after two review periods, the owner may be prompted to enter an assurance that the content has been reviewed but that updating is not required. The owner's line manager can be sent a copy of this assurance for their information.
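The review workflow just described (reminders ahead of the due date, hiding after it, and escalation when content sits unchanged for two periods) can be sketched as follows; the entry fields and reminder lead time are hypothetical, and the 90-day window follows the example above.

    from datetime import date, timedelta

    REVIEW_WINDOW = timedelta(days=90)   # single window for all content types
    REMINDER_LEAD = timedelta(days=14)   # hypothetical reminder lead time

    def review_action(entry, today):
        """Decide what the workflow should do with a library entry."""
        due = entry["last_reviewed"] + REVIEW_WINDOW
        if today > due:
            return "hide"          # unreviewed past due: hide/archive content
        if due - today <= REMINDER_LEAD:
            if entry["periods_unchanged"] >= 2:
                return "escalate"  # owner certifies; line manager copied
            return "remind"        # routine reminder to the owner
        return "none"

    entry = {"owner": "alice", "last_reviewed": date(2010, 1, 10),
             "periods_unchanged": 2}
    print(review_action(entry, today=date(2010, 4, 5)))  # 'escalate'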

Further controls may be introduced to ensure that content is updated regularly, and existing business processes should be used to support this where available. In one case, an existing management “scorecard” process was used to report on content within the process mentioned above that remained unmodified for over 12 months. The introduction of this approach produced a marked improvement in content currency.

Senior management endorsement of this process is critical to the success of this approach. Management reporting can also be provided to track overall progress.

In order for the target audience to be kept fully informed of the status of content, both the date of last review of the entry and the date the contents were last updated should be displayed.

The needs of the audience should again be paramount; a sales audience requires access to accurate, reliable content, and staff in this role have been found to have a low tolerance of content that appears to contain inaccuracies.

Step 9: Engage the Audience as Part of the Maintenance Process

While content owners have key accountabilities, the ability of the audience to support this role should not be overlooked. By providing key information to the audience, such as entry status and review dates and a simple feedback form on each entry, audience members can play an important role in overall content quality by providing feedback. This is particularly relevant when some of the audience members also have some amount of relevant subject matter expertise.

Entries for content that has not been reviewed by the required date should remain accessible (but the actual content hidden), to enable audience members needing the content to request that it be reviewed and made available. All comments on the accuracy and quality of entries should be sent directly to the content owners, and the knowledge transfer team should be able to monitor all comments. Readers may also be provided with a “rating” facility (for example, a one to five star rating), to provide public feedback on content quality.

While anonymous read access is desirable, as mentioned above, the identity of all readers providing feedback or ratings should be captured – preferably in the background through the use of a single sign-on or similar system – for later follow-up by the knowledge transfer team if required.

It is important that the knowledge transfer team provides oversight of review activity and all exchanges on content, and also that the team ensures that all issues are followed up on and entry update requirements are adhered to. This team would also be charged with ensuring that all necessary help and guidance is provided, intervening whenever required.

Step 10: Use Social Media Principles

Knowledge management as a discipline has been with us for some time, while social media is a relative newcomer. However, many of the principles at the heart of social media underpin an effective knowledge transfer toolkit. Some of the important aspects of community have been discussed above. The fundamental basis for all this is trust.

To this end, contributors must be allowed to take as much control of their content as possible. When the toolkit is created, the more open all the elements can be made, the more effective the sharing will be. While audit trails should be kept, they should operate in the background, to allow contributors direct access to content without separate logins or authentication. In this author's experience, where this trust has been extended, it has been overwhelmingly honored.

In addition, social media tools make a valuable contribution as knowledge transfer tools. Podcasting and vodcasting were mentioned above. The central knowledge library discussed above should also be built using open principles – access should be available to anyone to load or update content. Again, experience has shown that this trust is honored in practice. To this end, it is possible that a wiki platform could be used for this purpose, provided that the required supporting workflow tools can be made available.


With blogging and other social media approaches, audience engagement can be further increased, empowering the audience to have a greater impact on content quality and accuracy.

Step 11: Review and Refresh

Continual measurement and improvement of the entire knowledge transfer toolkit is a vital step in ensuring that business objectives and needs continue to be met, particularly during times of organizational change.

The audience and contributors should be continually engaged and their feedback monitored and reported on. This can be done with a range of survey and measurement tools, including social media tools, and periodic direct feedback should be provided to all stakeholders on the actions taken.

Use of the tools should also be measured using relevant approaches – hit counts on the knowledge library entries and newsletter articles, use of video and audio programs, attendance and feedback at face-to-face training, etc. It should however be borne in mind that pure hit counts and attendance are not the primary measure of effectiveness. The aim of a broad-based knowledge transfer toolkit is to provide tools to meet each learning style. Thus, not all audience members will use all tools.

Also, with targeted newsletter content, if a distributed summary includes only topics that are not relevant to an individual audience member in a particular week, then that person should not be clicking through to the newsletter site to read the detail. In this case, the absence of a hit is success. To this end, other assessments of the usefulness of the tools should be given greater weight than the mechanical quantitative measures.

Outcomes of regular reviews may include improvements in individual tools, the termination of tools that are no longer meeting audience needs or the introduction of new tools. The regular program management stakeholder meetings should be used as a forum for providing reports on the review and refresh process and seeking endorsement of changes to the toolkit as required.

Summary and Conclusion

From the experience gained in this environment, a number of key principles of successful knowledge transfer programs have been distilled:

• Build a highly capable, multi-skilled team.
• Develop a strong program management framework, with strong endorsement.
• Know the business and meet real organizational needs.
• Start small; think big.
• Know your audience and contributors, their needs and how they operate.
• Build a broad-based toolkit, ignoring traditional distinctions between disciplines.
• Manage currency and accuracy through clear ownership.
• Address accountability, behavior and engagement. Make it “the way we do things around here.”
• Exercise trust, and make it as easy as possible to share effectively. Human issues come before technology!


APOORV DuRGA ([email protected]) is an analyst with The Real Story Group where he covers Portals and Web Content Management. Before joining this company, Apoorv was a Practice Head for the Portals and Content Management Practice of Wipro Technologies where he helped clients with their web strategies. Apoorv has a Bachelor’s degree in Engineering from the Institute of Technology, Varanasi (India) and a Master’s degree in Management from the Indian Institute of Management, Calcutta (India). Apoorv is based in New Delhi, India.

How to choose when and if to use standards

Do Standards in Content Technologies Matter?

Apoorv Durga



Abstract


Because of the hype associated with standards, we often see that they end up as mere “tick marks” in analyst reports or Requests for Proposals (RFPs or “tenders”). Customers expect a product to be compliant with all relevant standards, and vendors, in their quest to score highly in their proposals, try to become “standards compliant,” even if it means that they are only partially compliant. As a result of this hype and confusion, many customers are not very clear about the relevance of standards.

Despite the many benefits of following standards, there are often trade-offs to be made, and it may not always be the case that a standards-based approach is better than an approach that is proprietary. Finding a balance between these different approaches is important to solving the problem.

This article discusses different standards as they pertain to content technologies, the benefits of following them and the myths and challenges associated with them.


Background


The word “standard” has many meanings, and as a result there are many terms that get grouped under the broader terminology of “standards.” Although there can be arguments about what defines a standard, most people agree that there are many benefits to following standards. Clearly, when technology markets see numerous competing vendors, rapid innovation and even turmoil and dislocation, analysts and consultants alike almost always counsel anxious software customers to find refuge in industry standards. But when looking more closely at standards, one will find many problems and challenges:

• There is actually a dearth of meaningful standards.
• Many norms touted as standards are not really standards but guidelines and frameworks.
• Many standards are vague, incomplete or so heavily burdened that no one applies them.
• Sometimes, complying with standards does not yield better results than applying a well-proven approach – even a proprietary approach (which arguably becomes a kind of de facto standard).

The Skills Needed to Perform This Best Practice

A broad understanding of the content technologies market will help the reader make the most of this article. These best practices are relevant for anyone interested in implementing content technologies, but most useful for the following roles:

• Requirements Analyst: This person will translate and prioritize the business requirements and propose solutions. An understanding of how one can achieve a particular functionality in a better way with the help of standards is an important aspect of prioritization.
• Technology Architect: This person will use appropriate and relevant standards to define the overall roadmap, the integration points, building blocks and so on.
• Technology/Product Evaluator: This person will ensure standards compliance, while selecting the right products and tools for business requirements.
• System Integrators: The implementation team (developers, testers) are required to actually implement according to the standards that have been chosen.

Step 1: Define “Standard”

A major reason why people find standards to be confusing is that there is often no agreement on what a standard is. There are many other terms that are grouped under the broader terminology of standards, often by vendors.

Let’s identify all these terms and make an attempt to differentiate standards from other terms.

Figure 1: Are there too many terms that get clubbed together under “Standards?”

• Laws and norms: These are usually acts defined by law and require compliance against a set of specifications. Examples are the Sarbanes-Oxley Act and Section 508. However, as a commercial enterprise, if these are not required then company software will not need to support them. Requiring a vendor to support them ensures that the software has been rigorously tested against an exhaustive list of specifications (1).
• Standards: These are a set of specifications, often with a reference implementation and usually supported by multiple vendors. The Java Content Repository standard, JSR-283, and the Java Portlet standard, JSR-286, are examples of some popular standards. Standards can be de jure (backed by law or a standards organization) or de facto (something that becomes so popular that others start adopting it as a standard). They could further be open standards (anyone can participate) or even proprietary.
• Guidelines: These are recommendations or best practices. An example of this would be the Web Content Accessibility Guidelines (WCAG), which define good practices for improving accessibility of content.
• Frameworks: These can be process oriented, like the Capability Maturity Model (CMM), or technical patterns, like JavaServer Faces (JSF) or Struts. They often simplify development by making it predictable and reusable. Frameworks often provide a higher level of abstraction than standards.
• Platforms: These are broadly implemented technologies with published Application Programming Interfaces (APIs) and documentation. Business applications as well as technical frameworks are usually built on top of these. Examples include J2EE and .NET.

All the above terms in some sense help lead to a better end result. One will likely hear about different standards in pretty much all the areas related to content technologies, such as content management, workflow, security, portals and even more specialized areas like metadata management, document formats and so on. However, it is important to differentiate and recognize that not all of them are useful, contrary to what vendors or one's favorite analysts may be saying.

Figure 2: A typical enterprise application will need to integrate and interoperate with several other applications - Portal, ERP, eCommerce and so on.

Step 2: Examine the Possible Benefits

First let's look at the good side of standards. There are many benefits to following a standards-based approach, and integration and interoperability are probably the most important. An Enterprise Content Management (ECM) or portal product is most often one of the pieces in the puzzle called Enterprise Information Management (EIM). These products need to integrate with or at least talk to other products.

For instance, a telecom service provider will likely use a Web Content Management (WCM) system to enter information about new handsets and plans. However, the WCM might need to get inventory data from an Enterprise Resource Planning (ERP) system and deliver that along with new handset information to an online portal. As another example, one might have multiple ECM repositories that expose content via a portal. Products that are based on standards should make it much easier to integrate an enterprise.
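As a toy illustration of that hand-off, the sketch below merges editorial WCM content with ERP inventory through a shared product code; the data structures are hypothetical stand-ins for whatever standards-based interface (for example, a web service) would supply them in practice.

    # Hypothetical WCM content records, keyed by product code.
    wcm_content = {
        "HS-100": {"name": "Handset 100",
                   "description": "Entry-level handset."},
    }

    # Hypothetical ERP inventory feed; in practice this would arrive
    # over a standards-based interface such as a web service.
    erp_inventory = {"HS-100": {"in_stock": 42}}

    def portal_view(code):
        """Combine editorial content and live inventory for the portal."""
        return {**wcm_content[code], **erp_inventory[code]}

    print(portal_view("HS-100"))
    # {'name': 'Handset 100', 'description': 'Entry-level handset.',
    #  'in_stock': 42}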

This is by no means the only benefit. Instead of trying to hire people skilled in multiple technologies, one need only hire people who understand the standards. Most of these technologies are niche products; it is difficult to find people skilled in proprietary technologies, and the learning curve is often steep. Standardization ensures that developers can work across products and that there is uniformity in what they develop.

If one has products that are standards-based, it is often possible to buy off-the-shelf applications that can be deployed on an infrastructure. This also means that applications can be ported to a new product or infrastructure, should the need arise. Standards like JSR-168 (and JSR-286) for developing Java-based portlets ensure that portlets written for one portal can be implemented on another portal that follows the standard. One can actually write once and re-use existing infrastructure and applications, even in the event of migration to a different technology.

A by-product of all this is that development and maintenance become simple and predictable, in turn decreasing costs. In fact, if products comply with standards, one can actually do a lot of development on open-source and free products and then deploy on commercial products. This means huge savings in licensing costs as well. Now developers can focus on actual business, instead of spending time on commodity tasks like browser testing.

Step 3: Examine the Possible Drawbacks

Even though there are many benefits, as mentioned above, there are many drawbacks as well, and one should be aware of both sides of the coin. Benefits for users do not always translate to benefits for a product vendor and vice versa. For example, standards provide a level playing field. But from a product/vendor perspective, they lose the ability to differentiate from the competition. If all products were to provide features based on standards, then it would become difficult for product vendor A to be better than B.

Some people also say that standards are meant for features that are no longer niche. One no longer buys an application server for Java v1.5 versus v1.6 compliance, which is a minor issue now. One buys products for reasons other than standards. That means what sells and differentiates products from each other are the features (which are proprietary), inroads made by the development tools, ease of using those development tools, documentation available for products/tools, expertise available in the market for those tools and so on.

Some people actually think that standards dis-courage innovation and creativity. This is more of a myth; would there be more creativity if one had to buy a different wireless card for his laptop depending on the network he wanted to access?

Whether one likes it or not, the fact is that there are non-technical factors at play behind the evolution of standards. When there are so many products and vendors with varied interests, it is no surprise that the standard in question has to cater to the majority of them. What this means is that standards end up providing a lowest common denominator, and that means one might have to sacrifice good features because they are not part of the standard.

Another reason why many standards are not as popular as they could be is lack of support by vendors. A case in point is the Java Content Repository standard JSR-170 (and now JSR-283), a specification that defines a Java-based mechanism to access content repositories. Publicly, many vendors support this, but privately many have disparaged it and have not actually implemented it.

Many standards are evolving and not yet mature. So there is also the risk that comes with using an immature and incomplete standard. The Java Portlet API, defined by JSR-168, is a good example of a standard which lacked important features like inter-portlet communication. If one wanted his portlets to communicate with each other, one had to either depend on proprietary extensions or wait for the next version (JSR-286) to implement this. Even when a standard is quite mature, it is not always clear what is mandatory and what is optional.

The huge number of standards and standard bodies only adds more confusion. This often results in parallel and competing standards that make it difficult to decide which standard is best.

Step 4: Ask the Right Questions

Will You Port Your Applications in the Future?

Portability is an important benefit achieved by following standards. Typically, any application consists of many components: the application itself, the database, the operating system, the application server and so on. If one uses a standards-based approach to build an application, one can potentially replace one or more of these components without impacting the application itself. Alternatively, one could use free and/or open-source products during development and commercial products in production, thus saving on licensing costs. The obvious tradeoff is that one cannot use the proprietary extensions provided by each of the components. So it is very important to evaluate the probability of porting from one application server to another, or from one portal server to another. Having spent a couple of million dollars on different technologies, one is probably not going to switch over too soon. Also, one can probably negotiate with vendors to bundle free development licensing, doing away with the need to use free and/or open-source products for development. Portability is not as common as some people think, so consider it carefully, and then decide whether there is a real need for that particular standard.
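To make the component-swapping argument tangible, consider the standard JDBC API as an analogy. In the sketch below (with a hypothetical "documents" table), the same code would run against an open-source database in development and a commercial one in production; only the connection URL passed in would differ.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Illustrative sketch: code written against the standard JDBC API.
    // Only the connection URL changes when swapping one database product
    // for another; the application logic itself is untouched.
    public class PortabilityExample {

        public static void listDocuments(String jdbcUrl, String user, String pass)
                throws Exception {
            try (Connection conn = DriverManager.getConnection(jdbcUrl, user, pass);
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT title FROM documents")) {
                while (rs.next()) {
                    System.out.println(rs.getString("title"));
                }
            }
        }
    }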


What Will You Integrate With?

An ECM or a portal is just one component in the enterprise architecture, and it needs to fit in with your organization's technology philosophy. This means that it has to integrate and interoperate with multiple other enterprise systems. A standards-based approach makes integration much easier. However, this also means that all the products within the technology landscape, not just the ECM or portal, must follow standards. For example, one's portal might support JSR-286, but if none of the other applications provide JSR-286 portlets, one will not see any benefits. It is important to assess and audit the technology landscape and decide which standards will help with integration and which will not.

Does Following a Standard Help with Time to Market?

A standards-based approach can help one improve time to market, because one can buy components off-the-shelf and deploy them. That is not true of all standards, though. Sometimes the reverse holds, and one can end up spending more time for the sake of following a standard than one would with a mature proprietary technology. Are you willing to forego features for standards, or willing to wait for the standard to evolve and acquire those features? An example of this is Content Management Interoperability Services (CMIS). CMIS, although technically not a standard yet, has shown a lot of promise; however, it will be some time before it evolves into a well-supported standard.

Do You Need the Extra Complexity?

Many times, simple patterns are good enough to fulfill needs, rather than complex standards. This has been shown by the rise of Representational State Transfer (REST) architectures, which essentially provide a URI-based mechanism to access content services. So one should evaluate whether one's needs can be met by such simple means, or whether the more heavyweight WS-* (Web Services) standards are actually needed.
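The following sketch illustrates the REST style just described: a content item is addressed by a URI and retrieved with a plain HTTP GET, with no WS-* machinery involved. It uses the standard Java HTTP client; the URI and the Accept header are illustrative assumptions.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    // Sketch of URI-based content access in the REST style:
    // the URI identifies the content item, and plain HTTP
    // verbs (here GET) operate on it.
    public class RestClientExample {

        public static String fetchDocument(String uri) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder(URI.create(uri))
                    .header("Accept", "application/xml")
                    .GET()
                    .build();
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            return response.body();
        }
    }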

Ask Your Vendor the Right Questions

Make sure to understand vendors' claims properly. Most vendors will say that they comply with all web services standards. In reality, however, most of them have the ability to expose content by way of web services, but not the ability to consume content from another application by the same mechanism. So it is also important to try out and verify these claims as they pertain to your scenario.

Step 5: Make Decisions

Don’t Buy BlindlyNot all standards are always required. One probably only needs some standards. So do due diligence and ask for only those standards that are required. Here are some practical scenarios where one might need a specific standard that can aid in deciding if a standard is needed or not.

Document Sharing

Content entry people create documents. They are typically business users who want a quick way of uploading documents, without the hassle of logging into a web interface. One needs to be able to let these users simply copy and paste (or drag and drop) documents, in bulk if required, through a familiar file explorer interface. One needs WebDAV (a web-based mechanism for distributed authoring and versioning, based on HTTP) or the Common Internet File System (CIFS) for such scenarios, because both of these techniques let one mount an ECM repository as a shared drive.

Many products claim to support these, but what matters is the extent to which the standards are actually implemented. For example, even though certain products claim WebDAV support, one can download documents but cannot actually check documents in and out, or trigger the rules that would normally run if one had uploaded the documents through the normal mechanism of the web interface.

Similarly, for developers who prefer to use Integrated Development Environments (IDEs) like Eclipse to code templates, instead of using a CMS's built-in editor, life is much easier if the product supports WebDAV or CIFS.
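Because WebDAV is layered directly on HTTP, even a plain HTTP client can perform basic authoring operations. The sketch below uploads a document into a hypothetical DAV-enabled repository with an HTTP PUT; a production client would also need authentication and WebDAV locking (LOCK/UNLOCK), which are omitted here.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.nio.file.Path;

    // Sketch: uploading a file into a WebDAV-enabled repository is
    // just an HTTP PUT to the document's URL. The server URL is a
    // hypothetical placeholder supplied by the caller.
    public class WebDavUploadExample {

        public static int upload(String davUrl, Path localFile) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder(URI.create(davUrl))
                    .PUT(HttpRequest.BodyPublishers.ofFile(localFile))
                    .build();
            HttpResponse<Void> response =
                    client.send(request, HttpResponse.BodyHandlers.discarding());
            return response.statusCode(); // typically 201 Created on success
        }
    }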

Supporting Multiple Portals

A need may exist to support multiple portal servers, perhaps from different vendors, for diverse scenarios, but it is undesirable to develop portlets separately for every portal server. Or one may manage content in a Web Content Management (WCM) repository but use a decoupled portal server to deliver that content: although the WCM system manages the content, users work through a portal interface instead of the WCMS's own interface. In such a scenario, one would want the portal server to support JSR-168/286 and the WCM (or ECM) system to have the ability to expose functionality as JSR portlets.

Authorization

If one uses an ECM system for managing documents, but the authorization information for both documents and users comes from another security system, one is better off if all systems, including the ECM system, support security standards like the Security Assertion Markup Language (SAML).

Aggregating and Syndicating Content

If one wants to expose multiple content repositories through a unified access mechanism, make sure that each of them supports JSR-170/283. Likewise, if there is a requirement to syndicate or aggregate content, make sure that the products can both produce and consume Atom, RSS or similar feeds.
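To illustrate how little is needed to consume a standard feed, the following sketch reads the titles from an RSS feed using only the standard Java XML APIs. The feed URL is hypothetical and error handling is omitted.

    import java.net.URL;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.NodeList;

    // Sketch: consuming an RSS feed with nothing but standard JDK
    // XML parsing. Prints the channel title followed by each
    // item title (both use the <title> element in RSS).
    public class FeedConsumerExample {

        public static void printTitles(String feedUrl) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new URL(feedUrl).openStream());
            NodeList titles = doc.getElementsByTagName("title");
            for (int i = 0; i < titles.getLength(); i++) {
                System.out.println(titles.item(i).getTextContent());
            }
        }
    }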

Figure 3: WebDAV provides an easier alternative for submitting content


Don’t Ignore the Need for Proprietary knowledgeFollowing a standards-based approach ensures the consolidation of skilled resources, and the learning curve is less steep. So instead of people having knowledge of (or learning) multiple proprietary APIs, people can code against a standard API. However, do not assume that people will not need any proprietary know-ledge. In order to be truly beneficial, one will probably need to understand the extensions of each API and how that plays with the standard. While the time to scale up resources will still be much faster, there is still some need to train people on proprietary stuff.

Pay Close Attention to Intangibles

While considering a specific standard, research who supports it, whether it has enough traction and so on. There are examples of standards that have not been able to gain much traction or support, and requiring them will probably not give you much of an advantage.

Figure 4: In many cases, the same content can be re-purposed and published to multiple destinations.


Summary and Conclusion

Standards are quite helpful and have many advantages. Some of these benefits are:

• Integration and interoperability;
• Skill rationalization;
• Faster time to market by buying off-the-shelf applications;
• Portability;
• A simpler and more predictable development lifecycle;
• Focus on business, rather than commodity tasks.

However, like everything else, one needs to be ready for quite a few trade-offs as well. These are mostly the result of immature and evolving standards, as well as factors (mostly non-technical) that hinder the widespread acceptance of standards.

Treat standards as a means to an end and not an end in themselves. One must look at standards holistically, in order to ascertain how much benefit one is likely to get and at what expense. Many people believe that a standards-based approach is always better, but by asking the right questions, as described above, one will be able to decide whether one indeed needs standards and, if so, which ones will provide the most "bang for the buck."



JOE GOLLNER ([email protected]) has been an advocate of, and practitioner with, open markup standards for over twenty years. He has participated in dozens of content management initiatives in such varied capacities as project manager, solution architect, business strategist and sponsoring entrepreneur. During these years, he has leveraged most, if not all, of the technologies available for improving how content is created, managed and deployed. He is the Director of Gnostyx Research (www.gnostyx.com), an initiative focused on advancing and leveraging open content standards, and he is also the Chief Solutions Architect (Intelligent Content Technologies) for Stilo International (www.stilo.com).

Battling Organizational Entropy with Flexible Information Management Standards

Coordinating Enterprise Information Management Investments

Joe Gollner


Abstract


An information management strategy was needed to overcome the inherent individualism (often bordering on isolationism) that afflicted the organizational units, professional disciplines and equipment program teams within the Canadian Department of National Defence (DND). It had become abundantly clear that only a coordinated approach to creating, managing, exchanging and using information would solve the massive inefficiencies evident in how the department handled its substantial holdings of complex technical documentation. It was also clear that if these problems could indeed be solved, the department would realize compelling operational benefits in addition to significant financial savings. A truly effective strategy would be one that corrected the manifest inefficiencies while sidestepping the trap of pursuing one-size-fits-all technology solutions that had already been proven, at least twice in departmental experience, to be impractical.

While establishing a core interdisciplinary team with a robust mandate and predictable funding served as an indispensable prerequisite, the key ingredient to success was the combination of an extensible information management standard with a proactive approach to engaging stakeholder groups in the evolution and application of that standard. The solution thus featured equal measures of a sound technical approach and a measured management strategy. At the end of the day, the ability to be flexible made it possible to sidestep the technical hurdles that had undone previous attempts and to engage, at the management level, a diverse range of stakeholders amid the invariable changes that beset modern organizations.


Background


The Canadian Department of National Defence (DND) is a reasonably large institution, employing approximately 150,000 personnel, that sustains a wide range of operations around the world, including humanitarian relief, peacekeeping and combat missions. As one part of these operations, DND expends several billion dollars annually on the acquisition, modification and maintenance of complex equipment systems and, as part of that expenditure, several hundred million dollars a year are directed towards the creation, management, translation and publishing of equipment documentation.

Complex Publishing Needs

When compared to many documentation environments, DND represents something of a high-water mark in terms of complexity. The reasons for this complexity are not hard to find. The Canadian military acquires and operates a state-of-the-art array of advanced weapon systems, with a relatively small number of each type of system being procured in each case. This presents a significant number of logistical challenges all by itself. Added to this is the fact that, given the broad range of missions the military performs, these systems feature numerous variants and undergo a continuous stream of modifications.

Complicating things further, the Canadian military is officially bilingual, and all documentation resources are created and maintained in both French and English. The format specifications for both print and online delivery are notable in that they attempt to allow users not only to operate equally well in either language but also to traverse back and forth between the two languages at any time. For many years, the technology vendors supporting the market for loose-leaf publishing from content maintained in the Standard Generalized Markup Language (SGML) simply shook their heads and declared DND's print output specifications to be impossible to satisfy using automation.

Two Previous Initiatives Had Failed

It is worth observing that DND was an early adopter of open content standards, embarking on two successive attempts to implement advanced content management and publishing systems to address its unique documentation needs. The first was undertaken beginning in 1985 and featured the design, development and implementation of what may have been the world's first true content management system. The second project was undertaken in the early 1990s and sought to introduce a workflow management sub-system adapted from the United States Department of Defense (DoD) Joint CALS (JCALS) solution. As background, CALS, or Continuous Acquisition and Lifecycle Support, was an initiative started by the DoD in 1985 with the goal of modernizing how technical information was acquired and managed. Both of these DND projects were unmitigated disasters that together were responsible for the loss of millions of dollars and, even worse, millions of pages of valuable technical information. Any new initiative would need to overcome this track record, as SGML had become the acronym that must not be spoken and content technologies in general had become shrouded in suspicion.

Within this context, a number of forensic studies were performed that sought to determine the magnitude of the documentation problem, to unearth the reasons for past failures and to identify a potential path forward. The results of these studies were clear: the problem of inefficiency in how documentation was managed and deployed desperately needed a solution, and millions of dollars a year could be saved if some order could be brought to the massively uncoordinated and heterogeneous environment that was the world of DND documentation.


The People Involved

It is worthwhile introducing the main players in this case study. As with any drama, there are heroes, comic jesters and villains. The players will not be directly matched to these categories, as that is best left to the reader, although making the connections probably will not present much of a challenge.

The chief protagonist in the case is the DND CALS office, the team assembled and chartered in 1993 with “making paperless material management a reality by 2002”. To achieve this objective the team would need to solve the documentation problem that had risen to such prominence in the department.

The departmental publication bureau was an important player because it officially owned the print output specifications that were so notoriously difficult to address using automated publishing processes. The bureau was also a service center operating on a cost-recovery basis, and thus was wedded, in no uncertain terms, to the status quo. Finally, the bureau was responsible for the official registration of departmental publications and their dissemination to operational units. This latter role ensured that the bureau had to be co-opted into any strategy that would change the way content was distributed to field units.

Another group of key players was made up of the equipment program teams. These teams, in many ways, constituted the "clients" of any information management strategy and solution. It was within these program teams that documentation was acquired, managed and used. The trend at the time, and it was a profoundly good trend, was towards forming integrated product teams that assembled all participants in the process of acquiring, managing and servicing a given equipment system. So it was within these teams that the authors, editors and translators would be found, as well as the equipment maintainers and managers who consulted the documentation.

Associated with the equipment program teams were equipment support contractors who provided a variety of services associated with engineering, maintenance and parts resupply. Frequently these support contractors were the original manufacturers of the systems, or the integrators who had performed major modifications to those systems. It goes without saying that these support contractors had a vested interest in making their roles unique and indispensable, so in most circumstances they viewed open content standards as a potential threat to their proprietary control.

Finally, there were the technology suppliers, who provided either technology products focused on documentation management or services associated with the deployment of those products. In a manner similar to the support contractors, these suppliers had a vested interest in seeing certain technology products adopted, and therefore viewed open standards as a threat.

The Skills Needed to Perform This Best Practice

Adapting this best practice to a different organization and a different circumstance would still require that a number of things be true, if the assembled team hopes to follow a similar path. Specifically, the team will need to exhibit a significant level of experience with the business of the organization in question and, ideally, in each of the main domain specializations that will be impacted by the change initiative. This is a prerequisite insofar as it equips the team not only with a solid foundation of knowledge and a working network of contacts but also with an indispensable measure of credibility in the eyes of stakeholders. Depending on the organizational culture in which the initiative will operate, this prerequisite can constitute a make-or-break proposition. That the team must possess a sound knowledge of the organization and its business should be obvious: how else will the team be in a position to formulate an appropriate solution? Yet numerous organizations launch into their information management initiatives by importing a team from outside, and then wonder what went wrong when the resulting investments flounder.

An additional requirement is that the team must possess some expertise with the different technologies and implementation strategies that will likely be leveraged. In this case, of central importance was experience in leveraging open content standards to integrate technologies, processes and organizations. This knowledge may reside in a subset of the team and, while important, it can be added in a number of ways. The team should also possess strong skills in project management and procurement, with the latter being especially important in this type of enterprise initiative. As outlined in the best practice case study below, procurement skills within the team, and the ability to navigate and manipulate the "black arts" of contracting, were leveraged more than once in stakeholder projects to introduce solutions when all hope had been lost.

KEY ROLES:
• Initiative Champion: Someone who is capable of engaging with executive management and responsible for steering the initiative through the political landscape.
• Initiative Leader: Someone who ensures the initiative makes progress towards its mandate and performs critical tasks quickly and effectively.
• Procurement Specialist: Someone who assists the initiative, as well as stakeholders, to perform timely and cost-effective procurements to assemble the resources needed to move ahead.
• Technology Architect: Someone who helps chart a path through the forest of standards and competing vendor claims, so that the initiative genuinely achieves its goal of facilitating business electronically across a massively diverse community of stakeholders, allies and suppliers.
• Information Architect: Someone who excavates the structures within the documentation resources to be modernized and illuminates the different approaches that might be taken to express those structures.
• Project Managers: People who lead individual projects and coordinate investment activities.

Step 1: Declare a Vision

The first step in any such initiative is the articulation of an idea. If it is done in the right way and presented to the right people at the right time, a chance to build on the idea can be secured. In this case, an eloquent Air Force Colonel picked up the damaged CALS file in 1993 and was able to distill the results of the forensic studies of the department's documentation problems into a simple but compelling message. Leveraging a few salient details, such as the prominence of documentation maintenance costs in the annual operating budgets for Canada's fleet of F18 fighter aircraft, the Colonel caught the attention of the general staff, who agreed that past failures could not be taken as an excuse to live with unacceptable practices. The green light for change was given. It was not carte blanche: the mandate was provisional, pending the formulation and approval of a complete strategy and detailed implementation plan.

This step, successful as it was in a time of budgetary stress, was not without an element of drama. In taking up the documentation issue, the Colonel had championed it as a matter to be tackled as part of a modernization of how equipment systems were acquired and managed. The logic followed that the initiative should be led by the organization responsible for engineering and maintenance, and hence for the acquisition and management of equipment systems. This in itself was a departure and, as events later proved, a vitally important one. In being a departure, however, other organizations, particularly the publication bureau, would be potentially sidelined, and this would require special attention moving forward.

KEY ACTIONS:
• Engage a champion who has strong credibility with the stakeholder community and specifically with the executive level.
• Frame a compelling proposition that provides a graphic description of the business problem to be addressed and a simple-to-understand explanation of what needs to be done.
• Provide supporting information that helps executive management assess the importance of the business problem.
• Establish where in the organization the responsibility for solving the problem most properly resides.
• Connect the proposed approach to key organizational priorities and high-visibility industry trends, so that executives appreciate the importance of the initiative.
• Seek a reasonable mandate, one that provides the initial resources to examine the problem in detail and to recommend a process for determining and deploying a solution.
• Set up a firm timeframe within which a recommended solution strategy will be defined, and set a process for gaining the approval to move ahead.

Step 2: Assemble a Core Team

Partly through skill and partly by simple good fortune, the Colonel was able, in a remarkably short period of time, to assemble a team that brought together an unusually broad range of competencies and that would be able to act effectively on the initial mandate.

As a centerpiece for the initiative, a team leader was added who was an experienced and highly respected equipment systems engineer. This individual was relatively senior within his community, and his area of specialization, radar systems, allowed him to converse with engineering specialists in all of the main equipment system elements: land, air, maritime and communications. His extensive experience in large NATO equipment programs, and with the information sharing challenges associated with them, made him strongly receptive to the new mandate of tackling the department's documentation problems. He became a second champion, one who could add the important engineering perspective to the Colonel's more managerial vantage point.

Similar benefits accrued as each new team member was introduced. One brought a wealth of procurement experience, which helped in practical ways such as contracting for services and technology. This procurement background also provided valuable insights into a key phase of the content lifecycle, since the way that documentation is managed is effectively set by how it is acquired. Another team member brought a wide-ranging familiarity with all aspects of logistics support and, as fortune would have it, this individual was just one of those people who "knew everyone". Another brought a strong background in project management and equipment system engineering, and this additional resource seemed to further bolster the reputation of the modernization team as being "grounded" and therefore capable of getting things done. Finally, an external resource was engaged as a full-time consultant. This was someone who had been actively working on similar initiatives, had performed the key forensic studies into the departmental documentation problems, had prior experience working with the key standards and technologies that would need to be leveraged, and had previously served in the military as an equipment operator.

All in all, the team was exemplary in providing the skills needed to make something of the provisional mandate. It turned out that all these individuals also brought to the table a fervent desire to improve how documentation was used in their respective domains. Some of these ingredients may not be easily replicated, and there is no small measure of good fortune behind the team's formation. What can be observed, though, is that this team benefited greatly from the broad diversity of backgrounds that was assembled and from the members' collective passion to realize change.

KEY ACTIONS:
• Engage a hands-on team leader who can give the initiative practical momentum as an important corollary to the efforts of the initiative champion.
• Ensure that the team leader further enhances the credibility of the initiative, in particular with regard to being able to deliver on promises.
• Deploy the initiative champion to address the political issues that invariably surface as soon as changes are proposed or budget allocations are made.
• Assemble a core team of three to five resources that can support the team leader by providing essential skills and expertise.
• All core team members should possess the ability to manage individual projects, conduct procurement activities and communicate effectively with stakeholders.
• Team members should collectively provide a solid background in the main operational areas where the initiative will be active.
• At least some of the team members should possess prior experience working with the standards and technologies that will likely be leveraged in the solution.
• All team members should be familiar with aspects of the business problem to be solved and should therefore share a common passion to see a solution realized.

Step 3: Elaborate the Vision

Thus far the initiative had two very important things going for it: the ear of executive management and a capable team. There was good cause for optimism. The next step was to put some meat on the bones of the initial vision and the associated provisional mandate. It was well understood by the team that the executive management overseeing this effort had been disappointed by two previous efforts and had made it clear that they looked forward to being convinced that the third attempt would succeed. The particular executives involved in this case were unusually well informed about the problems at hand, so there was a further challenge in that they were not going to be fooled by either mock quantification or fuzzy promises.

Once again fortune seemed to smile on this initiative, because when the core team set to work to elaborate the vision, they found that the failures of the past, and the forensic reviews they occasioned, provided fertile ground in which to dig for answers. These conditions, perhaps mixed with some extra good luck, allowed the team to make some penetrating observations and to strike upon a number of principles that later proved key to success.

Identify the Root Problem

The first element of the new vision was technical. Past initiatives had failed at least in part because they were not actually setting out to solve the documentation problem at all. Upon closer inspection, the preceding investments in content management had been directed towards automating the print publishing process and then streamlining the associated editorial workflow. The fingerprints of these goals could be found on every technical detail in the two defunct investments. Like forensic pathologists coming upon incontrovertible evidence of a malefactor's guilt, the team began to recognize that the two previous investments had been made under the guise of addressing the documentation problem when in fact they were only designed to help the publication bureau perform its business more efficiently. While this might have made a small contribution to the currency and quality of the publications being used in the field, in reality these investments failed because they perpetuated one of the core problems: a centralized publication process that segregated the documentation lifecycle from the equipment systems it describes and the equipment teams it supports.

Digging further into these details, the core team was able to determine that the real key to solving the documentation problem lay not in accelerating the print publication process but in rethinking equipment documentation altogether. The insight that turned everything around was the recognition that, in the past, the documentation and the equipment engineering information had effectively been managed in isolation from each other. This led to the continual lag between equipment system changes and documentation updates.

The dichotomy between equipment systems and their documentation also led to massive levels of information redundancy in the documentation sets. This meant that if one item of information was updated to reflect an engineering change, there was an overwhelmingly good chance that other information items would need to be updated as well, and frequently they were not. It turned out that no one knew how much redundancy existed in the documentation sets. Early inquiries indicated that for most equipment programs an information redundancy level of around 50% could be expected. More ominously, there were indications that for certain types of equipment fleets the redundancy levels might surpass 80%. Since every single unit of content was associated with maintenance and management tasks as well as with translation fees, the cost of unconsciously sustaining these levels of redundancy was astronomical when extrapolated across the main equipment fleets.
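A back-of-the-envelope calculation shows why redundancy levels like these translate into such large costs. The figures in this sketch are purely hypothetical illustrations, not values from the DND studies; only the 50% redundancy rate comes from the text above.

    // Hypothetical illustration of the cost of sustained redundancy.
    public class RedundancyCostSketch {

        public static void main(String[] args) {
            int contentModules = 100_000;        // modules in one fleet's doc set (assumed)
            double redundancy = 0.50;            // 50% redundancy, as cited above
            double costPerModulePerYear = 40.0;  // maintenance + translation (assumed)

            double duplicated = contentModules * redundancy;
            double annualWaste = duplicated * costPerModulePerYear;
            System.out.printf("Annual cost of redundant modules: $%,.0f%n", annualWaste);
            // ~ $2,000,000 per year for this single hypothetical fleet
        }
    }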

Frame a Business Case

Suddenly a business case began to form, based on some eye-catching numbers. Better still, the opportunity to realize savings on the cost of managing documentation could be linked directly to operational benefits such as shortened repair times, reduced false replacements, improved equipment availability and dramatically accelerated publication update cycles. Attractive as these benefits were, it seemed clear that the core elements of the business case would focus on the savings realized through the elimination of information redundancies and the curtailment of the associated inefficiencies.

On the side of functional benefits, it was determined that the central publication bureau was taking approximately 18 months, on average, to effect updates to a major publication. This meant that once an update was made it was effectively obsolete, because other change orders were already in the queue. Looking forward to the new system, it was posited that the turnaround time for a publication update could be reduced to a matter of days, and that it could be fully integrated into engineering change work orders so that, in a real sense and for the first time, there would be no lag between an engineering change and the associated publication update. This vision was not unfounded: in one striking example that took place once the new CALS solution was realized, an illustrated parts catalogue for a tracked armored vehicle went from taking 18 months to update to less than 18 minutes. The equipment publication management team in this particular example enjoyed adding that these 18 minutes in fact coincided with a coffee break, because a fully automated process formatted the parts manual from the current parts database. Adding insult to injury, at least from the perspective of the publication bureau overseeing hardcopy distribution, this equipment team was also delivering the illustrated parts catalogue in an interactive electronic format that the mechanics found infinitely more useful than the larger, printed predecessors.

With these insights in hand, the core team articulated a more complete vision, and one that proved overwhelmingly compelling to executive management. The successes that followed in this initiative, although they definitely involved a great deal of work by a large group of people, can be traced back to the quality of the insights that the team was able to articulate. The majority of the efforts that the team made henceforth benefited greatly from knowing what the real problem was and what steps needed to be taken to address it. This is even more important than it sounds, because it is common, in information management as in all walks of life, to find great industry being directed at solving phantom problems, which makes for a spectacle as sad as Don Quixote tilting at windmills.

KEY ACTIONS:
• Identify the root cause of the business problem, with the understanding that what may initially appear to be the cause of the problem, or what may be widely declared to be the cause, may not be.
• Establish what business changes will need to be made in order to resolve the root cause of the problem.
• Determine what the general character of a solution might be and how it differs from alternatives.
• Recommend a general solution and the process to be followed in realizing that solution.
• Declare a vision of how the problem can be solved and how business will be conducted in the future.
• List the financial savings and operational benefits that the solution would be chartered to realize.
• Frame a business case that distills into a clear form the objectives for the initiative, the nature of the returns being pursued and the scope of the investments to be made.

Step 4: Establish an Official Mandate

At this point, the CALS office took some very prudent, if somewhat mundane, steps. Rather than seek short-term funding for a limited number of investments, the team sought to leverage the insights gained to establish the framework for a concerted push towards solving the root problem. Two items of formal policy were prepared, and the arduous process of having official policies established was walked through. These policy items recognized the standing of the CALS family of standards, which included SGML among others, and established a process whereby equipment programs would be obliged to align how they planned to handle information with the best practices being established by the CALS office. The policy items were linked to the CALS office and its place within the equipment engineering, acquisition and management process. Their net effect was the formal integration of the initiative into the official departmental authority matrix.

The process of framing and formalizing these two policies largely involved identifying and filling gaps in the existing policy framework with respect to the widely acknowledged need for the department to accelerate its adoption of electronic means of collaborating with industry. One policy document addressed electronic commerce: it bundled the CALS mandate with the more generally visible electronic commerce concept and made it explicit that the open standards governing technical documentation should be intrinsically connected to the agreements governing all electronic transactions with industry. The second policy specifically addressed the CALS mandate, and it explicitly linked the initiative with the efforts to modernize equipment engineering and maintenance practices. The net effect of these two policy items was that CALS became formally connected to equipment acquisition, engineering and maintenance, and formally dissociated from the publication bureau and the legacy print publishing process. The high level of political interest in making investments in the area of electronic commerce helped to speed these policy items through the approval process and to override any objections raised by organizations with a vested interest in perpetuating the status quo.

One major consequence of the investments made in establishing the policy framework was that it became possible, in a way that would not otherwise have been possible, to establish a ten-year mandate for the CALS office with a firm multi-year financial budget and personnel allocation. The financial budget would equip the office with the ability to provide seed funding to pilot projects and to actively participate in those projects, so as to ensure that an enterprise standard and solution could be harvested from local investments. The personnel allocation would ensure that the assembled team would be kept together and would therefore be able to roll the lessons from one investment into the next. In total, the investments made in the policy foundation, and the associated budget and personnel allocations, established the conditions for success in a variety of ways. Most important was the fact that they enabled the CALS office to take a somewhat longer view than is usual in organizations and to evolve an enterprise standard and solution over time, in the context of real projects and with the input of operational stakeholders.

What is also notable, and highly unusual if the truth be told, is that the policy foundation put in place exhibited a high degree of pragmatism. Specifically, the policy directives did not fall into the all-too-familiar trap of declaring various wholesome actions as mandatory and then assuming reality would follow suit. The policy directives, in this case, essentially ensured that the discussions between the CALS office and the equipment program teams were undertaken, but they left the ultimate decision with the equipment team management that bore ultimate responsibility for supporting the field units. From the perspective of the CALS office, this approach forced its own team to be continuously relevant and practical when providing advice and support to the equipment program teams. In plain terms, it forced the CALS office to adopt a customer-service mentality.

KEY ACTIONS:
• Leverage the declared vision to formalize the initiative, ensuring that the organizational allocations, executive responsibilities and governing policies are updated accordingly.
• Align the initiative with key organizational priorities and objectives, thereby maximizing the visibility of the initiative and its latitude for introducing major changes.
• Establish a policy framework that accentuates how the responsibilities being created will remain operationally relevant, rather than being hidden amid administrative layers.
• Use the formalization of the initiative to establish a clear organizational position, authority framework and reporting line to an executive sponsor.
• Secure long-term personnel and financial budgets, so that the strategic goals of the initiative can be pursued in addition to short-term wins.

Step 5: Establish a Supplier Network

One of the key things that a central agency pursuing a specialized mandate can make available to collaborating stakeholders is expertise. In the case of the CALS initiative, a range of open content standards and technologies were being advocated about which the equipment program teams, along with their community of support contractors and technology suppliers, would know little. So it was deemed critical that the CALS office establish a network of suppliers, and associated contracting mechanisms, that would allow it to bring expert assistance to bear when required by equipment program teams.

This expert assistance would prove an important capability for the CALS office, as the provision of funding without coupling it with proper expertise was unlikely to produce satisfactory results. In truth, a couple of very early investments proved this point dramatically, as funding given to some equipment program teams for documentation modernization was quickly consumed by the support contractors and technology suppliers associated with those teams. As these contractors and suppliers were completely unfamiliar with the standards and technologies being applied, it was to be expected that their attempts would not proceed very far, even if that had been their intention. The results of these early and inadequately supported efforts are best left in obscurity.

Two main acquisition efforts were undertaken by the CALS office with a view to establishing the type of scalable supplier network that would be necessary as investments in an enterprise standard and solution, as well as specific program interventions, grew more ambitious. One acquisition was smaller and was intended to put in place a core support capability for the CALS office itself. This capability would provide such services as project management and overall solution architecture guidance, with this mechanism adding a couple of key resources to the core CALS office team. The second and much larger acquisition would establish a number of qualified suppliers providing specialist capabilities that could be engaged separately or in unison to support specific project goals. Quite intentionally, this second contracting mechanism would be used to maintain a level of competitive pressure on the suppliers, as each supplier could provide an alternative to the services of the others. Furthermore, resources from one supplier could be used to review and test deliverables provided by another. All suppliers were authorized to draw in resources from across the world, so the initiative could leverage lessons learned from anywhere. In the early years of the initiative, particularly strong use was made of this latter mechanism.

The strategy for engaging the suppliers to the CALS office could be described as "robust management", with the various suppliers being used to challenge and validate each other's work. Overseeing the process was the smaller supplier selected to provide project management and architectural guidance, together with the members of the core CALS office team. In general, the process put in place was highly effective, and this could be attributed to the general quality of the suppliers engaged and to the constructive benefits of the robust management model.

KEY ACTIONS:
• Work within the established procurement process to establish long-term contracting mechanisms that will make the requisite expertise and resources readily available.
• Incorporate into these contracting mechanisms procedures and performance criteria that encourage responsiveness and quality amongst the suppliers.
• Ensure that the established contracting mechanisms are structured to maintain a continuous level of competitive pressure on suppliers.
• Complete this activity as rapidly as possible, as it provides the initiative with an indispensable ability to respond effectively to stakeholder requirements and to realize practical progress against initiative goals.


Step 6: Network with the Stakeholder Community

Early in the implementation phase, an outreach program was initiated and sustained with the intention of flushing out and engaging the many different stakeholders who would be able to impact, or would be impacted by, the CALS initiative. This undertaking took a substantial amount of effort, but the leaders of the CALS office were determined to keep both friends and enemies as close as possible.

It turned out that this strategy was another example of an investment that paid solid dividends. By keeping close tabs on stakeholder organizations, the CALS office was continuously aware of the political landscape. The team knew, for example, when a supposed friend of the initiative was entertaining actions that would fracture the enterprise objectives by spawning competing standards. Similarly, the team was able, through continuous discussion, to break down the barriers that had made some organizations foes of the initiative. In these cases, the supposed enemies were in fact allies in disguise, and once the parties had reconciled their understandings of boundaries, goals and methods, they became close friends.

What is worth noting here is how the CALS office managed to sustain this type of interaction with the stakeholder community. Once again, the answer can be traced back to the solid budget and the capable team that the CALS office had at its disposal. Drawing on these resources, the team was able to convene a regular program of learning events designed to make available useful information harvested from the international CALS community and from the general technology marketplace. These learning events typically entailed bringing in representatives of pilot CALS projects from other countries and well-known specialists in the standards and technologies being leveraged. These events were seen by all stakeholders as an undiluted good and something that none of them would be able to facilitate in isolation. In addition to these events, and frequently grafted onto them, were information sharing activities where stakeholders exchanged ideas and issues amongst themselves and with the CALS office. Some of these events were convened internally within the department, but several were held as public gatherings intended to engage the equipment support contractors. By facilitating a community in which useful information was made freely available, the CALS office encouraged participation, and this kept lines of communication open.

Another measure taken to invigorate and sustain the links to the stakeholder community was the inculcation of a genuine customer-service mentality within the CALS office team. If a stakeholder contacted the team, it was deemed essential that the response be immediate and constructive. The team was chartered with the ability to invest time and money in helping stakeholders, and equipment program teams in particular, so that questions and problems could be resolved. Having financial resources and a significant level of expertise in addressing information management challenges, the CALS office was in a position to be extremely responsive, and experience showed that responsiveness went a long way towards building close working relationships with stakeholders.

KEY ACTIONS:
• Identify stakeholders, and continuously scan the environment for the appearance of new ones.
• Facilitate learning events with the goal of raising awareness, improving capabilities, communicating plans and building networks of contacts.
• Ensure that learning events provide genuine value to the participating stakeholders.
• Host information sharing sessions as a way to gather inputs from the stakeholder community.
• Facilitate cross-communication amongst stakeholders, in order to foster a sense of community and a willingness to share resources and coordinate activities.
• Keep the stakeholder community apprised of initiative progress, and specifically highlight successes that involve individual stakeholder organizations.
• Adopt a customer-service mentality within the initiative that places a high priority on being responsive to stakeholder requests for assistance.

Step 7: Frame a Concept of Operations

From the beginning of the CALS initiatives within the US Department of Defense in the mid-1980s, there had been an emphasis on establishing a Concept of Operations (sometimes termed a ConOps). This was a description of the business context in which an envisioned solution would operate and of how that solution would work. One positive attribute of the Concept of Operations was that it was not to be drowned in details. Instead, it was supposed to address the context, the requirements and the solution architecture at a level, and in a way, that could be used to engage the executive level. The document was to be a short answer to the question "what is all this for?"

The CALS office again achieved success in this area. Not only did it distill the problem, as we have seen, in a way that re-engaged a twice-bitten management, but it built up a practical depiction of the envisioned solution that made sense on a number of levels. Critically, the envisioned solution stressed the central role of the supplier community, both in realizing the solution and in participating in it. This was a central element of the larger CALS concept, but in this case the Canadian CALS office carried the concept forward to its logical conclusion. If the content underlying the documentation set is most naturally managed in conjunction with the equipment component it describes, then the content must be modeled, maintained and managed in a way that is fully aligned with the equipment systems and fully independent of the publication structures used to deliver the content to equipment operators and maintainers. Going further, the content structures should be optimized for the types of system and supplier combinations that were seen as essential to supporting future equipment systems. Whereas the US Department of Defense CALS initiative had invested heavily in establishing standards for hardcopy and electronic publications, the Canadian initiative would endeavor to establish standards based on the equipment information itself.

The resulting concept of operations was a combination of many ideas, and it was the combination that made the core message both understandable and attractive to senior management. The proposed initiative positioned itself as learning from and capitalizing upon the mistakes of the past, both at home and abroad. This lent the plan an air of prudence. The initiative was also consciously aligned with parallel business initiatives aimed at forming integrated equipment product teams that would establish a new, more efficient mode of collaboration between the military and its industrial suppliers. The plan was positioned as a logical extension of these larger business initiatives, and it was even declared that these larger business goals would not be realized unless new technologies and new standards were exploited for the digital exchange of intelligent content. Finally, the initiative was somewhat shameless in tying itself closely to trends in the marketplace, specifically those surrounding web technologies and electronic commerce, which were receiving high levels of media, and therefore management, attention.

The concept of operations framed by the CALS office was successful because it presented a core innovation, that of organizing and managing information in the context of the equipment it described, in terms that were clear, tangible and even a little sexy. This combination proved too much for executive management to resist.

KEY ACTIONS:
• Distill the envisioned solution, its rationale and its returns into a clear business proposition that will be simple to grasp for executives whose interests rightly align with operational concerns.
• Ensure that key themes are foregrounded, such as how past investments will be leveraged.
• Address head-on any concerns that executive management may have with respect to the initiative, such as past failures, and identify how they will be handled.
• Establish a strong connection between the proposed initiative and key strategic business objectives. This allows executive management to situate the initiative within the larger enterprise context.
• Emphasize the core innovation behind the initiative, and make it clear why this is both important and necessary.
• Align, wherever possible, the initiative with newsworthy trends, as this can make the initiative more saleable to executive management and their stakeholders.

Step 8: Establish a Governing Standard

Although each of the steps set out thus far was challenging in its own right, and the successes achieved in each stand as laudable, the real master stroke by the CALS office was in how it translated the concept into a technical solution that actually worked.

Translate the Vision into Something Useful

It is always remarkable how quickly and how completely management becomes mesmerized by technology, but there is some reason behind this attraction. When a business process is distilled into technology, it becomes scalable and manageable, something all managers know is difficult to achieve and sustain in a fluid world of people and politics. Within a technical implementation, management repeatedly sees the chimera of stability and predictability that they long for but never find. The kernel of truth in this management impulse lies in the fact that, when properly deployed, technology can very much expedite a process and free up human resources to address activities of a distinctly higher value. Translating a process, even a very good process, into a practical technical implementation that mediates that process is both a valuable undertaking and a hugely challenging step. Many projects falter on this very point, yet this is precisely where the CALS office succeeded.

Find an Innovative Solution

The CALS office marshaled its more technically inclined resources and undertook a series of investigations. Some surveyed the past experiences of the US Department of Defense CALS office and the challenges it was facing as a result of its initial investments in publication standards. Others looked at AECMA 1000D as a candidate standard, which at the time was deemed to be too exclusively focused on aircraft and to be pursuing a design path that would lead to its becoming an unwieldy artifact (events have since borne this out as largely true). Still others took a closer look at the technical lessons learned from the department's own failed investments. One team was dedicated to performing a comprehensive document analysis of the content and format specifications associated with the department's technical documentation assets.

From these investigations emerged the idea for an information model that placed at its center not a particular type of publication, but rather an equipment item about which various types of information were collected. As can be imagined, raging debates ensued about what this equipment item should be called; each of the stakeholder disciplines had its own ideas. Further discussions revolved around how specific the content model should be made. In this discussion, one camp advocated absolute specificity, with each equipment component being associated with its own content model element name. Tags such as <radar> or <actuator> could be seen as distinct possibilities, and most stakeholders agreed that this level of specificity would lead to unmanageable complexity. Almost instinctively, the team began seeking a natural level of abstraction for the content model, where the elements would be meaningful enough to facilitate authoring and management processes but not so meaningful as to become a semantic quagmire.

Define an Assembly Information Model

After much back and forth, and several false starts, agreement was reached amongst the team and the various stakeholders. The core element of the new standard would be an assembly. By definition, this term referred to a collection of items, and the items could be understood as both equipment components and information modules. What made the concept of an assembly interesting, from a content modeling perspective, was the fact that it was both the root element and a recursive structure. Assemblies, obviously, could include other assemblies. This decomposition could continue ad infinitum, and some declared that this extended down to every nut, bolt and line of software code used in a system. The decision was made, however, to add an important caveat. An assembly had to be associated with at least one rich content asset, such as an illustration or a service instruction, in order to be deemed sufficiently complex to qualify as an "assembly". Otherwise, the items were defined as "parts," about which data was maintained but no complex information objects. From a complexity management perspective, this was a vital distinction.

Within the content model, an assembly was associated with a metadata construct known as "assembly information" (assembly-info) and could be associated with descriptive, operational or service information, and it might contain, or contain references to, other assemblies or parts. Descriptive, operational and service information was then provided by a collection of information building blocks that represented the main information types encountered in complex technical documentation. Examples of information building blocks included maintenance tasks, diagrams, specifications, unusual conditions and so on. Interestingly, each of these information types was a specialization of a base information module, to which all content was returned during any publishing process that did not require access to the rich semantic details. Also interesting was the fact that publishing structures existed that would reference the information building blocks in whatever order necessary to support different information products. Finally, and most interesting of all, was the fact that the entire assembly content model contained fewer elements than the Hypertext Markup Language (HTML) Document Type Definition (DTD).
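
To make the shape of this model concrete, the following minimal sketch shows a hypothetical assembly instance together with a small routine that walks its recursive structure. The element names (assembly, assembly-info, part and so on) follow the terminology above but are illustrative only, not the actual markup declarations of the Canadian CALS standard:

import xml.etree.ElementTree as ET

# A hypothetical assembly instance: element names are illustrative,
# not the actual Canadian CALS markup declarations. Note that each
# assembly carries at least one rich content asset (a service task or
# a diagram), while parts carry only data.
SAMPLE = """\
<assembly id="antenna-group">
  <assembly-info>
    <name>Radar Antenna Group</name>
  </assembly-info>
  <service-info>
    <task>Inspect the drive motor for corrosion.</task>
  </service-info>
  <assembly id="antenna-drive">
    <assembly-info>
      <name>Antenna Drive</name>
    </assembly-info>
    <diagram href="antenna-drive.cgm"/>
    <part id="bolt-m8">M8 retaining bolt</part>
  </assembly>
</assembly>
"""

def walk(assembly, depth=0):
    # An assembly may contain parts and further assemblies, ad infinitum --
    # the recursive root structure described above.
    name = assembly.findtext("assembly-info/name", default="(unnamed)")
    print("  " * depth + "assembly: " + name)
    for part in assembly.findall("part"):
        print("  " * (depth + 1) + "part: " + part.text)
    for child in assembly.findall("assembly"):
        walk(child, depth + 1)

walk(ET.fromstring(SAMPLE))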

This assembly model became the kernel of a governing information management standard and the foundational information architecture that would be used to guide CALS implementation projects. A measure of the fit between the standard and the requirements is the fact that once cast, this model needed very few changes during several years of challenging implementations. It was found that the assembly model genuinely could be used to meaningfully reflect and organize the information associated with a system as complex as a missile frigate or as simple as personal protective clothing.


As enamored with the assembly model as the CALS team became, it was understood that validation would be needed. Accordingly, resources were allocated to engaging stakeholders in identifying and completing pilot projects where the assembly model could be tested against real-world data and could be used to support applications that would address real-world requirements. The success of the founders of the CALS office in securing adequate funding meant that the overtures for participation in pilot projects were accompanied with sufficient funding to make the requests something that any sane project manager could consider a reasonable proposition.

KEY ACTIONS:
• Invest in the development and validation of an elegant information architecture as the foundation upon which all solution investments will be based.
• Emphasize simplicity, adaptability and maintainability in the design of the information architecture, as these attributes will be directly inherited by all resulting solutions.
• Engage stakeholders in the documentation of requirements and deployment scenarios that are used to frame and then validate the information architecture.
• Harvest all available resources in the preparation of the information architecture and the refinement of the associated solution concept, with an emphasis being placed on re-using past investments and the lessons learned by similar initiatives.
• Establish the timeframe within which the information architecture and the resulting solutions will need to be sustained.
• Establish the range of potential deployments, so as to determine the degree of extensibility that the information architecture and the associated solutions will need to exhibit.
• Ensure that the information architecture is fully and completely independent from, and in no way bound by, technology constraints.

Figure 1: Canadian CALS Assembly Model

STEP 9: ENGAGE PILOT PROJECTS

Probably the most unusual aspect of this entire adventure was the extensive range of validation projects that were undertaken. Partly to reinvigorate the interest of equipment program teams and partly to season what came to be referred to as the assembly information architecture, pilot projects were launched in a bewildering array of venues. Specific effort was directed towards engaging program teams in each of the service areas (army, air force, navy and communications) and on different scales, from simple computer systems provided by a small Canadian supplier to massive NATO weapon acquisition programs involving hundreds of suppliers from around the world. Almost any suggestion that a certain type of equipment or information would be impossible to accommodate using the CALS assembly model was enough to spawn a new pilot project.

As already noted, what made this strategy sustainable was the fact that the CALS office had been so successful in establishing its mandate, a solid budget, a capable core team and a set of contracted suppliers. With these resources at its disposal, the CALS office was able to pursue all of these activities in parallel. Reflecting back on the period of the mid-1990s, the members of this CALS office would all likely agree that the volume of activity initiated was probably the most that could have possibly been managed at any one time, and that the struggle for the team was the assimilation of so many lessons in such a short period of time.

The merits of embarking on this frenetic pace of experimentation and validation became abundantly clear. Not only was the applicability of the assembly model to different scenarios and different requirements proven valid, but a great deal was also learned about what equipment program teams needed to be able to do with their information assets.

What also became very clear was that the technology products available for implementing integrated content management and publishing systems exhibited a wide range of strengths and weaknesses. The massively distributed nature of many equipment program team environments also punctured a common misconception about how technology resources should be deployed in these types of applications. Specifically, it was impossible to consider content management and publishing processes in any way except as being fully independent of any one content management repository or publishing platform. These pilot projects had no choice but to operate using a heterogeneous mixture of content management and publishing tools. This fact simply made the role of the assembly model, as a standard, all the more important.

A number of technology vendors worked hard to position themselves as the preferred platforms upon which the business of equipment documentation could be run. But the graphic example provided by the serious problems encountered by the US Department of Defense's Joint CALS (JCALS) infrastructure investment proved that mandating a single technology suite was a ludicrously impractical path to follow. The technology resources provided by all vendors would need to be utterly disposable, and the standard would need to ensure that every interface between technologies, suppliers and organizations was completely technology independent. The consequences of this recognition went beyond a simple endorsement of open standards, and extended into the semantic portability of the content and its physical amenability to being processed by a wide range of technologies. The assembly information model, as a physical instantiation of the CALS information architecture, came to provide the universal constant to which all implementations could resort in order to circumvent technical, operational or organizational barriers.

To ensure the unbounded interoperability of the assembly content, a litmus test was established. The litmus test dictated that all parts of the information lifecycle had to be something that a two-person business in the rural countryside would be able to complete, while having access to no more sophisticated technology than a 286 computer running DOS, a text editor and a public domain SGML parser. This was an admittedly harsh test, but it ensured that the standard did not imply a need to invest in expensive technologies. Larger organizations may opt to invest in more advanced tools in order to bolster their performance, efficiency or security, but at no point would the technology choices of one organization impact those of another. Each participant in a supplier network effectively agreed to be able to receive information in a specific way and to return information in a specific way, leaving each organization to choose its own tools and to conduct its business as it saw fit. It became clear that, among other things, an effective information management standard allowed all business units to optimize their operations independently, while at the same time coordinating their activities with partners to realize process-wide efficiencies.
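
A rough modern analog of that litmus test can be sketched in a few lines: confirm that a delivered information set parses with nothing more sophisticated than a generic, freely available parser. The file name and the root-element check below are hypothetical; the point is only that acceptance requires no vendor-specific tooling:

import xml.etree.ElementTree as ET

def delivery_is_acceptable(path):
    # Accept a delivery only if it parses with a generic parser and is
    # rooted in an assembly; no vendor-specific tooling is involved.
    try:
        root = ET.parse(path).getroot()
    except (OSError, ET.ParseError) as error:
        print("rejected:", error)
        return False
    return root.tag == "assembly"

print(delivery_is_acceptable("incoming-delivery.xml"))  # hypothetical file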

KEY ACTIONS:
• Initiate pilot projects in all business domains being addressed by the initiative.
• Limit the size and number of pilot projects to match the initiative's ability to effectively support them.
• Ensure pilot projects actually address practical business needs, in addition to advancing or validating the standard or aspects of the general solution.
• Harvest lessons learned from the pilot projects, and continuously improve the standard and solution components based upon these experiences.
• Share information gleaned from pilot projects with the stakeholder community.
• Leverage pilot projects as an opportunity to build capabilities within stakeholder communities and to develop a network of informed advocates.
• Manage pilot projects according to schedules that meet the needs of the participating organizations.
• Ensure that participating organizations achieve self-sufficiency, so that they can operate and maintain the implemented solutions.
• Emphasize the central importance of adhering to the spirit and letter of the standard and of resisting the ever-present temptation to subordinate the long-term viability of the solution to the limitations of specific technologies that will offer only temporary and localized benefits.

STEP 10: CONSTRUCT A REFERENCE IMPLEMENTATION

A somewhat natural outcome of the approach taken by the CALS office was the recognition that establishing a standard for the department, even a very good one, was not enough. What was needed was a long-term solution to the department's documentation problems. On one side, it was understood that thrusting a standard onto a struggling equipment program team was akin to delivering them a punch in the head when what they needed was a helping hand. On the other side, succumbing to the temptation to invest in a one-size-fits-all technology solution was not an option.

The answer to this conundrum arose out of the pilot projects. In addition to the validation of the model and the expanding knowledge of the requirements to be addressed, the various pilot investments offered another type of residual outcome. In the context of these pilot projects, actual technology components were being created and refined, and these started to cluster together into an emergent technology solution that was fundamentally different from those that had been tried in the past. What was emerging was a reference implementation that was itself grounded in open standards and that could be adapted to fit the technology infrastructure within various organizations. It became possible for the CALS office to identify the assembly information model as the departmental standard and then to offer a collection of technology components that could be used to accelerate project implementations. In fact, before long the quality of several of these components reached a point where the problem became one of managing the expectations of projects that came to assume the "shared components" were going to do everything needed and that it was someone else's responsibility to make sure that they did.

The position of the CALS office on this matter turned out to be the correct one, although it was one that required continuous reinforcement and defense. The assembly model was established as a formal standard and was provided to projects and their suppliers as a contractual requirement. The technology components that had been assembled into the "reference implementation" were then provided as "information" that could be consulted and used in order to assist with implementations, but they were provided "without warranty". The CALS office continued its investment in the reference implementation and even escalated it over time. The components were tightly integrated into the supporting documentation for the assembly model so that the reference implementation could be used to illustrate the intended use of the standard. It followed that the function of the reference implementation was primarily that of illustration and instruction. Its quality was a secondary attribute, and if projects could jump start their implementations by re-using the source code from the technology components, even better. But the responsibilities for these implementations remained with the equipment program teams, and these teams remained free to leverage existing or new technology assets in any way they saw fit.

The CALS office seemed to understand intuitively that it would be complete folly to confuse the reference implementation with the standard itself or, as some more modern standards have erred in doing, to constrain the standard based on the limitations of a specific technical implementation.

Although firmly positioning the growing library of technology assets as a reference implementation, the CALS office had to concede that the functionality provided by these components was so complete that in some ways it did constitute an enterprise shared solution. Interestingly, this solution had some of its own notable successes, not least of which was that it implemented the almost impossibly complex print publishing specifications for bilingual hardcopy manuals. Over a period of several years of consistent investment, this shared solution came to provide well over 80% of the publishing functionality that most equipment program teams would ever need.

KEY ACTIONS:
• Evolve technology components (leveraging open standards and public domain software to the greatest extent possible), and provide a reference implementation for the information management standard.
• Integrate the developed technology components with aspects of the supporting documentation and, in particular, with deployment scenarios illustrating how the standard can be used to address specific requirements.
• Leverage investments in the reference implementation in the validation of the standard and in the refinement of the implementation guidelines.
• Emphasize the development of components that offer value to all stakeholder implementations, including those that depend more heavily upon integrated commercial solutions (i.e., invest in assembly and validation processes that help stakeholders to deliver valid information sets).
• Clarify and maintain the distinction between the information management standard and the reference implementation.
• Allow the reference implementation to evolve, and, where the interest exists within the stakeholder community, facilitate the further integration of the components to form an enterprise shared solution.
• Establish the framework and process for facilitating the evolution and maintenance of the shared solution, while continuing to keep it and its management separate from the information management standard.

STEP 11: PROGRESSIVELY IMPROVE THE STANDARD

A significant investment continued to be made in the assembly model standard itself. These investments focused on making adjustments and additions to the standard in the light of the pilot projects. New trends in the larger marketplace also sparked investment, such as the increasing viability of leveraging browser-based interfaces for accessing complex technical documentation sets. Accordingly, a number of "delivery structures" and associated publishing processes were established that facilitated the transformation of equipment documentation into a form that could be accessed across the Web. Also, the appearance of an effort to tailor the use of SGML for web applications – later becoming the Extensible Markup Language (XML) recommendation from the World Wide Web Consortium (W3C) – was factored into the design of the assembly model. In truth, the orientation of the assembly model towards simplicity meant that it was already largely conformant with what would later be called the basic well-formedness constraints of XML.

As mentioned previously, one of the interesting indicators of the robustness of the assembly content model was the low number of changes that were technically necessary. Over a period of five years, between 1995 and 2000, only about 10 content model changes were made to the original models, with only one of these not being backward compatible with the original model version. This is astounding when compared to the volatile rate of change seen in recent releases of S1000D. It should also be noted that the low number of changes was not a function of resistance. In fact, while the team worked diligently to address identified requirements using the available models, it did not shy away from exploring options that involved adapting the original models. It simply turned out that the CALS assembly model provided an elegant enough approach to marking up technical documentation that there were very few cases where a change was advisable, let alone required. Other than these changes, there were a limited number of additions made to the content model, with these taking the form of introducing new delivery structures. This type of change was something that had been actively encouraged in the original standard.

The bulk of the improvements made to the standard were directed towards expanding and enhancing the support resources that allowed equipment program teams to implement the standard successfully. These materials included a comprehensive element dictionary, training modules, guidelines on adapting the standard to specific project requirements, instructions on recommended authoring and management techniques, and reference materials for addressing more complex information sharing scenarios. Also included in these materials was an extensive library of demonstration instances and test scenarios that could be used for both instruction and system validation. Related to the demonstration instances and test scenarios was the reference implementation that provided a repository of technology components. These components could be deployed in support of training scenarios or as the basis of production content management and publishing systems. These components were all grounded in documented open standards and leveraged public domain software, so as to provide a demonstration capability that did not rely upon commercial licenses. This combination of resources made the CALS assembly model among the best documented and supported standards in existence. While this represents an area where most information management projects cut corners, and where some skip it altogether, the CALS office wanted to make the standard as useful to equipment program teams as possible, and this meant that documentation, training materials and a reference implementation were essential.

KEY ACTIONS:
• Develop comprehensive documentation for the information management standard.
• Incorporate into the documentation a variety of deployment scenarios that illustrate how the standard can be used or adapted to meet different needs, and continuously expand this collection.
• Integrate technology components, provided as part of the reference implementation, with the documentation, and thereby allow would-be implementers to experiment with the standard, in addition to reading about it.
• Extend the documentation to include training materials for use in orienting equipment program teams and their suppliers and to facilitate sufficient knowledge transfer to provide implementation projects with the highest chances of success.
• Embrace the prospect of changes being made to the standard, and ensure that efforts to address requirements with the standard as it exists are done solely to leverage past investments and to coordinate implementations, and not to forestall changes that may be legitimately needed.

STEP 12: PUBLICIZE THE BENEFITS ACHIEVED

Many of the pilot projects had in fact blossomed into full-scale implementations, and this trend saw the CALS assembly model standard deployed on a large scale within a significant number of flagship equipment programs. An effort to survey and catalogue the benefits being achieved did yield some very impressive results. The original vision again proved to be valid, and the financial and operational benefits associated with re-thinking how equipment documentation should be managed were shown to be very real indeed.

The primary benefit of the shift towards a modularized and equipment-oriented documentation approach was the elimination of content redundancy. The level of content duplication across publication sets turned out to be truly shocking. For most equipment types, the percentage of redundant information when compared to the entire collection of publications approached 50%. In the case of particularly complex equipment systems, the percentage was closer to 70%. On large fleets of related equipment variants, as was often seen with land systems, the redundancy level could hit 85%. What was most shocking was that this duplicated information was actually being authored, managed, illustrated and translated separately. The department was actually paying to have the same work done not only twice but sometimes many times over.

The financial savings that were realized as this information redundancy was identified and eliminated were massive. Unlike many productivity figures that are posited in business cases, these savings were associated with documentation support contracts and therefore could actually be converted into real dollars saved. Beyond the strict numbers involved, significant as they were, the germane point was that changes to the documentation set could now be made quickly and universally. The quality of the information products used by equipment operators and maintainers was fundamentally improved. This constituted a tangible operational gain for the department and its people.

With these numbers in hand, there was a glorious period of time when the CALS office was able to intervene forcefully in equipment programs and to demonstrate how the equipment team would save precious time and money by taking the enterprise route rather than going it alone.

The author can recall one especially memorable meeting where a senior officer, a Colonel on an equipment program team who showed scant interest in leveraging any type of enterprise shared solution, used the tried and true technique of asking for a business case to justify his participation. The author had become so accustomed to performing these calculations and to presenting these arguments that on this occasion he opted for a different and more explosive approach. No, the business case for using the enterprise solution would not be prepared; instead he, the Colonel, would prepare a business case justifying how gross redundancy in his documentation, delays in update dissemination and the poor overall quality of his publications constituted a good idea, and how wasting departmental dollars, amid a time of economization, was a defensible course of action. To say that this meeting became animated would be an understatement. It did, however, become part of a mythology that spread through the documentation stakeholder community as a sign of how confident the CALS office was about the value proposition behind its standard and solution strategy.

KEY ACTIONS:
• Perform reviews of initiative implementations in order to establish the real effects of the standard and solution when deployed into operational environments.
• Use these reviews to collect actual savings figures, and compare these against what had been originally claimed as potential savings.
• Distill information about the savings and benefits being realized, and share the results with the stakeholder community.
• Update the documentation for the standard, to strengthen the details provided about the savings and benefits to be achieved.
• Adopt, if the numbers support it, a more aggressive posture towards challenging resistance to the enterprise initiative.

STEP 13: DEFEND THE INVESTMENT

As the Canadian CALS initiative entered the late 1990s, the emphasis shifted very clearly towards one of supporting the main implementation projects. So numerous had the pilot investments been that the department was literally filled with their production offspring. Monitoring these projects and providing proactive support as they sought to address new or changing requirements became a significant undertaking.

Plan for Budget Cuts

It is at this point in the story that dark clouds began to collect on the horizon. A number of factors came into play. One was that the CALS office was now within sight of the end of its mandate in 2002. Another was that the late 1990s was a period of extensive cost-cutting and reorganization within the Canadian military, and this bolstered the natural forces of organizational entropy. A third factor was an increased rate of change in the general technology marketplace, where XML and new generations of web applications were becoming increasingly attractive to equipment program teams and especially to their support contractors and technology suppliers.

Budget cuts to the department should have privileged the CALS assembly standard, which was already in broad use and was supported by a variety of resources that could bring down to negligible levels the investment needed to deliver high-quality digital documentation. In reality, the financial constraints being imposed on the department, when coupled with broad organizational changes, had the opposite effect. The community of equipment support contractors and technology suppliers undertook a full-scale assault on the standard and waged a war of alternatives at the level of the equipment program teams. Any option, no matter how proprietary or untested, was preferable, from their perspective, to working with a proven standard and a practical solution that left very little money to be made in developing custom software or selling third-party product licenses. Management turnover and organizational changes meant that new people were in charge of the key stakeholder organizations, and these people were easy prey for those who wanted to champion paths other than that of an enterprise solution.

At the same time, the funding levels of the CALS office came under pressure as it was argued, with some justification, that the team had achieved its goals and that the sole task remaining was to institutionalize the standard within the traditional mechanisms for managing changes to policies and standards. As a consequence, the CALS office budget for outreach, education, ongoing investment and project support was largely withdrawn by 2000. Furthermore, change had taken its toll on the CALS office itself, with a number of key personnel retiring and the key contracted advisor shifting to part-time involvement. The plan to recruit new personnel into the CALS office, so close to the completion of its mandate, became a suggestion that was actively resisted by management.

The willingness of departmental management to see the CALS standard as entering a period of passive support was a lethal change. Information management standards are not, it turns out, comparable to trucks that, once purchased, can be driven into the ground. While it is questionable whether this is even true for trucks, it can be declared that if an information management standard is classified as a static artifact, it is effectively dead.

Prepare to Engage Competitors

The internment of the CALS standard into a matrix process for standards maintenance proved more than theoretically fatal, as the owner of the key position in this process was none other than the departmental publications branch (the very organization that figured so prominently in the original documentation problem in the first place). Not surprisingly, it was at this time and in this period of organizational uncertainty that the publication bureau reasserted itself and embarked on an infrastructure investment project that would implement a final technology solution for departmental documentation management and publishing. Although a slim consolation, this new one-size-fits-all system, which sought among other things to hardwire documentation processes into the Enterprise Resource Planning (ERP) system, was no less of a disaster than its predecessors. The members of the CALS office who remained during these final years did work valiantly to defend the investments that had been made and to protect as many equipment program teams as possible from the generally specious proposals of self-interested and often ill-informed support contractors and technology vendors.

KEY LESSONS:
• Based on the feedback from initial implementations, the initiative team should re-evaluate its mandate and approach and potentially re-focus efforts onto areas with the greatest positive impact.
• Where initial implementations highlight challenges to be overcome, the initiative team should prepare strategies to circumvent, minimize or address those challenges.
• As soon as the results of initial implementations become known, and where initial results are positive, executive management should be re-engaged, and efforts should be directed towards refreshing the initiative's mandate.
• Leverage initial implementation successes to defend or even augment the budget resources available to the initiative, so that the problems presented by organizational changes can be effectively addressed.
• Information management standards should be recognized as services as opposed to policy artifacts.
• It should be acknowledged that if sufficient funding is not available to sustain the active evolution and promotion of an information management standard, then that standard is effectively dead.
• The effects of organizational entropy, through structural changes and staff turnover, demand that a constant level of investment be sustained.
• It must be recognized that there will be many who will actively seek to circumvent or overturn an enterprise information management standard, and that they will use almost any pretext to force the organization back into a state of making disorganized investments.
• Among the most effective weapons leveraged by the enemies of enterprise standards and coordinated investment is the appearance of new standards upon the horizon, with these being championed as the justification for jettisoning past investments in order to pursue the novel and untried.
• One-size-fits-all technology solutions, advanced as final investments that will bring in an era of predictability and order, remain as ineffective as ever, even if the successes surrounding an effective information management standard tempt people to believe otherwise.

STEP 14: PROVIDE AN EXAMPLE TO FUTURE INITIATIVES

It would be ideal for a best practice chapter to conclude on an entirely happy note, but that will not be the case here. Many things had been done well in the launch and conduct of the CALS office in the Department of National Defence. In fact, there were even a number of instances where simple good luck contributed to the momentum behind these initial successes. The information management initiative chronicled in this case study featured a solid balance of effectiveness on both the political and technical sides. Somehow the initiative was able to allay the deep misgivings of executive management and win substantive support for a long-term solution. Somehow the initiative was able to attract and retain a stellar group of people, both staff and suppliers, who could provide the skills, knowledge, enthusiasm and dedication to power through a daunting series of pilot implementations. Somehow the initiative was able to ferret out of the rubble of past failures a number of critical insights that led to a resoundingly effective technical solution. And somehow the initiative was able to discern that simplicity and flexibility were two critical elements needed in both its technical solution and its method of engaging equipment program teams. In many ways, the story of the ascent of the CALS office and the evolution of an enterprise information management standard and shared solution sounds almost too good to be true. For those who participated in those heady years, it actually does not feel entirely real in retrospect. As events later showed, there is a sense in which it really was too good to be true.

That the forces of organizational entropy eventually undid most, if not all, of what had been achieved can be taken as the most important lesson of all. It is hoped that fragments of the original contribution managed to live on in the knowledge applied in subsequent projects, but evidence to this effect is difficult to find. As an example of what can be achieved when all the right forces are aligned, the CALS initiative does still remain an exemplar for all future attempts to follow a similar path and to once again realize enterprise-wide benefits through a coordinated approach to information management. It can be declared, based on this graphic lesson, that enterprise information management initiatives in particular are the types of undertakings that must be actively pursued and supported along the lines of what the Canadian CALS office has demonstrated. It can also be declared that if any successes are achieved in the field of information management, they must be sustained and defended energetically, because there will be many whose motivation will not only be to resist those successes but to actively stamp them out. Information management is a service that must maintain a high level of innovation, outreach, education, promotion and even defense in order to remain vital and valid amid turbulent organizational times.

The leader of the CALS office, upon his retirement, distributed a memento to the core members of the CALS office team. It was a small ornament that featured the logo of the Canadian CALS initiative and a short commentary: "Unique Project – Special People." Events since that time have proven that it was a truer statement than he probably intended.

KEY LESSONS:
• After any significant period of harrowing effort, an initiative team will require fresh blood. The recruitment and orientation of new team members and new advocates within the stakeholder community should receive careful attention.
• After a period of time, an information management initiative will need refreshing. Executive management should be re-engaged, so that the mandate and strategies of the initiative can be kept in line with current realities.
• Outreach, education, communication and promotion grow in importance over time, while it is often assumed that once an initiative has made it over the initial implementation hurdles the initiative's benefits will sell themselves and investment in these engagement activities can be cut back.
• Formalization of an initiative and its being wed to the policy machinery of the organization alone will not protect an initiative from the onslaught of alternatives at the operational level.
• The persistence and desperation of those with a commercial interest in re-establishing disorder in the areas of technology investment should not be underestimated.
• It must be accepted that there will be many who stand to profit from the demise of an enterprise information management initiative, and chief among these will be technology vendors eager to deploy one-size-fits-all solutions and even more eager to see them fail.

Summary and Conclusion

This best practice covers a lot of ground, and it can likely be revisited on different occasions for new and different insights. Some of these would be taken from what went well and others from what did not turn out so well. By way of summary and reflection, highlights from this case can be catalogued under the criteria of key actions, success factors, challenges and lessons.

KEY ACTIONS:
• Catch the interest of executive management.
• Declare a compelling vision of what is needed.
• Assemble a core team with the right mix of knowledge, skills and credibility.
• Leverage official channels, without becoming complacent that they will be enough.
• Build working relationships with similar initiatives elsewhere and with competent suppliers.
• Engage stakeholders in a continuous exchange of information and ideas, and strengthen the relationships by providing useful support.
• Translate the vision into something practical: an enterprise solution that balances long-term interests with short-term effectiveness.
• Ground the enterprise solution in a sound information management standard that incorporates a tangible information model, a complete set of supporting documentation and an easily adapted reference implementation.
• Emphasize the role of open standards in the enterprise solution, and ensure that the implementation of these standards does nothing to limit their intended openness.
• Leverage pilot projects to address local requirements, validate the solution and grow the community of advocates.
• Publicize the savings and benefits realized by stakeholders through the pilot projects.
• Defend the initiative against those who would seek to overturn its progress and who have a vested interest in obstructing improvements in enterprise coordination.

KEY SUCCESS FACTORS:
• Effective champions who are able to engage executive management on their terms.
• Clear understanding of the real problem to be solved.
• A powerful business case mixing financially quantitative savings with compelling operational benefits.
• A long-term mandate that allows the initiative to operate strategically and to learn over a number of years.
• Customer-service orientation towards addressing stakeholder needs.
• Simplicity and flexibility as core attributes of the enterprise solution, underpinned by an elegant and extensible information model.
• Concrete financial savings and operational benefits realized and used to underscore the importance of a coordinated enterprise approach.

KEY CHALLENGES:
• The general tendency of business units to resist enterprise initiatives due to a fear of being encumbered with obligations and hampered in their abilities to perform effectively.
• Vested interests that seek to perpetuate outmoded business models and even to encourage increased disorganization, especially in approaches to technology investment.
• Organizational entropy that continuously changes the makeup of the initiative team, management players and stakeholders and erodes the network of essential alliances.
• The general tendency within technical implementations to spiral towards sophistication, which threatens to sidetrack information management initiatives by introducing costly technology dependencies and limiting the range of stakeholders and suppliers that can participate.
• The temptation to over-emphasize the importance of technical investments, for example in the reference implementation, and to allow one technical realization to influence, or worse limit, the information management standard and enterprise solution.

KEY LESSONS:
• An effective information management initiative must engage the enterprise at all levels with a vision that combines the rational appeal of real financial savings and a passionate goal of improving the operational effectiveness of the organization when performing its core functions.
• Information management standards should be completely independent of any one technology or organizational model, and accordingly these standards should facilitate the movement of content across all manner of technical and organizational boundaries.
• Information management must be understood as a service that is actively pursued, customer-focused and operationally relevant, and not as a musty collection of standards, policies and earnest declarations.
• The combined passion for change shared by a team of people who bring together a diverse range of backgrounds can achieve remarkable things in overcoming challenges and solving seemingly invincible problems.
• Sometimes the harvesting of lessons learned and the use of history as an instructive case study is all that can be salvaged once the forces of organizational entropy have eroded all that has been accomplished.



DRAZEN DJUKIC ([email protected]) is responsible for the content management strategy and development of Wienerberger's websites. He managed the roll-out of the company's websites and online product catalogs to more than 25 countries and participated in building up a corporate intranet. Drazen holds a Master's degree in Information and Communication Systems and Services from Austria's Technical University for Applied Sciences.

VOLKER GRÜNAUER ([email protected]) is responsible for the e-marketing strategy at Wienerberger, which includes 37 websites in 25 countries. Together with the IT department he managed the rollout of the CMS into new Wienerberger markets. Previously, Volker worked as an art director at elements.at, a web-design firm located in Salzburg, where he created interactive experiences for customers in different business sectors. Volker holds a Master's degree in E-Business from the University of Applied Sciences of Upper Austria.

Managing Product Information in an International and Decentralized Web Environment

Creating a Product Information Strategy for Your Corporate Website

Drazen Djukic Volker Grünauer



Abstract


Especially for production industries with decentralized product information, it is essential for your company to consider Product Information Management (PIM). Due to numerous acquisitions and different regional needs, Wienerberger's product range varies from country to country. This results in a decentralized organization and product structure. Nevertheless, Wienerberger implemented a centralized content management strategy to support regional market needs.

This best practice example focuses on a handful of ingredients necessary for a successful PIM project. These include system integration, media asset management, and organizational workflows. There is a saying in German that goes, "There's no all-in-one animal which provides you with milk, wool, eggs and meat". One of our key conclusions is to follow the strategy "start small and improve continuously".


Background


Founded in 1819 in Vienna and traded since 1869 on the Vienna Stock Exchange, Wienerberger is a company that looks back on a great tradition. Over the years, Wienerberger grew from a local Austrian brick-maker to the world's largest producer of brick. During this time, the number of production facilities rose from 11 to 240 plants in 26 countries. Due to many acquisitions and different regional needs, the product range varies from country to country. This results in a decentralized organization and product structure. Nevertheless, Wienerberger implemented a centralized content management strategy to support regional market needs.

THE SKILLS NEEDED TO PERFORM THIS BEST PRACTICE

To understand this best practice article, you should be familiar with the general concepts of Web Content Management (WCM).

Note that there are different parties involved in the process of implementing a Product Information Management (PIM) strategy. This article addresses the project management level of the process, but is worth reading for anyone in a job role dealing with PIM or WCM.

STEP 1: DEFINE YOUR GOALS

Can you afford outdated or even wrong data on your website? Nowadays, customers and other stakeholders expect accurate and timely online information. We know from research that inaccurate information on websites damages the brands behind them.

Let us think about a customer looking for a certain product—in our case a very special roof tile. In an ideal case, the customer finds exactly what he is looking for in your online product catalog. He has checked the reference images and the technical details, and they all seem to be the perfect match for him. He is so impressed that he instantly orders the product. Now imagine what happens if his search returns a completely different product because the image on the catalog page was wrongly associated. He may feel that your website—and your products—are not worth his time. Or imagine a user who is searching for the address of a nearby building. After driving for two hours, he recognizes that the address leads to a completely different building than the one he is looking for. He will probably be very disappointed. We are sure there are a lot of similar cases in your business; therefore one could say that giving no information at all is better than giving wrong information. This makes assuring up-to-date, complete, and correct information an important part of your Web Content Management (WCM) strategy.

Developing a vision and stating a strategy are the only ways to make sure that everyone works together in order to reach your company's goals; otherwise you risk having problems like those previously discussed. To get started, we suggest reading your WCM strategy and taking it as a basis for developing a Product Information Management (PIM) roadmap. In our case, we defined the following overall goals:

• Cost savings.
• Improved customer service.
• Reduction in complexity.

To reach these goals we:

• Simplified the editor interface to reduce content management efforts.
• Made use of existing data through integration and merging of data sources.
• Established data quality assurance using reporting.
• Defined workflows for better collaboration.
• Offered product configurators.
• Avoided redundant data.
• Defined meaningful product categories.
• Set up a search strategy (in combination with product categorization).

STEP 2: DETERMINE HOW DATA IS LINKED TO PRODUCTS

Before you start implementing PIM-related tools, it is essential to dive into the organization's tool inventory. Most organizations have more than one database or system where they store product-related data. In most cases, this architecture makes sense, as each database or system has different requirements and a different focus. For example, ERP systems (e.g., SAP) concentrate on production and logistics and are therefore quite technical, whereas WCM systems focus on transporting marketing information with multimedia content.

If you are setting up a new database or system for the same kind of information that is already held in an existing system, you run the risk of having to do data updates twice. This usually leads to degraded data quality. To avoid this, you need to assess these systems, figure out what kind of information is managed there, and determine what can be used for the online PIM system.

This kind of data inventory allows you to avoid redundancies and ensures that your product information is adequate. The following steps will get you started:

• Check the existing data sources. Check which databases and systems already exist in your organization and what kind of product-related information is stored there. Remember to ask who is maintaining the systems and how often they are checked to ensure that their data is up to date. Often, it is also necessary to check the format of the data to ensure that it is correct and complete. For example, there could be an image somewhere in the system, but without any metadata like title and description.
• Draw a rough product model. The next step is to draw product models—ideally one product from every product group—with all necessary data: technical attributes, descriptions, images, references and instruction manuals, for example. Then you determine which system inside your organization you can get this information from. The end result of this process is a product model that has data links to specific systems (a minimal sketch of such a model follows this list).
• Define the data links in detail. Take a closer look at the data and the system where it is stored. The following things should be taken into consideration:
- What language is the data in? Is the data language independent (a technical value, for example)? How many languages is it available in? Is it possible to manage translations of certain data in the specific system?
- Is the data structured or unstructured? For example, is the data text containing more than one value?
- Is the data regularly updated?
• Synchronize your product model. Check with the product management, marketing, and sales departments to ensure that you do not violate any product strategy. Some systems, like an ERP, contain a lot of confidential, internal data that does not fit with your marketing or product strategy.
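
As an illustration, a rough product model of the kind described above might be recorded as plainly as the following sketch. The attribute groups and source systems are hypothetical; what matters is that every piece of product data is explicitly linked to the system that owns it:

# A rough product model for one product group, recorded as plain data.
# Attribute groups and source systems are illustrative only.
roof_tile_model = {
    "technical_attributes": {"source": "ERP", "language_independent": True},
    "marketing_description": {"source": "WCM", "languages": ["de", "en", "nl"]},
    "product_images": {"source": "DAM", "required_metadata": ["title", "description"]},
    "instruction_manuals": {"source": "DAM", "structured": False},
}

for attribute, link in roof_tile_model.items():
    print(attribute, "-> maintained in", link["source"])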

STEP 3: SET UP ORGANIZATIONAL AND CONTENT MANAGEMENT WORKFLOWS

It is crucial for every project to set up a project organization that matches your company's organization chart. Your PIM project involves at least your Marketing and IT departments, but can also involve any other part of your organization that is linked to product information. Marketing and IT have the following responsibilities:
• Marketing - Responsible for topics like data linked to products, information structure and strategy.
• IT - Responsible for topics like information architecture, database management and technical implementation.

For the ongoing maintenance, you need to delegate responsibilities to different people within your organization. As mentioned earlier, data accuracy is an important success factor. Therefore, you must set up content management workflows that regularly update data. Keep these workflows practical; start by brainstorming all tasks which need to be done weekly, monthly and yearly. Then go through this list and link the tasks to departments and job roles. Do not use the names of employees in this chart; write down their job roles instead. This helps you to deal with employee turnover.

Reporting—discussed in Step 5—helps you to see problems in your project's organization. For example, if the colleague who is responsible for updating the product information for a specific product group leaves the company, he might not have forwarded this task to his successor. If you track the number of logins and the amount of data changed over several months, you could see a significant change in that report. Flagging such events helps you to see changes in your organization which you may not be aware of through the official channels. In large organizations, you simply may not be informed about organizational changes, even if they affect your project. Therefore we highly recommend creating such reports and checking them regularly. Of course, you should also ask everyone involved and their supervisors to inform you about upcoming changes.

You should also assess which parts of the information flow can be automated. If you, for example, know that you must enter every new product in your CM system and your ERP system, you can set up an event to trigger a reminder when anyone adds a new product to the system. There will be a lot of possible automations and, in general, people have great ideas for such process optimizations. To decide whether something can be automated, determine how often the trigger really happens, and how likely it is that your users will miss a key step in the process if they do not see a trigger. If the trigger occurs frequently and automation reduces the process time and saves costs, or if the trigger can easily be missed, you should automate it, as the sketch below illustrates.
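
A minimal sketch of such a reminder trigger follows. The event hook and the notify() helper are hypothetical; a real implementation would use whatever eventing or workflow mechanism your CMS provides:

def notify(job_role, message):
    # Hypothetical helper; in practice this would create a workflow task
    # or send an e-mail to whoever currently holds the job role.
    print("to " + job_role + ": " + message)

def on_product_created(product):
    # Hypothetical CMS event hook: remind the responsible job role (not a
    # named employee) to create the matching ERP record.
    notify("product data editor",
           "New product '" + product["name"] + "' was added to the CMS; "
           "please create the corresponding ERP entry.")

on_product_created({"name": "Roof tile, engobed"})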

When you have finished developing your workflows, create use-case based training for your employees. People learn more easily if they can apply what they are learning directly to their daily work, and they do not like admitting that they are not able to do something. Your training should consist of use cases that show your users how to complete specific common tasks, instead of showing them all the functions of the database. We did all of the training in-house, without the help of consultants, because we know our products and the needs of our customers better than a third-party vendor, and we think this know-how is essential for training.

STEP 4: INTEGRATE DATA SOURCES

You designed your product data structure in Step 2; you know where to get the information from and which formats will be delivered. In this step, we will concentrate on the technical implementation of that data structure.

In most cases, you will already have one or more web content management (CM) systems that you use for managing the company's websites, portals, and intranets. CM systems usually manage product content like descriptions, images, technical attributes, and so on. Integrating an ERP system to make use of certain data, or using a Digital Asset Management (DAM) system for storing product images and instruction manuals, eases the content management process by reducing your editors' efforts.

Consider the following items before implementing your data structure:

• Start small and improve continuously. Projects often fail because project teams start off with too large a scope. It is tempting to automate as much as possible, and to do everything in one project. This results in an undesirably long implementation time. Instead of implementing all of your data links at once, implement one, roll it out, test it, and improve it continuously.
• Choose your integration technology wisely. Modern CM systems offer a lot of different possibilities for data exchange and integration with other systems. The most common integration methods are Web Services. Web Services allow data exchange in a heterogeneous system landscape and reuse of services within an infrastructure. Web Services' open standards also save you a lot of money. You can also choose other technologies for integrating systems as long as they fit your IT strategy; the important point is that almost all systems in your enterprise (CMS, ERP, etc.) must support the technology that you choose.
• Develop an update strategy. After you have chosen an integration technology, determine how often data has to be updated. For example, you may import technical attributes of products from your ERP system to your CMS (a minimal import sketch follows this list). You need to figure out how often those technical attributes change in your ERP system, and how fast those changes have to be reflected online. It is also important to set up an organizational workflow for updates.
- In some cases, it may be sufficient that an administrator can update the data on request. In other cases, users need direct control of updates to ensure a fast update of their websites.
- Therefore, you must design an easy-to-use interface to make your system usable for CMS editors. When importing data from other systems, you must also verify data quality and integrity before publishing it to your customers. Therefore, you need to plan data quality reports and checks.
• Make use of rollout and training. Instead of having one big bang rollout, roll out your sites one by one and use your stakeholders' comments to make improvements. If you have multiple websites/applications, choose one for a test run. As you test the site, try to illustrate its benefits to your stakeholders—consistently up-to-date data, less effort in content management, quicker time to market, etc. Train your users to use the tools that you developed, and check with them to see if the tools are missing functionality. Improve the tools in a second run, and then present your changes to the people who are responsible for other websites.
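
As a sketch of the update strategy, the following shows a scheduled import of technical attributes from an ERP web service into the CMS. The export URL, the JSON payload shape and the update_product() helper are all hypothetical; a real integration would use whatever service interfaces your ERP and CMS actually expose, but the quality check before publishing applies regardless:

import json
from urllib.request import urlopen

# Hypothetical export endpoint; a real ERP exposes its own interface.
ERP_EXPORT_URL = "https://erp.example.com/products/export?changed_since=2010-11-01"

def update_product(product_id, attributes):
    # Hypothetical CMS call; replace with your CMS's import API.
    print("updating", product_id, attributes)

with urlopen(ERP_EXPORT_URL) as response:
    for record in json.load(response):
        # Verify data quality before publishing: skip incomplete records
        # and surface them in the data quality report (see Step 5).
        if not record.get("technical_attributes"):
            print("skipped", record["id"], "- missing technical attributes")
            continue
        update_product(record["id"], record["technical_attributes"])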

Step 5: Set Up Reporting and Controlling

Proper reporting and controlling procedures help you organize and promote your project.

Consider an online product configuration feature. Besides serving customers, the feature can be used to track which settings matter most to them. The product management team finds this information valuable, which leads to your reports being used for further product development. As you can see, reporting can and should be an important feature of your project, as it gives you direct feedback from your customers. By analyzing product statistics, for example, you can recognize valuable market trends.

Because our PIM is directly linked to our websites, we used standard web analytics software for tracking the access rate of the online product catalog.


For quality assurance, you should also implement some further KPIs that can be tracked directly in the database. We check the following figures (a duplicate-detection sketch follows the list):

• Duplicated content - Check whether some products occur twice or more in your databases.
• Accuracy - Check whether the technical details of your products are realistic.
• Update frequency - Check how often the data is updated.
• Login activity - Check how often your organization's users are online. If that figure changes dramatically, it could indicate that one of your users switched to a new job role and did not inform you. Take such signs as indications for new training.
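The duplicate check, for instance, is a single grouping query. The sketch below assumes a relational product database reachable through SQLite and a products table with a name column; both names are placeholders for whatever your schema really uses.

    import sqlite3

    def duplicate_product_report(db_path: str) -> list[tuple[str, int]]:
        """List product names that occur more than once, most frequent first."""
        with sqlite3.connect(db_path) as conn:
            return conn.execute(
                """
                SELECT name, COUNT(*) AS occurrences
                FROM products
                GROUP BY name
                HAVING COUNT(*) > 1
                ORDER BY occurrences DESC
                """
            ).fetchall()

The other figures, such as update frequency and login activity, can be produced the same way, as long as your database records timestamps for edits and logins.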

You also need to set up different reporting levels. Your management is interested in different figures than a content editor is; give every user group the figures they need, plus some extra figures. That motivates your users.

Step 6: Continue the Improvement Process

In the preceding step, we used web analytics to improve our business by utilizing the PIM database in a proper way. You should also consider improving the PIM project continuously; in fact, it is a process, not a finite project.

This step describes how your PIM team can also profit by using web statistics. As described in the first step, you need goals for the project. Write down these overall goals again and try to figure out which KPIs you can use to track your project’s success. For example:

• Evaluating cost savings.
- Track how much time is needed to enter new products.
- Implement one cost center for the whole project to see how the total costs for the project evolve.
• Improving customer service.
- In Step 2 you talked with your internal and external customers. This gives you good feedback for the project. Keep up these interviews, doing them at least once a year.
- Add quantitative questions to your interviews to make your improvements traceable. For example, ask users how happy they are with the PIM solution, based on a 10-point scale.
• Reducing complexity.
- Visualize the data model and count the redundancies you have after the first year. Monitor how the number of redundancies decreases over the years.

In general, these KPIs will only work if you keep them the same over the years. The evolution in the figures (the trends) is much more important than the exact numbers; the sketch below shows one way to track it.
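As a small illustration, year-over-year differences are enough to expose the trend in a KPI such as the redundancy count. The figures here are invented.

    def kpi_trend(yearly_values: dict[int, float]) -> list[tuple[int, float]]:
        """Year-over-year change for one KPI; negative numbers mean decline."""
        years = sorted(yearly_values)
        return [
            (year, yearly_values[year] - yearly_values[prev])
            for prev, year in zip(years, years[1:])
        ]

    # Hypothetical redundancy counts, taken once a year:
    print(kpi_trend({2008: 120, 2009: 95, 2010: 70}))
    # [(2009, -25), (2010, -25)] -> steadily decreasing, as intended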

After defining and measuring the KPIs for your project, you have to act on changes in the figures. If you notice an increase instead of a decrease in the number of redundancies, for example, you need to figure out why that happens. If some departments still publish independent websites with product data from independent sources, you have to ask them why they do this. These questions give you good feedback, and sometimes good ideas about how to improve your project.

It is also wise to implement a guideline, signed by management, which states that all new projects have to be built upon the centralized solution.


Summary and Conclusion

Implementing one centralized and non-redundant product database for all of your websites is an almost impossible task. In most cases, product information will be distributed across your system landscape (CMS, MAM, ERP, etc.), and it will need to be integrated rather than merged. Integrating different systems with your CMS and making use of existing data is beneficial when it comes to data quality and content management processes. The organizational part of PIM is also important: define and publish the responsibilities for data management, and automate workflows where it makes sense.

In the end, a well-implemented PIM solution enables better customer service through up-to-date and correct data, and less editorial effort through integration and automation. This is definitely worth a try!


PEtER HAAn ([email protected]) is Communications Manager Online Media for KPN and Getronics at KPN’s Corporate Communications. His main focus is on the application of new and online communication tools and techniques for both internal and external communications. Peter developed the concept for the social media platform at KPN and was responsible for the realization and implementation of this platform, including communications and governance. He has previously held various communications positions within the Ministry of Transport, Public Works and Water Management.

How Using Social Media Can Help Strengthen Your Organization

Enhancing Unity and Creativity With an Internal Social Media Platform

Peter Haan



Abstract


An internal social media platform enhances involvement and enables quicker exchange of expertise and practical knowledge. It enhances social cohesion within an organization that is becoming increasingly loose in structure in physical terms. For organizations that want to leverage the way today's employees, who use social media intensively in their private lives, work and interact, internal social media is practically mandatory.

It was precisely for these reasons that KPN launched "KPN1 Connect": an internally created social media platform for internal use. KPN1 Connect combines the best and most popular features of Twitter, LinkedIn, Facebook, Hyves, BlogSpot and Flickr. The profiles, for example, were inspired by LinkedIn, the short messages use the same 140-character length as tweets, and the group capabilities were derived from Facebook and Hyves.

This best practice describes the journey KPN made when implementing its internal social media platform.


Background


KPN is adopting the "new world of work"; this approach focuses on managing employees based on the results they achieve rather than simply on their presence – on expertise rather than on hierarchy. Using this approach means shrugging off the traditional office and being open to more flexible ways of working.

One of the challenges in changing from the traditional organization to the new world of work is declining social cohesion – not merely because of divergent working hours, but also because the practical knowledge exchange that once happened around the water cooler is slowly disappearing. KPN believes social media can help counter the negative effect of declining social cohesion. It can also enhance the positive aspects of the flexible, horizontal organization. After all, the focus of social media is on people as a whole and not merely as employees. This is most evident for the digital natives of Generation Y. On the Internet, these individuals tend to seek new forms of solidarity, involvement and purpose. In their private lives, they have already found them: social media are fully integrated into their lives.

An organization that uses social media internally is signing up for sweeping social changes. An organization that takes this step has a greater chance of achieving sustainable success than one that clings to the old ways of thinking. It was precisely for those reasons that KPN launched 'KPN1 Connect': an internally created social media platform for internal use. KPN1 Connect combines the best and most popular features of Twitter, LinkedIn, Facebook, Hyves, BlogSpot and Flickr. The profiles, for example, were inspired by LinkedIn, the short messages use the same 140-character length as tweets, and the group capabilities were derived from Facebook and Hyves.

The idea for KPN1 Connect was born early in 2009 at a conference on online communication, during a Kodak presentation about the external use of social media to connect with clients. This led KPN to consider also linking and connecting employees internally, and the idea of using social media at work soon came to life. After an initial presentation to the KPN board in the Netherlands in January 2009, the plan was elaborated into a proposal, which was approved in September 2009; the platform was launched in February 2010. All KPN employees had access, along with temporary employees who had access to the internal KPN network – a total of some 21,000 staff.

The Skills Needed to Perform This Best Practice

As our experiences at KPN have shown, building and expanding an internal social media platform requires people who understand the subject, who persist despite internal resistance, who are able to mobilize both management and staff, and who are able to interact with IT colleagues in terms of both content and diplomacy. From this we derive the following skills and characteristics that we believe an individual must possess to make an internal social media platform a success:

• Understanding social media: it is absolutely vital that the leader of the project understands the mechanisms of social media and has a feeling for the way in which undirected communication works.
• Knowing what the organization needs: it is imperative that the project leader is open to the needs of the organization and knows the course the organization wishes to follow in the years ahead.
• Building networks: right from the start, a major network must be set up within the organization. After all, such a project can hardly come into being along formal lines. Instead, the project leader must be willing to get right into the heart of the organization and be prepared to act as the project's ambassador.
• Convincing IT colleagues: most IT within an organization depends on control, while to a large extent an internal social media platform actually needs to be set free. So, not only does the project leader need to be able to talk to IT colleagues on their own level, he must also be able to get them to go along with a world that holds a much different view of IT management. Fundamentally different demands are placed on a social media platform than on financial software or process automation.
• Daring to experiment: the road to success differs in every organization. At least in the initial phases, therefore, the project leader must have the courage to explore a variety of possibilities.

Step 1: Get Commitment From the Organization's Executives

A new internal medium that turns the way of communicating with staff on its head the way this one does demands commitment at the very highest level. This is necessary not just in terms of finance, but also – and more importantly – in terms of strategy and organization. Although a return on investment could be calculated, for example because other tools could possibly be dispensed with at a later stage, the true ROI of an internal social media platform is less tangible. Various studies have, however, shown that motivation and company pride are essential ingredients for engendering greater employee resourcefulness and productivity.

The KPN board gave KPN1 Connect the green light, but not because there was any cost advantage to be achieved. It had, in fact, already been calculated that the project would pay for itself.

Still, that argument had a less significant impact with the board than the opportunity to enable staff to achieve their greatest personal potential. This was what gave the project team the right incentive to take the plan to the organization.

Step 2: Work Out the Plan With a Small Group and a Low Profile

Generally in large organizations, the challenge is to confine development to a small group, without giving the plan too much publicity. The reasoning here is that, if too many people are able to offer their opinions, the project will be sunk by compromises – a case of too many cooks spoiling the broth. Ideally, the group should be composed of employees, because they are the most important target group. The involvement of managers should also be kept to a minimum, and only serve to elicit the required commitment and support from internal service providers and information providers such as Legal, IT and HR. In the meantime, participants should be recruited from as many parts of the organization as possible. What is important here is for these people to have a serious affinity with social media.

At KPN, a small group of staff from the communications department organized a small collective around them to help give shape to the platform. During construction, more people became involved, including for testing. Part of the original group is now active in guiding the platform along the appropriate lines. Following the launch, a sounding group of active users was also created.

Step 3: Keep the System Simple and Accessible

Enterprise collaboration solutions often tend to be implemented with too many options, which makes them too complex. For the time being, the same also applies to linking existing social media, although the first steps are now being taken towards greater integration. Therefore, find a system that is relatively easy to implement and one that can be adapted quickly, easily and at low cost. Fast and cost-effective modifications in response to user wishes are particularly necessary to keep giving users the feeling that they own the platform. User-friendliness and easy access also play a major role. Be sure to pay considerable attention to these two aspects, because they represent precisely what is required for large-scale participation.

The total design and construction process for KPN1 Connect lasted a mere five months. This was only made possible because the choice had been made for a simple system. Simplicity and accessibility were also important to the target group, and this target group at KPN is an extremely broad one, ranging from management and knowledge workers in the offices to call center agents, technicians and shop staff throughout the country. KPN1 Connect is also specifically intended for these latter groups, perhaps even more so than for the relatively easily accessible headquarters staff. The platform now gives voice to what is happening among staff across the country. These are also the staff closest to the clients, and thus KPN's front-line sensors in society. Making their experiences and input heard is precisely what is most vital for the entire company's course.

Step 4: Launch the Platform Without a Content and Campaign Offensive, but Rather Let It Grow From the Roots Up

It's inadvisable to have the communication department fill the platform with content in advance. A platform overflowing with corporate content creates the impression that the initiative is being driven from above. Instead, recruit ambassadors and get them enthusiastic about the many benefits and attractive functions of the platform. A large-scale campaign offensive via traditional internal media similarly creates the wrong image: one of an extra medium that simply serves as a managerial mouthpiece. A better option is to spur interest among the employee population with brief messages that arouse their curiosity.

At KPN, a study was carried out about a month and a half before the launch to find out which employees were already active in social media. Fifty of these employees were paid a personal visit and were given the opportunity to participate in KPN1 Connect as testers, specifically to provide direction. In exchange, two weeks before launch, the participants made a contribution in the form of a completed profile, a short message, a photo or a blog. Each participant also had to ask ten of their colleagues to fill in their profiles as well. At the same time, groups of employees who had begun work-related groups outside KPN on Hyves were approached to start similar groups on KPN1 Connect. These initiatives were so successful that, as of two weeks before the launch, KPN had already received spontaneous applications from 100 extra ambassadors and 20 extra groups. A viral video that was distributed on the internal KPN network was then also extremely effective in attracting the KPN population to the new medium: 99 per cent of the intended target group visited the platform on its very first day. By contrast, little information about KPN1 Connect was included in the traditional internal media – either on the intranet or in the staff magazine.

Step 5: Make the Medium Complementary to Other Internal Media and Let the Media Change in Tandem

Use the platform to supplement existing internal media. Employees themselves, for example, can use social media to submit potential stories for these internal media. This offers the communications department a larger pond in which to fish for topics. In this way, the opportunity is created to foment a new kind of interaction between the traditional media and the new social media platform. This interaction could mean that internal mass media such as the intranet and the staff magazine will in turn become more 'social'.

KPN's internal communications have undergone a general shift. KPN1 Connect will assist the traditional media to change from top-down policy articles to stories that beat closer to the heart of everyone at KPN, which will enhance unity among all employees of the company.

Step 6: Involve Middle Management at a Later Stage

It is inadvisable to permit managers to participate heavily in the early stages. Avoiding excessive managerial involvement initially gives employees the chance to make the medium their own. However, management involvement is indeed desirable after several months to give the platform greater weight. First, ask the company's executives to take an active role. This can work as a stimulus and legitimization for middle and upper management who may, at this point, still have doubts about the medium.

After a few months, enquiries were received from KPN managers who were used to the more traditional methods of communicating with employees. Their initial attempts at blogging also fit within that pattern: completed stories offering no opportunity for responses or discussion. A round of mini-workshops for these managers – coming largely from (higher) middle management – offered them several tools to work more effectively with the platform. Several top managers also set a good example and gave middle management the legitimization to participate in KPN1 Connect as well. KPN's HR manager, for example, started a blog discussing the negotiations around the collective employment agreement (the Dutch CAO). In fact, the HR manager didn't write about the actual content of these sensitive discussions, but instead focused on the difficulties of the negotiation process. The response feature put him in direct contact with employees on the work floor, and in subsequent blogs he had the opportunity to consider questions, address concerns and dismiss rumors. Employees, for their part, were given the opportunity to approach their HR manager directly, whereas they might not normally be so willing to send him an e-mail.

Step 7: Allow Time for a Community to Develop and Allow Undirected Communication

First, allow a community to grow, simply put, by giving employees what they want. This is made possible with tight lifecycle management and a quick succession of steps. Then, permit undirected communication between staff. Experience has shown that employees tend to handle their freedom sensibly. Be sure to allow them to discuss topics that might be less agreeable to the company. Retaining credibility is essential for the platform's success.

Active lifecycle management is achieved at KPN through a small steering committee and a broader sounding group who regularly submit new ideas. When it comes to content, virtually no subject is taboo. A lively discussion group focuses on the iPhone, for example, even though it’s not in KPN’s catalogue. Naturally there will be rules against coarse or offensive language, but moderating is generally not required.

Step 8: Increase the Platform's Relevance and Visibility

Once communication and a group feeling have taken wing on the platform, it's time to give the platform its own identity and to make it relevant to larger groups of users. Helping the platform develop its own identity can be achieved, among other things, by identifying the platform as a separate medium in search requests. Giving the platform relevance is achieved through (behind-the-scenes) profiling of leaders and by activating strategic topics, for example by inviting experts to blog on a specific subject and providing publicity for this via the traditional media – specifically the intranet. In this case, it's important to invite only experts. Don't allow communications staff to blog on company strategy – that is management's domain. Also, bring together relevant employees if there are musings in a specific group about a new product or service, or if there are discussions about a problem. In such cases, only notify managers at a later stage, and allow employees to sort things out for themselves first.

At KPN, KPN1 Connect now contributes to the changing role of the communications department from a censor to a spider in the web. Of course, internal communication's role continues to be to keep employees well-informed, to encourage the right attitude among them and to make it clear what is expected of them. Now, this is not simply done by telling employees directly, as was customary with traditional media, but rather by enabling employees to make KPN1 Connect their own through social media. Now and then, the department can give a little push, such as by asking employees to contribute to a discussion group and by then pointing other employees towards the discussion. Direct and open communication between employees then helps the discussion to progress faster than an article in a staff magazine or a memo on the intranet. In such instances, traditional internal communication was often seen as more akin to meddling.

Step 9: Add Features to Enhance the Low Threshold and Discontinue Superfluous Existing Tools

The platform's low threshold can be further enhanced by adding features and integrating existing communications tools. Existing features that have become superfluous through the platform's arrival can be discontinued. This not only saves money, it also sends more employees to the platform.

KPN1 Connect will soon be introduced in a mobile version to offer traveling staff simple access to the platform. Integrating Twitter is also scheduled for the near future, among other things through the hashtag #kpn1. Notification facilities by e-mail are also being expanded considerably. The traditional 'facebook' has largely become superfluous through the arrival of the LinkedIn-inspired profiles on KPN1 Connect.

Step 10: Make Internal Communication Campaigns Entirely 'Social'

The final step in transforming internal communication into the exchange of ideas and knowledge on the work floor is to make internal communication campaigns entirely 'social'. Communication regarding major milestones for the organization is then carried out almost entirely via the platform, and can be aimed more easily at specific target groups than the internal mass media permit.

This step towards 'internal marketing 2.0' still lies in the future for KPN, but the initial thoughts are already taking shape. We are curious about the experiences of other organizations that have already taken exploratory steps in this direction.


Summary and Conclusion

An internal social media platform gives employees the space and the responsibility to develop their own interpretations of their functions. Like other basic facilities, such as a laptop or a mobile phone, this type of platform is one of the facilities that enable employees of Generation Y to carry out their work in a modern fashion, appropriate to an organization that no longer operates based on presence, but on output. An internal social media platform enhances their involvement and enables quicker exchange of expertise and practical knowledge. Above all, the tool matches the technologies that the new generation of employees is accustomed to in their private lives. It is practically mandatory for organizations that wish to leverage the way this new generation works and interacts.

The organization itself also benefits from this type of platform. It's much easier, for example, to measure the 'mood' of the organization and to discover what concerns the staff. Internal communication can occur with far more focus, based on what employees themselves have submitted in their profiles. An internal social media platform also enhances social cohesion within an organization that is becoming increasingly loose in structure in physical terms. This cohesion, which employees themselves create based on their knowledge and interests, in turn contributes to greater involvement and company pride among staff. Enthusiastic employees translate into loyal clients, which is an essential precondition for sustainable, profitable growth.

We can distinguish a clear series of success factors for the implementation of our internal social media platform at KPN:

• The technology and the costs were not the most important factor – the executive will to change and to trust our own staff was even more important.
• Just as vital was the principle of not wanting to do too much at once, but rather to give the platform time to grow organically and from the bottom up.
• The timing in involving the right people requires precision: here, too, patience is a virtue.
• Finally, the internal communications department acknowledged that it should play a different role, more behind the scenes; not that of a commander, but rather one of a liaison officer.

The results of the implementation have been very encouraging. In the first two weeks, KPN1 Connect was visited 296,000 times, 1,600 profiles were modified, 373 blogs were written, 243 groups were formed with 2,500 group members, 1,000 short messages were sent and 392 photos were uploaded. After four months, the monthly averages stood at half a million visits, 5.3 million pages viewed, 3,300 modified profiles, 1,100 blogs with 1,100 responses, 389 groups with 6,000 members, 2,650 short messages and 900 photos. 20% of employees participate actively, while 95 to 99% read along. After five months, the number of groups is growing less quickly, but the numbers of blogs, short messages and responses continue to increase. In short, the emphasis is shifting from sending to interaction.

We are curious about what's to come. We would be happy to enter into discussions with organizations that find themselves on the same road into the unknown, in order to further strengthen this best practice. We are particularly interested in the creation of 'internal communication 2.0'.


PHiL KEMELOR ([email protected]) is Vice President Strategic Analytics for Semphonic, and he leads the company's government and non-profit practice. He also serves as lead analyst for 'The Real Story Group Web Analytics Report', an annual evaluation of web analytics software vendors. Throughout Phil's 13-year career in web analytics he developed one of the world's first web analytics program offices for Bell Atlantic (currently Verizon), composed 'The Executive's Guide to Web Site Measurement and Testing' and served as a Principal Consultant for NetGenesis – a first-generation web analytics software company.

How to Build a Web Analytics Plan and Roadmap

Creating a Web Analytics Strategy for Your Organization

Phil Kemelor



Abstract


In years past, the practice of web analytics has focused on the constraints of the web analytics software required for data collection and report generation. However, the "people" part of the equation (i.e., management commitment, organizational definition and support, process and communications) required to support the software has more influence on whether web analytics is considered to be valuable.

In this best practice, we'll review common mistakes in web analytics management as well as how best to address and remedy them, so that one may come closer to having his organization use web analytics for strategic and tactical decision support.


Background


The full value of web analytics in many organizations has gone unrealized due to a lack of qualified staff, a lack of access to web analytics tools for business users, and the challenging nature of using web analytics tools to produce data that can clearly support decisions related to online initiatives. Many organizations focus only on the purchase of a web analytics solution as a commodity item. Management often holds the viewpoint that the web analytics function is based solely on the purchase of software, and so managers do not plan governance, process or staffing around the software purchase. This generally leads to underutilization of the tool. More importantly, the organization does not receive or analyze the data it needs to make decisions that guide the web channel.

Good analytics begins with a good organizational structure. The quality of the data, analysis and usage is driven by the strength of the support, process and governance put into place to scale the acceptance of data as a vital component in the organization's decision structure.

The Skills Needed to Perform This Best Practice

One who manages web analytics for an organization, or who is a web analyst involved in the day-to-day operations of a web analytics initiative, will need to understand the issues described in this chapter, and will be able to benefit from the best practices outlined.

More on roles and skills for web analytics is described in Step 2 of this Best Practice.

Step 1: Develop a Business Case for a Structured Web Analytics Program

The concept of web analytics no longer requires a business case, as it did three years ago. The introduction of free services like Google Analytics and Yahoo! Web Analytics both democratized and commoditized access to web analytics software. The need to advocate the importance of web analytics to the business no longer exists, since there is no need to pay for the software.

Still, the availability of free web analytics software has not necessarily led to an increased use of analytics data to make business decisions. Even while web analytics is available for free, many people still mistakenly believe that having the software is enough to provide answers. People tend to check off the box: 'Yes, we have web analytics; now let's move on.' However, this is faulty logic, because having a free analytics tool and one resource to crank out reports once a month does not make an analytics program.

The issues users face with using free web analytics data today are the same as they were when analytics software was available for a fee. Only now there are even more organizations that do not employ personnel with the knowledge of how the data should be used or the time to analyze the data and present recommendations on business decisions. Furthermore, there is a lack of trust in web analytics data as compared to other methods of web site measurement and testing, such as surveys, usability testing and market research.

With the advent of social media vehicles such as blogs, Twitter and Facebook, and the increased usage of audio and video, the challenge of tracking usage among all online properties grows. New vehicles for promoting web sites are available, but opportunities on the web properties of others are growing as well, through various partner and affiliate deals. And one must still think about the content that is being hosted on third-party platforms, such as commerce and subscription engines. So, while in the past the concept of analytics and tracking usage might have been limited to a single web site, the last few years have seen an explosion of new online media to track. How can one manage spend on content development, promotion, visitor acquisition and retention? If one has a web analytics solution implemented, but isn't using the data to drive the business, he will likely spend money in the wrong places and miss opportunities.

Some organizations are starting to see the value of comprehensive analytics that include web analytics, social media analytics, market research, usability, focus group testing and surveying.

These organizations understand that there must be a coordinated initiative across all disciplines, one that includes technology along with governance, process, organizational support, management commitment and ongoing communications, to make viewing the larger picture possible. This insight often results in the creation of a central program office to manage the effort.

Who drives the effort? In the best of all worlds, a senior manager with budget and vision funds the office. However, if one is not in the position to fund the effort with one’s own budget, she will need to create a business case to support the legitimacy of an analytics program office.

Let's be honest – this is not a trifling endeavor; rather, this task will cost one a significant amount of both time and money. It will likely involve shifting tasks and responsibilities out of silos, which will certainly create political issues. In many ways, the creation of such an office is at the core of reorganizing an organization's view of its entire online strategy. How can one make the case?

Make sure the investment pays off. Like any business case, one must tell a story that answers the following questions:

1. What is needed?
2. Why will it help the organization?
3. Who is involved, who will be impacted, and how?
4. What is required to operationalize the initiative?
5. What will it cost?
6. What are the impacts on current processes?
7. What are the benefits (in $ figures)?
8. What are the consequences and costs if the initiative is not funded?

As this is a major effort, it will help to get executive sponsorship. Getting managers to buy into the concept is the first step. One who cannot get her manager involved will be at a significant disadvantage.

It will also help to begin an alliance-building strategy. Socialize the idea with managers who sit in groups that may support the new office with resources. This is an important consensus-building phase, and one may get significant input on how to frame the business case so it is actually funded.

Figure 1: Common Business Case Rationale


Step 2: Develop a Web Analytics Strategic Plan

Where does the web analytics function sit in organizations today? IT? Marketing? Editorial? The web production team? If other methods are used in an organization (e.g., market research, focus groups, surveys, usability testing), where are they managed?

Organizations that succeed in implementing enterprise web site measurement and testing initiatives adhere to a six-point framework where the following items are addressed strategically and managed tactically:

• Commitment – Analytics succeeds when there is commitment from top to bottom and across the board.
• Communications – Analytics succeeds when results are presented so that stakeholders understand how to use the data.
• Organization – Analytics works well when there are sufficient and correct resources dedicated to administration, project management, analysis and technical support.
• Metrics – Useful metrics are those that answer business questions.
• Process – Successful web analytics initiatives are founded on a collection of processes that must be created and followed by stakeholders, managers and support teams.
• Solutions – Web analytics is driven by software. There is no one right or wrong tool. The key is matching the tool to organizational requirements.

Commitment

Web analytics initiatives succeed when each level of the organization understands the importance of using analytics to achieve business goals and also understands how to advance the analytics initiative through leadership and support. However, analytics does not exist in a vacuum, and senior management's commitment to the web as an integral part of the organization also plays an important role in how data drives decisions about the web. The way that senior management communicates its commitment to the offices and managers that will need to support analytics, and that will be expected to use analytics, is of critical importance. Management commitment must come with associated levels of accountability to ensure that initial support for analytics is more than an empty promise.

Management can set the tone for acceptance of analytics throughout the organization in a number of ways. Here are a few examples of how this can be done effectively:

• Establish a centralized web analytics program office and formally announce its existence to all staff. There must be a centralized staff office to drive web analytics throughout an organization. Management will need to establish and provide budget and staff for this office. Clear communication to the middle managers of departments that are supporting the web analytics office, and of those who are expected to use web analytics, will help secure acceptance of the web analytics office's initiatives.
• Tie division, office and staff performance goals to analytics results. Accountability is everything. Analytics is often not given sufficient respect because no connection is established between web site performance and business and performance goals. Management's commitment to analytics will indicate that analytics is considered to be an important yardstick. The measurement should include the accomplishment of performance goals, whether the goals are based on cost, fulfillment of mission or other objectives that are tied to measurable achievement. While this will not be an overnight process, it would be beneficial to review web site goals in context with performance to set targets, so that the success of new content, applications and marketing programs may be determined.


Organization

For web analytics to be successful in large organizations, clear definition of, and accountability for, roles and responsibilities is essential, because most available resources have many other tasks to perform, and many of these tasks are considered more critical than web analytics. As a result, web analytics may not receive the attention it requires in order to be used effectively. This creates an environment where web analytics is practiced only in an ad hoc manner.

Once the web analytics office is established, it requires support from other departments. As the explanations below demonstrate, even as the analytics program office may manage and conduct most analytics activities, key support is required.

• Web Analytics Program Office
- Lead and coordinate the process for rolling out web analytics across the organization;
- Manage the process for collecting analytics requirements and developing reports based on requirements;
- Manage access to the web analytics solution and the type of access assigned to users;
- Manage the deployment of data collection page tags into new content and applications following the initial implementation;
- Conduct ad hoc and special analytics projects for senior management;
- Provide report analysis to senior management;
- Determine organizational needs and provide analytics training to analytics users;
- Manage administration of, and the services contract for, the web analytics tool;
- Manage and coordinate the use of web analytics in conjunction with other measurement and testing disciplines, such as user experience testing and market research.
• Web Project Management Office. For new web applications that require page tagging for web analytics data collection, the Project Managers will:
- Coordinate the web analytics requirements and report development process between the Web Analytics Program Office and stakeholders requesting the new application development;
- Integrate web analytics report development, page tagging and the testing lifecycle into project plans and coordinate all activities to assure implementation of web analytics data collection.
• Software Engineering
- Implement page tag data collection on web sites and applications.
• Server Administration
- Encourage all sites on their servers to use web analytics and contact the Web Analytics Program Office to become engaged in the process of page tagging and report development.
• Web Site Governance
- Encourage all sites to use web analytics and contact the Web Analytics Program Office to become engaged in the process of report development.

The figure below outlines these department roles:

Figure 2: Web Analytics Organizational Support

The organizational roles outlined above are dependent on adequate staffing, especially within the analytics office. The analytics office should accommodate the following positions:

• Program Manager
- Manages the analytics team and technical support resources;
- Serves as primary interface between the analytics team and organization management;
- Develops research plans that include multiple web site measurement and testing methods;
- Maintains and manages vendor relationships;
- Manages analytics and research education and training;
- Promotes analytics and coordinates analytics initiatives throughout the organization.
• Web Analyst / Web Analytics Project Manager. These roles are charged with the tactical aspects of conducting web analysis tasks and activities.
- Develops metrics requirements with stakeholders;
- Determines the data required to develop metrics;
- Analyzes reports, calculates metrics, and provides strategic and tactical recommendations based on metrics analysis;
- Manages web site measurement projects, including the initial implementation and new initiatives;
- Conducts explorative web analytics projects;
- Provides training and educational assistance to analytics stakeholders;
- Ensures that new applications are developed to capture the web data required for analytics reports.
• Web Analytics Application Administrator. This resource is primarily responsible for providing technical support related to the web analytics application.
- Sets system permissions for those who will have access to reports;
- Schedules and publishes reports;
- Provides diagnostic skills and the first level of support on system problems; serves as point of contact for vendor support.

See the figure below to view the staff required to support the Web Analytics Program Office.

Figure 3: Web Analytics Staffing

Process

Having the right roles, responsibilities and resources assigned to web analytics is crucial to success, but they must work in coordination with each other. Creating processes around web analytics enables people to work together smoothly, especially to ensure timely data collection, report building and production. Typically, processes encompass the following activities:

Figure 4: Web Analytics Processes


In many organizations, web analytics is considered an afterthought in the web and content development process. To address this, there are basic actions that may be taken, such as institutionalizing web analytics into web requirements and development process. Formalizing the integration of web analytics into content and development cycles should be a priority for any web analytics manager.

This task can be made easier when cooperation from supporting departments is agreed upon (see the Organization section). To ensure that web analytics metrics and reports are produced in time for the launch of new applications, their development should be incorporated into and/or follow the Web Application Life Cycle, as illustrated below:

Figure 5: Web Analytics Development Life Cycle


Timelines associated with this process will need to be agreed upon between the Web Analytics Program, stakeholders and support teams. To do this, stakeholders, the Web Analytics Program and support teams need to provide and agree on baseline estimates for the components of work for which they are responsible so that there is enough time to develop web metrics and reports.

Metrics

Site measurement tools and services provide an overwhelming array of reports.

• Some may be useful because they allow one to answer business questions and get a sense of how the web site is performing on a tactical level.
• Some may be useful because the data will be used for calculating metrics that enable tactical and strategic decision making.
• Some may be useful because they will be the basis for exploring deeper analysis.
• Some may not be useful at all.

Without a central web analytics office, manager or point of contact, metrics tend to be developed on an ad hoc basis and with little standardization. This commonly creates an environment where the same numbers are interpreted differently or inconsistently. This leads to confusion among stakeholders and can be quite misleading.

When it comes to the development of metrics, there should be a standard approach that becomes the benchmark for the entire organization. Standardization can take the following routes:

• Use of a "management dashboard" report for presentation to senior management on a regularly scheduled basis.
• Development of a standard report set to be used by all stakeholder groups as a baseline for month-over-month or week-over-week trending.
• Developing success event models that are based on both commerce and non-commerce events.
• Developing a standard set of metrics to be used across the organization for comparative purposes. For example, establishing metrics to compare visitor value, monetization, navigation and usability, content and marketing. (A small sketch of such a shared metric set follows this list.)
• Establishing reporting segments for the entire organization, such as first vs. new visitors, visitors from particular geographic locations, or search engines and marketing sources, to obtain more specific insight into visitor behavior.
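A standard metric set can be as small as a function that every group runs over the same aggregated figures. In this Python sketch, the record fields and the two metrics (a conversion rate and visitor value) are illustrative stand-ins for whatever the organization agrees on.

    from dataclasses import dataclass

    @dataclass
    class VisitSummary:
        """One aggregated row of analytics data; field names are invented."""
        visits: int
        orders: int
        revenue: float

    def standard_metrics(row: VisitSummary) -> dict[str, float]:
        """The shared metric set, so every group calculates figures the same way."""
        return {
            "conversion_rate": row.orders / row.visits if row.visits else 0.0,
            "visitor_value": row.revenue / row.visits if row.visits else 0.0,
        }

    print(standard_metrics(VisitSummary(visits=10000, orders=250, revenue=12500.0)))
    # {'conversion_rate': 0.025, 'visitor_value': 1.25}

Because every group uses the same definitions, the resulting figures are comparable across sites and over time.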

Communications

Communications provide the glue that holds an analytics program together. All too often there are scenarios where reports are created and not interpreted; or they are interpreted, but the analysis is not shared. It is also common to see situations where requirements are not collected effectively, resulting in reports that are produced but provide little business value.

Communications is generally not pinpointed as an underlying reason for these scenarios. Often the tool is blamed for not performing as expected, or stakeholders are blamed for not taking the time to read reports. In general, reasons for lack of web analytics program effectiveness are seen as a series of ad hoc or unrelated scenarios, when in fact they are symptomatic of not having a cohesive web analytics communication program.

Communications may be improved through the development of documentation and the creation of more diligent feedback loops among stakeholder and support groups, as illustrated in the following examples:


Example 1: Collect and Document Accurate Web Analytics Business Requirements

Analytics requirements gathering is conducted sporadically; there is no standard documentation used by the groups, and no educational documentation is provided to prepare stakeholders for the requirements gathering process. All too often, web analytics requirements sessions begin with the question, "What reports do you want?" The focus needs to be on the business questions that must be answered to meet specific business objectives.

To ensure this type of discourse, a survey questionnaire should be developed that can be used as the basis for the development of web analytics requirements and meetings held with stakeholders to elicit and validate accurate requirements.

Example 2: Ensure Accurate Mapping of Business Requirements to Data Collection Variables

After requirements are gathered, they must be mapped to the correct data collection variables to ensure that all data is collected, and collected accurately. While it is true that a great deal of information is captured from base page tags, data from search, forms, exit links, video and audio applications also requires page-level attention. It is not unusual for these types of content and application elements to miss this attention upon initial solution implementation or when new applications are launched.

Functional specifications and technical design documents are used to define all of the elements of the page tag and custom variables. These documents validate with the stakeholders the data needed to meet report requirements, and spell out to the development team how the page tag will need to be designed to collect the data. A simple way to keep this mapping honest is sketched below.
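As a hedged illustration, the requirements-to-variables mapping can be written down as data and checked mechanically, so a gap surfaces before launch rather than in the first report. Every question and variable name here is invented.

    # Map each business question to the data collection variables the page
    # tag must populate. All names are hypothetical examples.
    REQUIRED_VARIABLES: dict[str, set[str]] = {
        "Which internal searches return no results?": {"search_term", "result_count"},
        "Where do visitors abandon the signup form?": {"form_name", "form_step"},
    }

    def unanswerable_questions(tag_variables: set[str]) -> dict[str, set[str]]:
        """Business questions the current tag cannot answer, with what is missing."""
        return {
            question: needed - tag_variables
            for question, needed in REQUIRED_VARIABLES.items()
            if needed - tag_variables
        }

    # A tag that only collects the search term cannot answer either question fully.
    print(unanswerable_questions({"search_term"}))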

Example 3: Develop a Strategy for Report Distribution and Analysis

Web analytics reports are usually made available to stakeholders through the web analytics solution's user interface, or distributed by email as Excel or PDF reports. But these reports include no interpretive analysis. Terms, trends and anomalies are not explained, no explanation is given of how the data was collected, and the issues affecting data quality and accuracy are not addressed. Because little is done to develop analytics requirements in context with business goals, interpretations are generally not provided in the context of business questions.

Analysis is focused on numbers of page views and visits, and the stakeholder needs to figure out what that means relative to the business questions they may have about the web site. In addition, there may be little or no explanation of some of the more arcane web analytics terms, such as "no referrers." This lack of explanation leads stakeholders to blame the solution for not providing the information they need, when it is really the responsibility of the person managing web analytics to provide context.

To address this situation, report distribution and analysis need to be addressed at the time of report requirements development. A key point for consideration is which format will best enable stakeholders to use web analytics to complete their jobs. If people are going to use the data to create offline metrics, then one should provide an Excel report that will enable them to do their own calculations; however, if people are only looking at the numbers to get a quick understanding of search engine keyword effectiveness, then an HTML report with that information should be emailed. Report distribution strategies may differ among groups within the organization. For example, it may make sense for some content managers to be given access to the web analytics tool so that they may create reports for others within their group on an ad hoc basis. Still, other content managers may only require reports on a monthly basis.

Interpreting web analytics reports and metrics is a challenging task, because stakeholders have different literacy levels when it comes to understanding how web analytics data relates to business goals. To address this, strategies must incorporate multiple methods of providing and communicating analysis of web analytics.

These methods include:

• One-on-one sessions between Web Analysts and stakeholders to explain the use of reports and metrics to answer business questions posed during the requirements process.
• Group sessions taught by the Web Analyst that explain how to use the recommended report sets.
• Annotated report sets with interpretive analysis observations provided by the Web Analyst or Power Users.
• A glossary of terms provided with reports and metrics.
• Monthly analysis written by the Web Analyst about top-level trends and their meanings, to be distributed throughout the organization, especially to management.
• Plain-language page titles provided in analytics reports instead of URLs; enable this programmatically, rather than having a resource do this manually (a small sketch follows this list).
• Presenting web analysis findings and recommendations in a formal setting to management or a web analytics governance board.
• Setting up an online web analytics knowledge center to provide a self-service environment in which stakeholders may educate themselves.
• Developing an analytics training and education curriculum.
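The plain-language-title item is straightforward to handle programmatically. One minimal approach, sketched below under assumed names: keep a lookup (exported, say, from the CMS) from URL path to a readable label, and fall back to the raw path so no row silently disappears from the report. The paths and titles are invented.

    # Hypothetical export from the CMS: URL path -> plain-language title.
    PAGE_TITLES = {
        "/products/widget-2000": "Widget 2000 product page",
        "/support/contact": "Contact support form",
    }

    def report_label(url_path: str) -> str:
        """Readable label for a report row, falling back to the raw path."""
        return PAGE_TITLES.get(url_path, url_path)

    print(report_label("/support/contact"))  # Contact support form
    print(report_label("/unmapped/page"))    # /unmapped/page (fallback)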

Solution

A web analytics software solution is at the heart of any web analytics program. What's the difference between a good tool and a bad one? Usually factors that have nothing to do with the solution itself, but that depend more on what it's required to do, how it's being supported, and by whom.

As part of the strategic planning process, these issues should be clearly defined and understood, so that mitigation tasks may be determined and initiated. These tasks may then be prioritized and carried out in phases over a defined period of time, such as 6-9 months.

More often than not, issues associated with web analytics solutions that have already been implemented are related to poor maintenance of the data collection method (page tags or log files), a lack of understanding among users about how to interpret data anomalies and trends that are presented in analytics reports, or a lack of definition on the specific data types that are feeding the analytics reports.

Example 4: Third-Party Sites

If one has a store, subscriber application or content hosted by a third-party vendor, she must find a way to capture the data from those servers. Upon request, the owners of the servers may provide access to the pages to apply the HTML part of a page tag. But it will be much trickier to secure permission and time to apply the base JavaScript code. Still, this issue must be resolved, both technically and within the business agreement, for complete data collection.

Example 5: Micro-Sites or Campaign Sites

When working with agencies that host micro-sites or campaign sites, how does one ensure uniformity in the data collection solution and methodologies? One may need to develop implementation standards that spell out the custom variables to use within the page tag, and also develop campaign and page naming conventions to ensure consistency in report labels as well as in the definition of the success events that are used for calculating conversion metrics. A naming-convention check is sketched below.
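Such naming conventions only hold up if they are enforced mechanically. Below is a hedged sketch: the convention itself (channel-campaign-yyyymm, lowercase) is an invented example, not a standard, so substitute whatever pattern your implementation standards define.

    import re

    # Hypothetical convention: channel-campaign-yyyymm, e.g. "email-springsale-201005".
    CAMPAIGN_PATTERN = re.compile(r"^(email|banner|search|social)-[a-z0-9]+-\d{6}$")

    def valid_campaign_code(code: str) -> bool:
        """True when a campaign code follows the agreed naming convention."""
        return bool(CAMPAIGN_PATTERN.fullmatch(code))

    assert valid_campaign_code("email-springsale-201005")
    assert not valid_campaign_code("Spring Sale!")  # would be rejected

Running such a check on every campaign code an agency submits keeps report labels consistent before the data ever reaches the reporting system.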

Step 3: Develop a Web Analytics Roadmap

Conducting the strategic planning process will provide an opportunity to assess the entire web analytics initiative. It may provide the groundwork for establishing a web analytics program office, or it may provide direction for how to formalize web analytics processes, organization, communications and metrics, as well as energize senior management's commitment to the importance of web analytics.

In either case, the recommendations that come out of a web analytics strategic plan should be put into a roadmap that lays out tasks, level-of-effort estimates, resource planning and a high-level timeline for when recommendations will be acted upon. The roadmap may also serve as an additional document for developing a business case for additional web analytics resources, and for scaling a smaller program into one that is built for enterprise-wide operation.

Summary and Conclusion

Organizations rarely get the full value out of investments in web analytics. This is true not because they don't invest in software, but because they do not invest in the people, processes and governance procedures that manage and maintain the investment.

Establishing a web analytics program office and staffing it adequately makes sense from a resource perspective. A strategic planning process is also necessary in order to identify and evaluate the strengths and weaknesses of the institution's commitment, processes, organizational support, communication maturity, web metrics and web analytics solution. Once these issues are identified, it is possible to develop a roadmap that prioritizes how to maximize the strengths and mitigate the weaknesses. More importantly, the plan and roadmap together enable one to scale web analytics throughout the organization, so that the data may be used by decision makers in the planning and deployment of all web site content, marketing and social media initiatives.


JIM LARKIN is founder and CEO of Resources Online, a Seattle-based content and production company established in 1995. As CEO, Larkin maintains strategic direction over all projects the company undertakes. Prior to founding the company, Jim worked as a technical writer and editor. In the 1980s, Jim worked on the research staff at the Institute for Research and Information in Scholarship, Brown University, which, among other activities, built and deployed an early forerunner of the World Wide Web.

BiLL BiRnEY is a technical writer, audio engineer, and audio and video producer. Before joining Resources Online, he spent five years on the Windows Media team at Microsoft Corporation, and has written, produced, and done post-production on numerous projects throughout his career. He currently leads video production at Resources Online.

Using Videos to Enhance a Software Product

Managing Media Creation to Meet Business Requirements

Jim Larkin, Bill Birney



Abstract


Clients often have difficulty articulating the relationship between including media or interactivity in their content and the business value of that media or interactivity. They may believe that adding media adds value, based on anecdotal evidence, or because of competitive pressure or executive fiat. However, clients typically do not have hard evidence to support a decision to invest in media creation or knowledge of what sort of media would be of most value to the organization and its customers. Their lack of evidence or knowledge about the most effective types of media for their customers presents a problem for consultants who create the media: If we do not know what clients really need, how will we know if we are successful?

We have found that one of the most valuable services we can provide is to help clients articulate what they need, what investments are worthwhile, how to respond to their customers and meet those customers' business needs, and, ultimately, how to determine if the effort is successful. Sometimes this service requires us to challenge our clients to produce substantive evidence about their customers' needs, pressure from competitors, or their assumptions about the benefits they hope to get from media projects. Even if no substantive evidence exists, and the client's answer about the driving force behind the project is "Because my VP says so, and it is successful if my VP likes it," it is still useful for a media producer to have that information explicitly. Ultimately, we align the stories a business wants to tell with a process that reflects its business goals.


Background


Many of the business decision makers (BDMs) we interact with know they want media-rich websites. Creating and using media is common today because of the ubiquity of devices for capturing pictures and video, devices for playing media-rich content—including mobile phones and netbooks—and high-speed bandwidth. Community-created media ranges from personal content for sharing among friends to sophisticated entertainment. This is the context within which business media exists, and it is differentiated from other sorts of media (such as that created for entertainment or education) primarily by the fact that business media serves some goal of the business that creates it. However, the relationship between media and the business goals that it serves is not always clear.

In this article, we are making a few assumptions:

• First, the media we create is related to some set of business goals. We are defining business goals as "the desired response or action of the audience that interacts with the media". This desired outcome may be an obvious business outcome, such as convincing the audience to buy something, or it may be more nuanced, such as helping someone change a perception, stop a harmful behavior, or learn how to do something (thereby avoiding an expensive call to tech support). All of these and more are "business goals". We are not talking about media created for entertainment or personal communication.
• Second, readers of this article are those who create media for clients (who have business goals), or who are themselves the clients or BDMs who hire media production teams.
• Third, business goals are not always explicitly expressed, but projects will be more successful if they are.
• Fourth, media is a communication method that competes with other communication methods. A software company that supplies information to customers to teach them how to use the company's products, for example, has choices and problems. The company may provide static information that is complete and searchable, but users still call technical support. So the problem is not managing content—the problem is understanding how to best serve a customer need. If the goal is to reduce customer support calls, can video content help? Is it more interesting, educational, and engaging, and if so, to how many people? And, ultimately, how well does it reduce customer support costs and increase customer satisfaction compared to other ways of providing equivalent content?

Successful video producers or filmmakers do more than convey information. Even in the digital age, they work in an old-fashioned, linear, and temporal medium that actively presents information in a form that is readily consumed by a viewer to achieve a particular end result. One of the most effective ways to present information is as a story. So periodically throughout this article, we mention storytelling and its use in helping maintain the relationship between the media and the business requirements of media creation projects.

An effective way of creating media to meet business requirements is to take advantage of the power of stories. Information is passive content. A story is a way to package the information to make it active. Why is a story so powerful? Everyone understands the story form, and it immediately resonates with us all. Understanding what the story is or should be is the challenge we face, and meeting business goals is the standard we use to measure the success of our stories.


To deliver the right story, you proceed through multiple steps that themselves have many sub-processes. At the highest level, these steps are to:

• Identify business requirements.
• Plan the production.
• Execute the plan.
• Measure success.

In this article we do not provide detailed how-to information for each of these steps; instead we focus on the approaches and considerations that keep the focus on business goals during media creation.

The Skills Needed to Perform This Best Practice
The primary roles involved in carrying out the steps described in this article are the media producer, the project manager or client liaison, and the BDM.

• Media producer: An experienced media producer has a track record of working with the other roles to produce media that fulfills the goals identified for the project.
• Project manager/client liaison: The project manager/client liaison has skills that include experience with an array of media production options, the ability to gather requirements, and the ability to understand the business goals. Depending on the intended use of the media, usability, training, and web analytics experience may be useful as well.
• Business decision maker: BDMs are the clients in this article. That is, BDMs buy media production services, and they are the business stakeholders who have implicit or explicit business goals that the media will address.

Step 1: Identify Business Requirements
Creating a good video starts with a good conversation—or a series of conversations. These conversations take place between the producer and the major stakeholders, and identify the requirements that the video must satisfy. The result of the requirements process is an answer to the question: How will we know if the media project is successful?

The notion of success for a media project is complex. The first requirements to surface from many clients might be based on specific objectives ("We want a video that demonstrates how to use a feature of our product or that communicates specific marketing messages."), or they may be strictly aesthetic ("We want a cool, cool video, period. Oh, that shows our product....").

Less frequently, stakeholders may articulate how the videos relate to business requirements (which we use interchangeably with "business goals"). If they do not, the producer should explicitly steer the discussion in this direction. As producers, we need to challenge our stakeholders; ironically, we have some of our most productive exploratory conversations by challenging our clients' assumptions that they need our services at all. We ask, "What do you hope to accomplish with this media project, and is this the most efficient way to achieve your goals?"

The following list of challenge questions will, at the least, force clients to think through the reasons for undertaking a media project. Ideally, it will produce a guiding set of requirements that will help you create the right solution. Not all questions apply to all projects (for example, not all projects include human actors).


• What do we want our intended audience to do with our media?
  - Not do something else, like call tech support;
  - Like us better, or change their opinions of our company or our products;
  - Change their behavior;
  - Buy something;
  - Learn something;
  - Take some other action that we can measure.
• What are the stories we need to tell to get our audience to do what we want them to do?
• Who will provide the stories? Who will storyboard? Who will approve the storyboards?
• Will the media be localized? If so, into how many languages?
• Will the script change over time, either during production or after the initial release of the media? Will the media need to be updated?
• If human actors are in the production, what characteristics do we need them to have (demographic as well as roles)?
• How professionally polished must the voicing and acting (if any) be?
• If live acting, how many scenes and locations will be needed?
• What do we know about our target audience, and what will engage them?
• How will we measure whether the media is successful against our business goals?
  - What is the baseline? What criteria (awareness, opinion, behavior, action) are applicable?
  - How do we measure change from the baseline?
• What's the scale of the production? Will we be dealing with tens, hundreds, or thousands of individual assets?
• What will happen to the media after we're done? Will it need to be updated and refreshed in the future?
• What is our budget?
• What choices does our budget require us to make about how much media; how, when, and where we create it; and who is involved? Is our budget adequate to reach our business goals, and does the anticipated value of the result justify the investment?

Step 2: Plan for Production
The answers to the questions in Step 1 will provide parameters for your production. Here is how:

Size of the production: If your production is sizable, it is critical to create an asset management strategy at the outset. This strategy can include:

• Testing all your processes to make sure they will work as you expect.
• Establishing and enforcing asset naming schemes before you start creating assets.
• If you are using a Digital Asset Management (DAM) system, outlining your process steps and ensuring that your solution will deliver.
• Listing each production piece and, for each piece, each process step that is required. Look for commonality across media pieces so that you can group similar work for efficiency and minimize the learning curve (a minimal batching sketch follows this list).
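As one way of reading the last point, here is a minimal sketch that batches media pieces by their next pending process step; the piece names, step names, and tracking structure are all invented for illustration:

from collections import defaultdict

# Hypothetical tracker: for each media piece, the process steps still pending,
# in the order they must happen.
pending = {
    "video-01": ["record-voiceover", "edit", "encode"],
    "video-02": ["edit", "encode"],
    "video-03": ["record-voiceover", "encode"],
}

# Group pieces by their next pending step so similar work can be batched.
batches = defaultdict(list)
for piece, steps in pending.items():
    batches[steps[0]].append(piece)

for step, pieces in sorted(batches.items()):
    print(step, "->", pieces)
# edit -> ['video-02']
# record-voiceover -> ['video-01', 'video-03']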

Localization requirements: If the media will be localized, how can you manage cost? Here are production ideas we have used to manage localization costs and minimize language-specific work:

• If a production is to be localized, consider avoiding the use of live actors to voice lines. We use a series of stills so that we can easily re-voice the pieces in additional languages, using native-speaking actors, without reshooting. If we do need to reshoot for cultural reasons, the original U.S. English version provides a template against which we can shoot stills of culturally appropriate models, add them into the production, and re-voice. For example, for a computer-based counseling tool for reducing human immunodeficiency virus/sexually transmitted disease (HIV/STD) risk, we use identical animated still sequences and drop in local audio for U.S. English and U.S. Spanish versions of the videos. For a version of the video to be used in Kenya, we will reshoot stills and drop them into the appropriate video slots, along with the local voiceover.
• Make text live and editable. On-screen text should be separated from images and overlaid using your editing tools, whether you are using Adobe Flash, Microsoft Silverlight, or layered Adobe Photoshop files. Keep text live as late in the production process as possible so that you do not have as many steps to carry out for each language. For Flash or Silverlight, store text content in XML files, so that localizing and updating text is a simple matter of editing the XML file rather than the executable (a minimal sketch follows this list).
• If your media contains images of applications or websites, consider mocking up the user interface using live text as well. It will be tedious, but if you are localizing into many languages, using live text may ultimately save time over reshooting images in many different languages. This approach may also help you if the UI text changes: you will not need to reshoot. If you are only localizing into a few languages, however, going to this extreme may not be worth the trouble.
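A minimal sketch of the XML approach described in the second bullet; the file layout and element names are our own illustration, not a Flash or Silverlight requirement:

import xml.etree.ElementTree as ET

# Hypothetical language file: one <string> element per item of on-screen text.
SPANISH = """<strings lang="es-US">
  <string id="intro.title">Bienvenido</string>
  <string id="intro.next">Continuar</string>
</strings>"""

def load_strings(xml_text):
    """Parse one language file into an id -> text lookup for the player to overlay."""
    root = ET.fromstring(xml_text)
    return {s.get("id"): s.text for s in root.findall("string")}

print(load_strings(SPANISH)["intro.title"])  # -> Bienvenido

Localizing then means producing one more such file per language; the production executable never changes.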

Plan for updates: If, in six months or a year, you need to revisit and update a large portion of the media you have created for a client, you can spend a huge amount of time unnecessarily. You can save significant time if you follow the practices for efficient production described here and you have fully documented the project with media lists, naming schemes, processes, and settings (including project files, of course). Store this documentation with the media assets and provide a complete set to your client (for safety, accountability, and because they paid for it, after all).

Know your audience: What do you know about your audience? Gather anything from profiles of media-type penetration and technical sophistication to the audience's attitude about the business goals of your media. For our technical audiences, for example, we know that we have to make sure that each media piece tells them what they want to know, so we typically tell the story newspaper-fashion. For our health care audiences, we know that literacy and state of knowledge vary widely and that younger audiences do not want judgmental lectures, so we use visually arresting beginnings that introduce a story before we get to the take-away lines.

Identify the desired impact: Again, what is the impact you want your media to have, and how will you know if that impact was achieved? Plans to track click-throughs, downloads, tech support call traffic, or even direct customer feedback on the usefulness of your media are all wisely made at the outset, not only because these methods may need to be instrumented in some way, but also because knowing how the media will be evaluated can do wonders for focusing the development effort on those goals.

Leverage the media type and ecosystem: You are presumably creating moving media, but how can the media use the context where it is presented to further promote the business goals? For example, if you are introducing a human persona in the media, which is streamed from your client's website, can you use the persona in other places on the site to lend familiarity and consistency to the user experience?

Step 3: Execute the Plan
Carrying out the plan—doing the actual production work—is not our focus in this step; instead, the focus is to identify what you can do during production to ensure that the work remains aligned to business goals. These points emphasize doing whatever you can to check the design assumptions against reality. Such activities may include the following:

Produce and test the production in as low-fidelity a version as possible, as early as possible. This point may seem counter-intuitive—why low fidelity, and just what does that mean? Low-fidelity testing may be anything from mockups, to storyboards, to a simple simulation of the intended functionality of the media. This testing can show you whether your production will be successful at a much earlier stage than you might imagine. In particular, you can start to see whether the stories that you are telling resonate with your intended audience. If they are not resonating, making large or small changes in your stories at this stage is vastly easier than making them later in the production process. Anecdotally (meaning that we do not have evidence, but believe based on experience), the crudeness of a low-fidelity prototype actually removes the distraction of a more polished experience, and forces your test audience to focus on the stories. A case in point: for a computer-based health-care product that Resources Online created, we made paper prototypes at an early stage. The prototypes were little more than sketches of the key interactions intended for our product. We printed them out, mounted them under a clear plastic sheet on a cardboard box—our "computer"—and did informal testing on the street to get naïve feedback on the product. What we learned directly affected the design of the product at a very early stage, before we built the UI or functionality of the product.

Test against the intended audience, not just the clients. Often, clients give direction based on little more than personal preference or gut feel. If you can, try to determine what really makes a difference to the intended audience, and focus on those sorts of changes. For example, is the language used in the production appropriate and credible for your audience, or is it merely politically acceptable for the business? (We encountered this tension when we created sexual health media funded by government entities, for example.)

Test usability iteratively. Testing against business goals should continue as the project proceeds. For example, you might proceed from the low-fidelity tests described in the preceding paragraphs to more formal tests of taxonomy, user interface design, and information logic. All of these tests can be done early in the project, when the cost of making changes is much lower.

Make testing value-neutral and relate it to business goals. Often, our clients ask for changes based on personal preference. Sometimes our job is to challenge these preferences. We try to do this by focusing on the testing and changes that will really affect business goals, in as specific and particular a way as possible. We are not always successful—sometimes personal preferences (or institutional limitations our clients must work within) work against this goal. But even in these cases, we can at least make explicit the reasons that drive particular decisions.

Step 4: Measure Success
How do you know if your media project is successful? If you have defined your requirements well, then your measures of success should fall in line with the requirements and business goals you established in Step 1. And, in fact, measuring success is an essential part of establishing each business goal. In other words, you cannot have a goal without some way of measuring how well your production has met that goal. As a media producer, if you and your clients have identified requirements and business goals, then you have an interest in how success metrics are defined and evaluated.


So, depending on the type of media and its purpose, you can use the following measures:

• Web metrics such as time on a site, click-throughs, and downloads may be key indicators. Establish baseline numbers and performance goals.
• Reductions in calls to support, customer satisfaction ratings, trial product downloads, and viral indicators such as YouTube hits can all serve as measures. If baselines are available, by all means use them.
• For perception-changing campaigns, devise and collect pre- and post-measures.
• If you are trying to reduce technical support calls, measure call rates, but also look at what the calls are about and whether, for example, callers viewed your video and whether it solved their problems.
• Do an A/B experiment—randomly provide the media to some customers and not to others, and measure the differences in perception or action (a worked sketch follows this list).
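A worked sketch of the A/B idea from the last bullet, with invented numbers; in practice you would also test whether the observed difference is statistically significant:

# Hypothetical A/B result: support-call rate among customers who were shown
# the video versus a control group that was not.
shown = {"customers": 500, "support_calls": 40}
control = {"customers": 500, "support_calls": 65}

rate_shown = shown["support_calls"] / shown["customers"]        # 0.08
rate_control = control["support_calls"] / control["customers"]  # 0.13

# Relative reduction against the control baseline.
reduction = (rate_control - rate_shown) / rate_control
print(f"call rate {rate_shown:.1%} vs {rate_control:.1%}; reduction {reduction:.0%}")
# -> call rate 8.0% vs 13.0%; reduction 38%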

Admittedly, these are general approaches, and it is difficult to be definitive in the abstract. The key point here is that you should look to each of your business goals and try to determine how you might definitively say whether or not (and to what extent) the business goal has been met.

Step 5: Manage for Massiveness – A Case Study
This "step" isn't a step in the same sense as the preceding steps, but rather an example illustrating how all the steps described in this article flow in a project, and how we have used this process to guide very large projects.

During the last few years at Resources Online, we have developed a media-rich computer-based counseling tool for HIV/STD risk reduction and generalized health screening and counseling. We call this the CARE platform. For the various versions of CARE (multiple language versions for use in the United States and internationally), we create and manage thousands of images, audio assets, and videos in multiple personas and languages. This project has helped us refine our processes over the years to meet the business requirements of this product (which we distribute free of licensing fees to government and non-profits). We describe it here to illustrate how the business requirements we derived were translated into processes and, ultimately, success measures.

The CARE product is typically used in a clinical setting by individuals seeking sexually transmitted disease (STD) testing, or by individuals who are HIV-positive and need support to avoid secondary risks and to comply with medication regimens. The product can assess a patient's state of knowledge, behaviors, and risks and provide meaningful, appropriate information to augment clinical care—or provide this interaction when it is otherwise not available. Every interaction in the protocol is spoken (making it accessible to lower-literacy patients), often by up to four different personas (the patient chooses an avatar). The product also includes more than 30 skills-building videos that cover a variety of issues related to risk reduction.

Table 1 describes the high-level requirements and success measures for CARE media:


Requirement – Scale for massiveness: The overriding requirement was to scale our efforts for massiveness: we needed to produce over 30 videos (more than 30 stories) along with over 5,000 audio clips.
Success measure: All our efforts had to scale to this effort, so the measure of success was also broad: finish the project on time and on budget, or at least reasonably near budget.

Requirement – Organize production efficiently: Stories had to be complete and ready to shoot all at the same time to minimize production time. "Ready" included usability testing with low-fidelity versions of the stories to get feedback from our intended audience.
Success measure: Ultimately, our budget for this requirement measured our success, but intermediate success measures included minimizing the required production hours (i.e. working efficiently with the cast and crew and minimizing the need for multiple days and retakes).

Requirement – Consistent project taxonomy: All images, video and audio clips, and other elements had to adhere to a naming scheme that we devised before we shot a single frame or recorded audio.
Success measure: Our success was measured by a negative: the absence of confusion, version control problems, and time spent locating arbitrarily named media.

Requirement – Real-time status: We ensured that elements were tracked from preproduction through production and post-production - a full-time job for a dedicated staff member.
Success measure: Our success was measured, again, by lack of confusion, rework, and lost media - or in other words, efficiency.

Requirement – Simple tools for the entire team: We needed simple, portable tools to assist us. We ended up relying heavily on Microsoft Office Excel to track the status of all media components and to create working scripts for voiceovers.
Success measure: We measured success by making sure that the entire team - including our medical consultants, talent, and production staff - could use the same familiar tools to work with the content. Our MD and PhD consultants, in particular, did not have time to learn new production tools, so we worked to provide content in forms that they could use.

Requirement – Ease of production: The first videos we created for these products were live action with actors portraying scenarios. This approach was expensive and not terribly scalable - changing a script meant a reshoot, and localization meant creating multiple productions. Over time, we developed a production model that was more appropriate to our scale: for each story, we shot hundreds of static digital photographs and then animated them in video. This meant that we could choose models (not actors) who were demographically appropriate for our story, and then have separate voice actors voice the audio. By separating audio and video, the producer could change scripts any time, and the videos were easy to localize. For example, the producer could rewrite an entire paragraph from the middle of a scene, and all the video editor had to do was rerecord the audio for the section and take out a few still images - far less expensive than bringing an entire cast and crew back to a location. The approach also made localization vastly easier - we just add a translated sound track, which can also be made culturally appropriate without having to remake our video.
Success measure: Reuse of source media for multiple languages and revised scripts.

Requirement – Ease of localization: Because our audio snippets were so granular, we could record efficiently in other languages using our Excel spreadsheet.
Success measure: Lack of confusion and a minimized need for unnecessary work and multiple redundant recording and production sessions. Clean media builds - any missing media components were easily discovered and fixed.

Requirement – Ease of management: Our process was to divide content into small segments to provide flexibility. The tradeoff for this granularity is complexity; because an audio clip could consist of anything from a single word to, at most, a few sentences, we had to manage the complexity of thousands of audio clips and then assemble them into our product builds without fail. A consistent naming scheme for the media helped us keep track of it and identify missing segments.
Success measure: Like other requirements, efficiency and minimized rework are indications of success.

Table 1: High-level requirements and success measures for CARE media


When we began this project, it was clear that a traditional approach to video production would not address our business requirements. So we didn’t begin with all our best practices completely formulated, but we did foresee many needs (such as project taxonomy and need for scale), and we adjusted and adapted others (such as our video production process). We ended up with a process that addressed our business requirements, and it’s a process that we use today.

The most difficult aspect of dramatic film production is capturing a performance, an event created by an actor that occurs linearly, over time. We solved this problem in a way that addressed our needs for granularity and flexibility, and that made it unnecessary to worry about a sustained performance. The images that conveyed the right feeling could be carefully chosen during editing. The audio of the characters’ dialog could be recorded later in a studio. An entire video could be created during post-production by choosing audio clips and stills and editing them together with the right pacing. Editors like to have a number of choices to work with, but with video, they may have to settle for pieces from two or three good takes. With hundreds of stills to work with, the editor could create any number of scenes.

The approach solved the problem of expensive actors, and allowed us to use amateur actors and HIV-positive individuals who were not really actors at all. We used many real locations, and simple lighting or just available light, so expensive crews were not required. And we did not have to worry about recording sound on location.

Above all, videos worked in this situation because telling a story in a compelling way was the highest-level requirement. Our success measure for this requirement has been the high level of user acceptance that the videos achieved in our product.



Summary and Conclusion
Managing a media project to meet requirements takes a disciplined approach that continually returns to the question of how well production choices address business goals, and ultimately whether the stories we create are of value to the business. We have focused primarily on the practical aspects of media production guided by the need to tell a story; a much bigger subject, which we merely touched upon, is how to create those stories. That creative work, where creative minds craft the story, is largely beyond our current scope. However, the creative process must take into account much of what we do discuss in this article, because the creative work of storytelling must also satisfy business goals to be successful. The goals we seek depend on effective storytelling, production values, and the ability to create a compelling experience that the audience will relate to and find credible. These sorts of requirements will always be a part of creating effective media. Managing the production to meet business requirements is how we effectively get the story from concept to our audience, and how we make media content address business goals.

Adopting a business requirements approach is essential when the scale of media production is massive. We argue that there is no way to understand, plan, execute, and evaluate a massive project without a "north star" set of business requirements that you establish at the outset. Or, to put it more succinctly: without business requirements, a massive media project risks producing chaos, waste, and uncertain value to the business on a massive scale. Of course, this approach is applicable to smaller media projects as well, but the scale of cost, investment, and the consequences of error in a massive project are truly daunting, and differentiate large projects from small. If you are running massive projects, you simply have to get it right; accounting for business requirements in the media production process is the way to accomplish that.


STEPHANIE LEMIEUX ([email protected]) is the Director of Search & Content at the Yellow Pages Group in Canada. She and her team are responsible for ensuring a relevant and intuitive search experience on www.yellowpages.ca, accomplished through many avenues such as search engine tuning, taxonomy optimization, SEO and content management. Prior to taking on this new role in 2010, Stephanie was the Taxonomy Practice Lead at Earley & Associates, where she worked with clients such as Best Buy, Motorola and AstraZeneca to improve their information organization through taxonomy and metadata best practices. She specialized in taxonomy design, faceted search, digital asset management, governance and content management across various technologies. Stephanie has a Master's degree in Library and Information Studies (MLIS) from McGill University with specializations in knowledge and content management, taxonomy and information architecture.

SETH EARLEY ([email protected]) has been implementing content management and knowledge management (KM) projects for over 14 years and has been in the technology field for 20+ years. He is founder of the Boston Knowledge Management Forum and co-author of Practical Knowledge Management from IBM Press. He is a former adjunct professor at Northeastern University, where he taught graduate courses in KM and E-Business Strategy. He currently chairs the metadata committee for the Academy of Motion Picture Arts and Sciences (AMPAS). Seth has developed search, content and knowledge strategies for global organizations and has developed taxonomies for a diverse roster of Fortune 1000 companies. He is a popular speaker and workshop leader at conferences throughout North America, speaking on intranet design, KM, content management systems, taxonomy development and other related topics.

CHARLIE GRAY has 24 years of industry experience in everything from robotic systems to sales and marketing. Charlie has a BSEET from DeVry and an MBA from Lake Forest Graduate School of Management. His career with Motorola spans over 15 years, and over the last 6 years Charlie has been focused on content management strategy and has successfully deployed a single content management system that is used across all business units within Motorola today for all their public websites. Currently, Charlie is managing the day-to-day operations of www.motorola.com in roughly 45 countries and 25 languages, along with the content management strategy for Motorola. Charlie's next major focus within this role is to drive the web towards a mobile experience that includes personal and relevant content for the user.


Integrating Taxonomy and Content Management

Dynamic Content Through Taxonomy

Stephanie Lemieux, Seth Earley, Charlie Gray



Abstract


To overhaul how content was organized, managed, and presented to customers, Motorola brought project stakeholders into the process early on so that they would understand the eventual project goal (including the need for consistent organizing principles), and share a vision for how content would be managed. Motorola brought in experienced taxonomy resources to help guide the process, and created detailed functional requirements around how taxonomy terms would be leveraged, both from a technical perspective and from a user experience perspective. The taxonomy was integrated with the Content Management System (CMS), as well as the user experience design. This involved determining how the taxonomy would be associated with content, be surfaced for the users, and support their specific tasks (browsing for products, comparing, purchasing, searching, filtering, etc.).

Finally, we needed to socialize the details of applying the taxonomy in tagging processes that included content creators and content managers. Motorola was successful in this endeavor and created a website that supports thousands of products with hundreds of attributes in dozens of countries and languages.


Background


In 2006, Motorola began an ambitious web content management project. Over the years, responsibility for customer-facing content had been distributed among multiple business units and geographic regions. Different groups created and managed web content as they saw fit, using disparate technologies and inconsistent structures. In some cases, tagging with metadata was nonexistent. In other cases, terms to describe and label information were used inconsistently. Creating and updating content was a manual, costly, and time-consuming process. Add to this mix the complexity of multiple languages and regional variations, and the result was an inconsistent, fragmented, and often unacceptable customer experience. Performing simple searches using term variations ("cell" versus "cellular" versus "mobile", for example) would yield surprisingly different results.

There was little consideration given to taxonomy, and no process for owning, managing, or applying taxonomy terms. Even if there had been a defined taxonomy, systems and tools were not able to make use of classification and categorization. Like all businesses, Motorola was under pressure to keep up with its competition by offering users a comparably dynamic, light-on-its-feet site in an industry where competition is fierce and the lifecycle of some products—cell phones, for example—is measured in months. Taxonomy was considered a foundational element in this plan.

The Skills Needed to Perform This Best Practice
To follow the steps in this best practice, you should have these roles available:

• Taxonomy subject matter expert (SME): Someone who really understands taxonomy best practices, both at the level of taxonomy principles and at the technical implementation of taxonomy. The taxonomy SME should also have content modeling experience and knowledge of database design principles.
• Content modeler: Someone who can develop content object models based on stated requirements. The taxonomy SME will collaborate with this role to integrate taxonomy into the content model.
• Content creators/managers: People who can provide requirements for the Content Management System (CMS) and front-end experience, as well as for taxonomy use and management.
• Open-minded IT analysts: People who are interested in discovering new ways of configuring IT systems to use advanced metadata and controlled vocabulary for dynamic content. They will be responsible for determining what level of taxonomy integration is possible and making it so.
• Taggers: People to undergo taxonomy training and apply the taxonomy as tags in the CMS.
• An enthusiastic sponsor: Someone who really understands the benefits of taxonomy and content management and is ready to evangelize these concepts to all levels of management.

Step 1: Educate Stakeholders on Taxonomy
Successfully integrating taxonomy with a CMS depends upon how well stakeholders and core team members understand and buy into the concepts of taxonomy and metadata. Making taxonomy a key element of CMS design and operation requires a core group of people who really "get it" and are willing to stand behind requirements and design recommendations. If team members do not understand the foundation and benefits of taxonomy, they will quickly dismiss CMS design changes made for taxonomy's sake, because those proposed changes imply custom development or additional effort.

You can start educating people about integrating a CMS with taxonomy well before you begin implementation. As you prepare to launch your project, consider holding an informal "lunch and learn", or incorporate a "taxonomy 101" session into the pre-project ramp-up activities. Such activities should continue throughout the life of the project, as new stakeholders join, or as you encounter challenges that require advanced knowledge.

During our project with Motorola, we recognized that a well-informed project team would help us meet our goals for integrating taxonomy. We began by holding conference calls open to any 'Motorolan' who wanted to know more about taxonomy and its uses in content management. These conference calls clearly showed uses of taxonomy that were easily understood within Motorola's internal context, including end-user benefits (enhanced site navigation, reduced content management effort) and related industry case studies (how other telecommunications firms were using taxonomy for content syndication). In meetings, we provided a brief education on taxonomy fundamentals whenever new participants came on to the project, or when stakeholders had to make certain decisions regarding taxonomy. This created a baseline of understanding to guide the direction of system design and implementation.

One of the crucial areas for stakeholder education at Motorola was helping marketers understand the difference between taxonomy for classification and taxonomy for navigation. On hearing about the taxonomy initiative, the first reaction from many marketers was concern: would the taxonomy take away their ability to define and change the site navigation to support marketing goals? It was important for us to emphasize that taxonomy would be used for tagging content with metadata (classification), and that this could potentially be used for dynamic navigation—where applicable and desirable—but that the main function of the taxonomy would be to create relationships between content and enhance search.

Step 2: Bring a Taxonomy Expert onto Your CMS Implementation Team
No matter how much education and evangelizing you do, it is unlikely that your project team will all become savvy taxonomists—nor are they expected to. A taxonomy subject matter expert (SME) will provide the necessary knowledge and advice during the design process. The SME will become the taxonomy advocate—or even lobbyist—ensuring that business goals for taxonomy integration are being met (or at least strived for). The taxonomy SME can be an internal employee or a contracted consultant, as long as they have a deep understanding of taxonomy and content management concepts.

During requirement and design sessions at Motorola, the taxonomy SME's role was not only to ensure the team always had the correct expertise at hand, but also to challenge them to use taxonomy to drive content and functionality dynamically wherever possible. This required staying on top of all design discussions, no matter how trivial or seemingly unrelated, and constantly questioning whether taxonomy could be leveraged for benefit.

Step 3: Determine Functional Requirements
With a team that understands taxonomy and a taxonomy SME in support, the next step is to tackle functional requirements for the CMS. A big part of integrating taxonomy with a CMS is figuring out how that taxonomy should be managed and leveraged. There are many resources available to guide you through the process of gathering requirements, such as Content Management Bible (1, page 1122) and Managing Enterprise Content (2, page 592).

Taxonomy must be considered in three areas:

• Functionality that may rely on taxonomy elements.
• The storage and management of taxonomy.
• The application of taxonomy to content in the CMS (tagging).

As you proceed to define requirements, you will encounter many that may leverage taxonomy. In this step, all you need to do is recognize these requirements and ensure that you have the right level of granularity to drive system design in later steps. For example, imagine a requirement to enable users to filter content on the website. The taxonomy SME should highlight this requirement as a potential area for implementation through taxonomy, and drive the team to provide as much detail as possible. How should users filter? On what attributes? This level of specificity helps to translate the requirements into detailed specifications for the content schema (content types, metadata elements, etc.) later in the project.

The storage and management of taxonomy is another key area. Notice that in the preceding section, we did not specify the storage and management of taxonomy "in the CMS". Unfortunately, most CMSs are not very good at taxonomy management. But instead of being constrained by what is possible in your CMS, spend some time outlining how you would ideally like taxonomy to be managed, regardless of whether it is done in the CMS or in an external tool. Taxonomy functionality you should consider includes the following (a minimal sketch of such a term record follows the list):

• The ability to change taxonomy terms or structures, and how these changes are propagated throughout the content.
• The ability to view the taxonomy.
• The ability to manage permissions.
• The storage of:
  - hierarchical relationships (broader and narrower terms),
  - synonyms (equivalent terms),
  - related terms,
  - other custom relationships,
  - translations (localized values).
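As a minimal sketch of what one taxonomy term record might hold to cover the storage needs above; the field names are our own, not from any particular taxonomy tool:

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TaxonomyTerm:
    """One node in the taxonomy, carrying the relationship types listed above."""
    term_id: str
    label: str
    broader: Optional[str] = None                 # parent term id (hierarchy)
    narrower: list = field(default_factory=list)  # child term ids
    synonyms: list = field(default_factory=list)  # equivalent terms
    related: list = field(default_factory=list)   # associative links
    translations: dict = field(default_factory=dict)  # locale -> localized label

phones = TaxonomyTerm(
    term_id="t100",
    label="Mobile Phones",
    synonyms=["Cell Phones", "Cellular Phones"],
    translations={"fr-CA": "Téléphones mobiles"},
)
print(phones.label, "- also known as:", ", ".join(phones.synonyms))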

By thinking purely about functionality, you will be able to see the gaps between how things should work and what is possible in your CMS. In a later step, you will define the level of effort required to achieve your ideal state of taxonomy management, and whether you will need a custom solution or third-party tools.

In the last category of requirements, you will outline how content creators will view and use the taxonomy to tag content. Again, while most CMSs have pre-set structures and interfaces for metadata tagging, these may not be optimal, so it is important to think purely about desired function.

At Motorola, we were highly concerned with user/taxonomy interaction because of the high volume of content and the amount of content entry that is outsourced (that is, not entered by subject matter experts internal to Motorola). Tagging interfaces had to be clear, easy to use, and allow for speedy content tagging. Thus, some of the Motorola requirements included the ability to tag multiple taxonomy values at a time (multi-select), the ability to browse the taxonomy tree, and the ability to search the taxonomy.

Step 4: Harmonize Requirements with Front-End Design
One of the biggest lessons learned during our project at Motorola was how important it is to make CMS development and front-end website redesign parallel efforts. Building a CMS back-end without knowing how the web design will look is like trying to build an auto assembly line without knowing what kind of vehicle you are making. You may be able to make some good educated guesses (you'll need four wheels and an engine), but you really need to know the design specifications to move forward productively (Is it a convertible? How many doors?). At Motorola, project delays led to a three-month lag between the beginning of the CMS design phase and the delivery of the initial wireframes for the website. In those first few months of joint application design, the team had to take their best shot at building a back-end system that could serve a multitude of potential front-end functionalities. When the wireframes were delivered, the CMS team had to identify what system functionalities might be missing and retrofit content structures to support the desired presentation, as well as decide how to integrate those changes into the project timeline.

In theory, a CMS should support your content structure and deliver that content no matter what the front-end design dictates. In reality, site design always brings in a few presentation elements and site behaviors that affect data structures and influence your system design. For example, if you know you need to filter products by color, you will build the data structure so that color is an attribute that can be easily searched—such as a taxonomy value—rather than make color a part of the product specifications text field. Ideally, your project timelines should allow you to review a first-draft wireframe before or during the CMS functional requirements development. This review will help you avoid mid-project design revisions and delays.

Step 5: Create an Integration Plan
With a complete set of functional requirements, you are ready to create your taxonomy integration strategy. The taxonomy SME will review the list of requirements and identify potential ways to leverage taxonomy. These form the basis of the integration plan and guide discussion during CMS design.

Although the sky is the limit when it comes to creative ways of using taxonomy, here are key areas where taxonomy can provide gains:

Dynamic Navigation
Taxonomy-driven dynamic navigation creates navigational options on the fly, based on the metadata tagged to the published content available on that portion of the site. It is an alternative to manually creating an access structure and publishing content to each browse option. For example, Motorola wanted products to be navigable by product category (Mobile Phones) as well as by model/family (Razr), both of which are taxonomy values tagged to content. We therefore planned to have the site dynamically derive some of that navigation from the metadata values available on the content.
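A minimal sketch of the mechanism, with invented content records: the browse menu for a facet is derived from whatever is actually tagged and published, rather than maintained by hand:

from collections import defaultdict

# Hypothetical published content items, each tagged with taxonomy values.
content = [
    {"title": "RAZR V3 page", "category": "Mobile Phones", "family": "Razr"},
    {"title": "RAZR2 page", "category": "Mobile Phones", "family": "Razr"},
    {"title": "H500 page", "category": "Bluetooth Headsets", "family": "H500"},
]

def navigation_options(items, facet):
    """Derive the browse options for one facet, with item counts."""
    counts = defaultdict(int)
    for item in items:
        counts[item[facet]] += 1
    return sorted(counts.items())

print(navigation_options(content, "category"))
# -> [('Bluetooth Headsets', 1), ('Mobile Phones', 2)]

When a product page is unpublished, its navigation entry disappears with it; no one edits a menu by hand.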

Content Relationships & Dynamic Display
Applying taxonomy tags can create relationships between content objects; you can use these relationships to drive dynamic content display. The relationships reduce the effort required to manage content; they minimize the number of direct manual relationships created between specific content items, and they harness metadata to increase content reuse. For example, a presentation template may have an area to show current news items related to the content. Instead of manually linking news items to each page, news items with a taxonomy value of X (where X matches a metadata value on the page where the news is to be shown) would dynamically be related. Some areas where dynamic relationships are helpful include:

• Product relationships (for example, product-accessory).
• Cross-sell/up-sell options.
• Related information/materials (documents, images, news, videos, etc.).

Content Filtering/Refinement
While faceted search engines are great for allowing users to progressively refine content, some organizations cannot invest in these tools at the same time as they implement a new CMS. This was the case at Motorola, but we still wanted to enable users to filter product lists and search results, so we opted to build faceted search functionality in the CMS using taxonomy. Metadata values are surfaced to the user and can be selected to limit the result set, including colors, form factors (the shape of the phone), and key features (e.g. camera).
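A minimal sketch of that filtering behavior, with invented product records; each selected facet value narrows the result set:

# Hypothetical product records tagged with taxonomy (facet) values.
products = [
    {"name": "Phone A", "color": "black", "form_factor": "flip", "features": {"camera"}},
    {"name": "Phone B", "color": "red", "form_factor": "candybar", "features": {"camera", "gps"}},
    {"name": "Phone C", "color": "black", "form_factor": "slider", "features": set()},
]

def matches(item, facet, value):
    """A set-valued facet matches by membership; a scalar facet by equality."""
    tagged = item[facet]
    return value in tagged if isinstance(tagged, set) else tagged == value

def refine(items, **selected):
    """Keep only items whose facet values match every selected filter."""
    return [i for i in items if all(matches(i, f, v) for f, v in selected.items())]

print([p["name"] for p in refine(products, color="black")])
# -> ['Phone A', 'Phone C']
print([p["name"] for p in refine(products, color="black", features="camera")])
# -> ['Phone A']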

Controlling Terminology
When you make certain data fields in the CMS dependent on taxonomy, you control their structure and enforce vocabulary. For content that requires strict editorial guidelines, taxonomy is a good way to ensure such control. At Motorola, product features and specifications always posed a problem for editorial consistency. On the site, product features were listed using different terms for the same feature. For example, one product referred to "Battery Type" and another to "Chemistry"; one listed "NiCd" as the value, the other "Nickel Cadmium." This inconsistent terminology created confusion for users and difficulties in comparing products. Taxonomy would be an excellent mechanism for controlling terminology and simplifying the user experience.

Personalization
With appropriately tagged content objects, you can harness taxonomy to create personalized content or interfaces. Personalization can be based on user profiles, whereby a user self-identifies their industry, age group, topics of interest, etc. from a variety of taxonomy options. Upon login, users can be presented with tagged content that matches their profile. Transactional data can also be used to infer user interests anonymously: what product types has this person purchased? What content have they accessed in the past? What are their preferred brands? When you have the metadata schema and tagging to support it, you can provide numerous personalization options. While personalization is still in future plans for Motorola, we ensured that the content structure would support this functionality by identifying personalization needs and attributes that would allow connecting users to content.

Content Feeds
Taxonomy categories—product types, subject categories, industries, etc.—can be used to drive Really Simple Syndication (RSS) content feeds. Users subscribe to a feed based on a taxonomy tag and get automatic updates when new content tagged with that term is added to the site. This is another future capability that Motorola wanted their CMS to support. They envision business users subscribing to industry-specific feeds that will notify them when new products are available for their market.
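A minimal sketch of generating such a feed from tagged content; the items and tag names are invented, and a production feed would carry more channel metadata:

import xml.etree.ElementTree as ET

# Hypothetical content items, each tagged with taxonomy terms.
items = [
    {"title": "New rugged handset", "link": "http://example.com/p1", "tags": {"Public Safety"}},
    {"title": "Spring accessories", "link": "http://example.com/p2", "tags": {"Consumer"}},
]

def build_feed(tag):
    """Build a minimal RSS 2.0 document for content tagged with one taxonomy term."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Updates: " + tag
    for item in items:
        if tag in item["tags"]:
            entry = ET.SubElement(channel, "item")
            ET.SubElement(entry, "title").text = item["title"]
            ET.SubElement(entry, "link").text = item["link"]
    return ET.tostring(rss, encoding="unicode")

print(build_feed("Public Safety"))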

Content Syndication
The ability to reuse content across channel partner sites was a key goal for Motorola, and one of the underlying justifications for making taxonomy a big part of the CMS design. Just as with the RSS feeds, taxonomy values are used to create collections of information that can be automatically sent to resellers, carriers, and other channel partners. These partners sign up for content syndication by selecting a combination of attributes (document types, product families or unique products, etc.) and get automatic updates based on their level of entitlement (also potentially taxonomy-based).

178 TIMAF Information Management Best Practices Vol. 1

Localization
Taxonomies are often used to manage lists of countries and languages in which content is available. You can use taxonomy to manage how country- and language-specific content is identified and presented. Motorola's products are available in over 60 countries and 37 languages, so localization is a big element of the taxonomy.

You can use the preceding sections to help guide your thinking about taxonomy integration requirements. But above all, be sure to think big in this step. Consider all the ways you will (or might want to) use taxonomy, because taxonomy will be the foundation for many interesting capabilities. Avoid limiting your thinking because an idea might be technically difficult to implement. Consider your goals instead: "What do we want to manage through taxonomy?" or "Where might we get benefits from taxonomy?", not "How will we implement this through taxonomy?" There will be plenty of time in later steps to hammer out implementation with IT and the rest of the team.

At Motorola, the time spent considering functional requirements and listing ways the taxonomy could solve problems or build capabilities fed directly into the development of a new taxonomy framework – a plan for the structure of the taxonomy. While Motorola was already using taxonomy heavily in one of its business units and content management systems, they wanted a taxonomy framework that would apply organization-wide. They wanted the taxonomy to represent Motorola’s structure across different functional areas, different tools (master data management, product life cycle management, customer relationship management, etc.), and different regions.

A taxonomy framework typically includes:

• Taxonomy facets to include (discrete top-level branches of the taxonomy, for example, industries, product categories and countries).
• Relationships between taxonomy facets (for example, "product lines" are available in one or more "countries"; "solutions" are geared towards one or more "industries").
• Existing sources of vocabulary.
• Potential uses for the taxonomy facets.

If you already have a taxonomy, this is a good time to see if it needs some updating. The requirements above may highlight new term areas that you have not been managing in the taxonomy.

Step 6: Integrate Taxonomy into the Content Model
Now that you have a solid understanding of the functionality the taxonomy must support, you have a foundation for integrating taxonomy into your content model. A content model identifies how content will be structured in your CMS, including:

• Content types (what categories of content you have).
• Elements (the major components of information of a content type).
• Element types and allowed values (rules for entering information into an element, for example, free text field, list of values, number, etc.) (2, page 1122).

Define Taxonomy-Controlled Fields in the Content Types
Content type definition—or modeling—is a major step in CMS design and the locus of taxonomy integration. A team of business analysts should be responsible for identifying content types and determining the elements that comprise them. In this process, the taxonomy SME's role is to map the content type definitions to the taxonomy framework and integration strategy developed in Step 5. This determines:

17910 Stephanie Lemieux, Seth Earley, Charlie Gray

1. What elements should be taxonomy-driven.
2. How those elements should be structured.
3. What additional taxonomy elements may be required in the content structure to support functionality.

To determine whether an element should be taxonomy-driven, consider the need for terminology control, consistent categorization, and dynamic content relationships. How the elements are structured depends on how intricately you want taxonomy tagging to be integrated into content editing, and what level of enforcement you need around certain taxonomy-driven fields. For example, you might have a single field named "Taxonomy" that presents the user with the entire taxonomy structure and values. This is a "free-tagging" approach, as you cannot really force any specific taxonomy selections. Alternatively, you can tightly integrate the taxonomy into the content type elements, driving some of the elements with taxonomy values and making them required fields.
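To make the contrast concrete, here is a minimal sketch in Python of how a content type definition might distinguish a tightly integrated, required taxonomy-driven element from a loose "free-tagging" field. The class and field names are illustrative assumptions, not Motorola's actual schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Element:
    name: str
    element_type: str                       # e.g. "text" or "taxonomy"
    required: bool = False
    taxonomy_branch: Optional[str] = None   # restrict choices to one branch

@dataclass
class ContentType:
    name: str
    elements: List[Element] = field(default_factory=list)

product_detail = ContentType("Product Detail", [
    Element("Content Item Name", "text", required=True),
    # Tightly integrated: required and driven by a single taxonomy branch.
    Element("Product Category", "taxonomy", required=True,
            taxonomy_branch="Product Categories"),
    # Loosely integrated "free-tagging": any value from the whole taxonomy.
    Element("Taxonomy", "taxonomy"),
])
```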

The Motorola content model consisted of 10 content types, including Product Master, Product Detail, Document, Promotion, etc. We carefully analyzed each content type against the taxonomy framework and the functional requirements to identify the best way to weave in taxonomy. Table 1 shows how we organized the elements for each content type:

As a best practice, you should err on the side of inclusion when deciding what taxonomy elements to include in which content type. You never know what future functionality you might want that will depend on the ability to tag and leverage a certain taxonomy value. At Motorola, we made sure to include a taxonomy element in any instance when it might eventually be relevant.

As mentioned previously, one of the main choices we had to make early on in the process was whether taxonomy should be a single element from which you could select, without any specific rules behind it. Given that Motorola was planning on using a lot of outsourcing for content entry and had robust plans for leveraging taxonomy, we decided that this would be too unstructured an approach, one that would quickly result in poor taxonomy tagging. We wanted to seamlessly integrate taxonomy into the content creation process, so we preferred to embed the taxonomy directly into key content elements in each content type. However, as you will see, this approach had some technical challenges.

Table 1: Product Detail Content Type Elements – Sample table showing how to document the structure of a content type

Name                  Description                                                      Type                       Required
Content Item Name     Name of the content object visible in the management console    Text field                 Y
Product Category      Type of product                                                  Taxonomy-controlled list   Y
Model Name            Specific model of product                                        Taxonomy-controlled list   N
Product Description   Marketing description of the product and main benefits          Text field                 N


Consider How Content Type Definitions Relate to Your Data Structure

Your content type definitions will eventually be translated into a system data structure (database tables and rows), so consider the following technical implementation issues when making decisions on content type design. One of these issues is the ability to easily add to or modify the taxonomy structure on a content type without causing serious problems down the road. Motorola's content management system did not allow for easy modification of content types once content had been entered. In fact, changing the content type structure (e.g. adding a metadata field) within Motorola's CMS required "dropping" all the content, making the change, then re-uploading the content. Having to add taxonomy fields post-design would not be a trivial exercise. So in Motorola's case, we decided to take a hybrid approach to taxonomy: critical taxonomy fields that would drive specific functions and were integral to the content type would be separate data fields, whereas other taxonomy values would be available for tagging in a general "taxonomy field." This provided the most flexibility for future needs and avoided costly system modifications.

Another issue to take into account is the system’s ability to “understand” and display hierarchy. Many content management systems do not have an innate understanding of hierarchical lists, and can only implement controlled content elements as flat lists. This means that whereas you might have had one element to describe a hierarchical taxonomy concept, you might need several to make up the different levels. For example, if you have a product type taxonomy that has three levels of hierarchy, you would need three elements to capture each level instead of showing the entire product type hierarchy in one single field.

Depending on your CMS’s options, the fields may be made into cascading lists, as shown in Figure 1. This means that the choice made in

the first element filters the available choices in the second, and so forth. So, if you choose the product category “Two-way Radios,” the sub-category field only presents you with types of two-way radios, and so on. This affects how you design your content type elements.
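A cascading list can be reduced to a simple lookup: the parent selection keys the set of child options. The sketch below assumes a nested dictionary in place of a real taxonomy store; the subcategory names are invented for illustration.

```python
# Illustrative taxonomy fragment; a real CMS would query its taxonomy store.
PRODUCT_TAXONOMY = {
    "Two-way Radios": ["Portable Radios", "Mobile Radios", "Repeaters"],
    "Bar Code Scanners": ["Handheld Scanners", "Fixed-mount Scanners"],
}

def subcategory_options(selected_category: str) -> list[str]:
    """Return only the subcategories under the chosen parent category."""
    return PRODUCT_TAXONOMY.get(selected_category, [])

print(subcategory_options("Two-way Radios"))
# ['Portable Radios', 'Mobile Radios', 'Repeaters']
```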

Consider How Taxonomy Affects the Content Creation Process

Having the model name element be taxonomy-driven had a number of challenges. It implied that all new products needed to be entered into the taxonomy and available for tagging before a new product content item could be created. We had to decide whether we could put a governance process in place far enough upstream to get the taxonomy value for new models into the system in time without causing a bottleneck in content creation. The sheer volume of values also caused us to pause, as managing the model names taxonomically would mean weekly changes to the taxonomy, and the need for an interface that would present taxonomy to the taggers in something other than a flat list. This challenge brings us to the following key decision point.

The benefits of control and consistency make it tempting to make a lot of elements taxonomy-driven. However, you need to consider how taxonomy control affects your content creation process. Motorola had a real problem with product naming conventions. Model names were spelled a number of different ways depending on what page you landed on, so there was a clear need to manage model names in the taxonomy. However, in Motorola's product launch process, some products may not have a model name until a few days before launch. Also, accessories are considered products as well, and Motorola sells over 3,000 accessories on its business-to-business site alone.

Figure 1: Hierarchical tagging vs. cascading lists



Step 7: Decide on a Technical Approach to Taxonomy

You have now hit a fork in the taxonomy integration road. Until now, you have been able to approach the problem of taxonomy from a theoretical and functional level. But now you are starting to deal in system specifics, and it is surely becoming clear that CMSs have many limitations when it comes to taxonomy. While most CMSs do have some form of native taxonomy function, it is usually quite primitive and ill-suited to support any sophisticated taxonomy implementation out of the box.

Each CMS is different, so you need to begin by getting a clear understanding of your CMS's native taxonomy capabilities and identifying any gaps between what it can do and what you want it to do.

Identify Shortcomings of CMS Taxonomy Functions

Lack of Ability to Manage Synonyms

Many CMSs only allow simple lists of terms as controlled vocabulary and do not include the ability to relate and manage synonyms (equivalent terms, for example, cell phones = mobile phones) with taxonomy terms. This causes several problems. One concerns content tagging: content creators may not always know what the correct taxonomy term is and may need some synonym support. Of course, this assumes that taggers can search the taxonomy, which is in itself a stretch for many content management systems. Having synonyms enabled for search inside the CMS may be optional or essential, depending on how important it is that business users find content items quickly within the CMS. Motorola's focus was on site search rather than the internal CMS search, so synonyms were considered optional there. However, having the synonyms available in the tagging interface was critical, given the wide range of subject knowledge held by users.

Poor Data Structure / Difficulty in Leveraging Hierarchy

How the taxonomy is stored affects how you can use it or what it can do for you. There are two main ways to store taxonomy:

• Strings: single values hold the taxonomy structure, for example, "Industries," "Industries: Transportation," "Industries: Transportation: Air Transportation".
• Database format: the taxonomy value is made up of multiple columns in a table (see Table 2).

At Motorola, the native taxonomy function in the CMS stored the taxonomy as strings rather than in database format, making it difficult to create relationships between terms, even basic hierarchy and synonyms. It would have been necessary to implement each level of the taxonomy as separate lists. This is a common issue across many CMSs. Applying taxonomy terms as strings typically makes it difficult to change taxonomy values over time. Should you need to change a taxonomy term (e.g. change the accepted color name from "red" to "scarlet"), you need to re-tag all content with the new term. If the taxonomy was tagged to content as an identifier (e.g. id=35678, which references a string in the database called "red"), you need simply change the name in the reference table rather than having to update all the content. An overly simplified data structure also limits your ability to import and export the taxonomy. Ideally, the taxonomy should be managed as unique values in a database, with additional columns to contain relationships.


Table 2: Sample database taxonomy table structure

ID   Taxonomy Name        Parent ID   Synonyms
22   Industries           –           –
44   Construction         22          –
45   Transportation       22          –
46   Air Transportation   45          Airlines, Airports
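As a rough sketch of why the database format matters, the following Python/SQLite snippet mirrors the structure of Table 2. The schema is an assumption for illustration only; the point is that content references terms by ID, so renaming a term is one update rather than a re-tagging exercise.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE taxonomy (
    id        INTEGER PRIMARY KEY,
    name      TEXT NOT NULL,
    parent_id INTEGER REFERENCES taxonomy(id),
    synonyms  TEXT)""")
db.executemany("INSERT INTO taxonomy VALUES (?, ?, ?, ?)", [
    (22, "Industries", None, None),
    (44, "Construction", 22, None),
    (45, "Transportation", 22, None),
    (46, "Air Transportation", 45, "Airlines, Airports"),
])

# Content items store id=46, not the string, so renaming the term
# is a single update and no content needs to be re-tagged.
db.execute("UPDATE taxonomy SET name = 'Air Transport' WHERE id = 46")
```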

Inability to Manage Translations & Other Relationships

A simplified taxonomy data structure generally implies that you cannot store and manage taxonomy translations or other information related to taxonomy terms. Given that Motorola needed to translate the taxonomy into over 30 languages, this was a serious problem to be solved.

Lack of Tagging & Taxonomy Management Options

If hierarchy is a problem, the ability to easily tag content will also be a challenge. You may be stuck with clunky cascading lists, or perhaps massive flat lists with implied hierarchy in the value names, or other equally unintuitive tagging options.

Of course, you could use a third party tool to achieve a more intuitive interface. There are many taxonomy management systems available, many of which have pre-built connectors into the popular CMSs. These tools allow you to extricate the administration of taxonomy from the CMS, freeing you from its data structure limitations. Taxonomy management tools also often have tagging modules that allow you to supplant the CMS's tagging interface with one from the taxonomy tool through an API. This allows you to use the specialized features included in such tools, such as taxonomy workflow, versioning, the ability to manage scope notes and definitions, etc. However, taxonomy management tools add cost and complexity that you must consider in your build vs. buy decision.

You may also opt to customize your CMS to fill in some of the functionality gaps. While this is definitely a viable option, customization can quickly spiral in cost, both in terms of development and future maintenance.

At the outset of the project, Motorola had a definite preference for using a third party tool and externally managing taxonomy. However, budget limitations precluded this option for the short term, so a workaround was required. We chose to customize the CMS to achieve some of our tagging interface and taxonomy management needs, perhaps not fully appreciating all that would be required to fulfill the very robust requirements.

Essentially, we ended up building a mini taxonomy management tool within the CMS as a stop-gap until Motorola could get a third party tool. This allowed us to implement the additional functionalities required, such as synonyms, translations, and other relationships. However, getting beyond the CMS's native data structure issues required that we implement the taxonomy as its own content type to be managed.

Here’s how we did it. Allowable values for list-

rather than have to update all the content.An overly simplified data structure also limits your ability to import and export the taxonomy.

Ideally, the taxonomy should be managed as unique values in a database, with additional columns to contain relationships.

18310 Stephanie Lemieux, Seth Earley, Charlie Gray

Here's how we did it. Allowable values for list-controlled elements are typically called "reference data". Reference data is held in a simple table and invoked in the form of a drop-down list when you need to make a selection. However, as mentioned, this does not provide the ability to easily create hierarchy, custom relationships, or store any additional information about taxonomy values. To circumvent this limitation, we created new content types to hold taxonomy data, which allowed us to customize the structure of that data to suit our needs. We ended up creating two taxonomy-related content types: one for regular taxonomy values and one for features (or product specifications), because the two had different data structures. Features are not simple term hierarchies; they are "triples": a combination of a feature name, a feature type (Boolean, text field, or list), and a feature value (depending on the type).
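A feature "triple" of this kind might be modeled as in the sketch below; the Python field names and the comparable/filterable flags (which surface again in Step 8) are illustrative assumptions about the structure, not the actual Motorola content type.

```python
from dataclasses import dataclass
from typing import List, Union

@dataclass
class Feature:
    name: str                            # e.g. "Memory"
    feature_type: str                    # "boolean", "text", or "list"
    value: Union[bool, str, List[str]]   # shape depends on feature_type
    comparable: bool = False             # eligible for side-by-side comparison
    filterable: bool = False             # eligible to drive content filters

memory = Feature("Memory", "list", ["512 MB", "1 GB"], comparable=True)
gps = Feature("Integrated GPS", "boolean", True, filterable=True)
```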

While this custom development addresses the major taxonomy requirements for the time being, it is not an ideal long-term solution. First, it is very time-consuming: creating and managing the thousands of terms in the Motorola taxonomy in the same system where hundreds of users are editing content on a daily basis makes for slow response times. It can take up to five minutes to add one term with five synonyms and translations. Second, there are many advanced functionalities that were not re-created in the CMS, such as taxonomy workflow, advanced permissions management, and the ability to manage additional information with the taxonomy such as definitions and change histories. This lack of functionality requires the taxonomy team to manage a separate spreadsheet in conjunction with the CMS to hold this additional information. Externally managed taxonomy is definitely a goal for Motorola, and we hope to make it indispensable across the organization, using it to manage taxonomy not only for web marketing, but also customer relationship management, master data management, and product management.

If we were to do it all over again, we would definitely place a higher priority on obtaining a taxonomy management tool and avoid the custom development route. So before you commit to using your CMS's native functions, determine what the cost of customizations will be to achieve an appropriate solution, and compare it to the cost of purchasing and implementing a third party tool.

Figure 2: The Taxonomy Content Type



Step 8: Use Collaborative Methods to Develop Taxonomy-Powered Functionality

With a clear mind towards requirements, content types, and technical approaches to taxonomy, you are ready to start building the application. One of the best things we did in the Motorola taxonomy project was to use a collaborative approach to designing the system, which in our case was Joint Application Development (JAD). JAD is a participatory approach that allows business users and other stakeholders to work with developers in system development. With JAD, the team jointly reviews requirements to clarify needs for developers and participates in detailed design discussions that help business users understand technical constraints.

At Motorola, collaboration was essential in making taxonomy implementation successful. Because the taxonomy SME participated in the JAD sessions, taxonomy requirements were adequately addressed in design decisions. The JAD sessions also enabled dialogue between IT, the taxonomy SME, and business users in cases where taxonomy implementation choices had performance tradeoffs.

For example, the business wanted product-accessory relationships to be managed through taxonomy, so that the product model tags on an accessory would dictate the product pages on which that accessory would display. However, it turned out to be too much effort to tag accessories with models, given how accessories operate with multiple products. For example, one new product might be compatible with 17 existing accessories; instead of linking the 17 accessories in the content item you are creating for the new product, you would have to open the 17 other accessory content items and edit each one by adding a new product model tag. It would also be difficult to see what accessories were referencing a particular product from that product content item. The business agreed that in this case, it would be better to employ direct content item-to-content item relationships rather than leverage taxonomy.

In other cases, the business need was deemed to outweigh the performance difficulties, and IT was challenged to find a better way to manage the taxonomy integration. Some cases ended in a compromise between the two. In all situations, however, JAD enabled the team to quickly address issues because all the right people were in the room. JAD sped up development and ensured that all stakeholders understood the rationale behind key decisions.

Taxonomy Integration Choices at Motorola

To illustrate how we pulled everything together at Motorola, we will highlight the specific taxonomy-driven functionalities we implemented during the JAD sessions.

Dynamic Navigation

At the outset, we wanted to make the majority of the site navigation taxonomy-driven. However, marketing stakeholders wanted to retain the flexibility to define the navigation as needed. The solution we came to was a compromise between dynamism and flexibility: we focused on navigational paths that were more straightforward and stable, such as product lines. The taxonomy-controlled product model tags on product content items allowed us to offer dynamic product line navigation on the business-to-business site. For example, the Astria CP Series product is tagged with a model name of "CP-1200," which has a parent product line of "Astria CP." This tag makes the product get delivered dynamically to the "Astria CP" product line landing page, shown in Figure 3.
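In pseudo-form, the rule is: read the product's model tag, walk up to its parent product line, and deliver the item to that line's landing page. The sketch below assumes a simple parent-lookup table; "XT-100" and the data shapes are invented for illustration.

```python
# Illustrative model-to-product-line lookup; in practice this comes
# from the taxonomy's parent relationships.
PARENT_PRODUCT_LINE = {"CP-1200": "Astria CP", "XT-100": "XT Series"}

def landing_pages(products: list[dict]) -> dict[str, list[str]]:
    """Group tagged products onto their product line landing pages."""
    pages: dict[str, list[str]] = {}
    for product in products:
        line = PARENT_PRODUCT_LINE.get(product.get("model", ""))
        if line:  # untagged products simply do not appear
            pages.setdefault(line, []).append(product["name"])
    return pages

print(landing_pages([{"name": "Astria CP Series", "model": "CP-1200"}]))
# {'Astria CP': ['Astria CP Series']}
```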


Figure 3: Dynamic product line navigation: Astria example

Figure 4: Dynamic product filters: mobile phones example

Figure 5: Creating a product-document relationship: dynamic display on the product detail template

Content Relationships & Dynamic Display

Taxonomy directs multiple content relationships and display rules, such as document to product. Marketing and support documents are tagged with the appropriate models to drive display on the "Resources" tab.



Future plans include dynamic relationships to cross-sell products by industry, display FAQs and other troubleshooting, and more.

Controlling Terminology

Without taxonomy control, comparing product features would be difficult. The system would have no way of knowing that “Memory” and “Storage” were really the same concept and should be listed side-by-side. By making features taxonomy-driven, we could achieve better consistency of feature names and values and we could offer a user-friendly side-by-side comparison. Content creators choose applicable features from a controlled list and are prompted to either choose values or enter a text string, depending on the feature type (custom data element).

The feature content type structure feeds the various presentation elements. A feature is flagged as "comparable" in the custom taxonomy content type.

Content Filtering/Refinement

Content filtering is a huge part of the Motorola website, and it can be implemented using regular taxonomy values or the specially constructed product features content type. In some cases, the presence of a taxonomy value makes the filter appear: if no products are tagged, the filter does not show. For example, bar code scanners can be filtered by feature (fueled by the features taxonomy) or by technology (a regular taxonomy branch).
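The presence-driven behavior can be sketched as follows, assuming each product carries a simple dictionary of branch-to-value tags; the data shapes are illustrative.

```python
products = [
    {"name": "Scanner A", "tags": {"technology": "Laser"}},
    {"name": "Scanner B", "tags": {"technology": "Imager"}},
    {"name": "Scanner C", "tags": {}},  # untagged: contributes no options
]

def filter_options(items: list[dict], branch: str) -> set[str]:
    """Collect the tagged values that should appear as filter options."""
    return {p["tags"][branch] for p in items if branch in p["tags"]}

options = filter_options(products, "technology")
if options:                 # no tagged values means the filter is hidden
    print(sorted(options))  # ['Imager', 'Laser']
```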

Figure 6: Consumer site product feature comparison

Figure 7: Consumer site product feature comparison


The feature content type structure feeds the various presentation elements. In the features content type, a flag identifies that feature as “filterable”.

Finally, users can refine search results using taxonomy values tagged to content.

Localization

All taxonomy values are available in each of the languages in which Motorola publishes content. The taxonomy and feature content types were custom-built to support localized versions of terms: there is a translations "tab" for each taxonomy item where you can enter a language variation and identify the locale (a combination of a country and language; FR-fr for France-French, for example). This is an important capability given that some taxonomy values are directly visible on the website.
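Conceptually, each taxonomy item carries a locale-keyed set of variants, as in this sketch; the term, translations, and helper function are invented for illustration.

```python
# Locale codes combine country and language, as in the FR-fr example above.
TERM_TRANSLATIONS = {
    "Two-way Radios": {
        "FR-fr": "Radios bidirectionnelles",  # illustrative translation
        "DE-de": "Funkgeräte",
    },
}

def localized(term: str, locale: str) -> str:
    """Fall back to the source term when no translation exists."""
    return TERM_TRANSLATIONS.get(term, {}).get(locale, term)

print(localized("Two-way Radios", "FR-fr"))  # Radios bidirectionnelles
print(localized("Two-way Radios", "IT-it"))  # Two-way Radios (fallback)
```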

Figure 8: Faceted filtering options on the business-to-business document library search are powered by taxonomy values

Figure 9: Translations tab in the Taxonomy Content Item


Step 9: Select Appropriate Tagging Mechanisms to Make Taxonomy Easy to Use

In Step 7, we briefly described various taxonomy tagging interfaces. How users will apply taxonomy tags is a key criterion when deciding whether to use native CMS taxonomy functions, customize, or purchase a third party tool. Make this decision carefully. Taxonomy can only deliver results if the people who are responsible for maintaining it and contributing to it can do so easily. You need a simple and intuitive tagging mechanism, or your tag quality will suffer, and poor tag quality degrades your customer-facing functionality. Designing the tagging process and choosing a system with a usable interface for tagging content, and thereby gaining all the functionality highlighted in Step 8, is an important part of the development process.

CMS tagging interfaces are notoriously challenging. They range from simple drop-down lists to complicated table transfer-style options with in-line hierarchy (see Figure 10).

Tagging interfaces may also provide cascading lists to simulate hierarchy (see Figure 11), but while a cascading list is perhaps better than a table transfer, it is definitely not user-friendly or quick.

Figure 10: Table transfer-style taxonomy value selector

Figure 11: Cascading list example: geographical region hierarchy.


Anything that makes tagging painful or remotely complicated is a severe handicap in achieving your goal of well-structured and tagged content. At Motorola, ease of tagging was considered critical, so an intuitive tagging interface was a must-have. Ideally, the tagging interface would display a tree view for taxonomy-controlled elements, restricted to only the appropriate branch for that particular field. Unfortunately, Motorola's CMS did not provide this option; a tree view was available, but it showed the entire taxonomy by default. We decided to customize it slightly so the tree automatically opened to the correct branch of the taxonomy to reduce errors (see Figure 12).

If your taxonomy is more than just a few hundred values, users also need to be able to search the taxonomy when tagging. You cannot expect business users to memorize a 4,000-term taxonomy; they need to be able to pop in a keyword and search the taxonomy as well as its associated synonyms. The custom interface we built has its own problems: it is slow to load, and its original format did not enable users to select multiple taxonomy values at a time, so they had to re-open the widget for every value and suffer the slow load. In subsequent builds, we were able to include a multi-select ability.

It is also important to show terms in context in the search interface, especially if you have polyhierarchy (the same term existing in multiple places within a taxonomy). For example, at Motorola the term “MESH” refers to a type of technology, a solution, and a product line, so it appears multiple times in the taxonomy. However, when tagging, you need to be sure you are choosing the right value. We customized the search interface to ensure that the parent taxonomy folder would show alongside terms so that taggers could identify the correct version of the value.
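A tagging-side search of this kind boils down to matching on names and synonyms and returning each hit with its parent for context, as in this sketch; the term records are illustrative.

```python
TERMS = [
    {"name": "MESH", "parent": "Technologies", "synonyms": ["mesh networking"]},
    {"name": "MESH", "parent": "Product Lines", "synonyms": []},
    {"name": "Mobile Phones", "parent": "Products", "synonyms": ["cell phones"]},
]

def search_taxonomy(query: str) -> list[str]:
    """Match names and synonyms; show parents so duplicate terms are unambiguous."""
    q = query.lower()
    hits = [t for t in TERMS
            if q in t["name"].lower()
            or any(q in s.lower() for s in t["synonyms"])]
    return [f'{t["parent"]} > {t["name"]}' for t in hits]

print(search_taxonomy("mesh"))
# ['Technologies > MESH', 'Product Lines > MESH']
```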

Step 10: (Re)educate Users

This best practice started with education, and it ends with education. Hopefully, we have communicated how critical education is to integrating taxonomy with a CMS. Taxonomy is an abstract concept for many, and unless they are closely involved in the development process, they are likely to resist the change. Tagging is work, and no stakeholders want to take on more work unless they are convinced that it will bring them some benefit, so you must educate business users about how taxonomy enables the specific functions they want for their site.

Content creators and content entry personnel need to understand how tagging content items will affect front-end display. Be as specific as possible: tagging with taxonomy value “X” makes the content item show up here in the site template, or not tagging with taxonomy value “Y” means your product will not show up when the user uses filter “Y”. For example, we explained to content editors at Motorola that if they did not tag the bar code scanners with a value from the “technology” taxonomy, their product would disappear when a user filtered the product list.

Training is the most efficient way to get users to participate willingly in tagging and to care about tag quality.

Figure 12: Custom tagging interface that shows the taxonomy hierarchy in a tree view.


Tagging and taxonomy training at Motorola is a dual-pronged approach, given the outsourcing of content entry. Motorola educates business users and content creators on how important it is to provide accurate taxonomy values to the content entry team and to the taxonomist who maintains the taxonomy. When users understand the ultimate impact on front-end capabilities, they are more willing to participate in taxonomy governance processes. Motorola trains content entry personnel on the practical aspects of tagging: understanding the tagging interface, the structure of taxonomy within the system, and the taxonomy fields that are critical for front-end display.

Summary and Conclusion

Integrating a taxonomy with a CMS is not always easy. Unless you have discovered a rare breed of CMS that already has robust built-in taxonomy functionality, getting a CMS to apply and manage taxonomy to its fullest will take some time, effort and resources. However, we hope this case study shows the wide range of benefits you can reap from investing in taxonomy. The effort you spend creating and implementing a taxonomy is never lost. Taxonomy is a foundation for increasing levels of content management functionality and maturity. You can add capabilities to your site in layers over time once you have the base organizing principles in place. Today it might be content filtering, tomorrow content syndication, and who knows what future advanced capabilities might leverage taxonomy?

One thing is clear from our experience at Motorola: the best thing you can do to help smooth the path of taxonomy integration is to continuously educate stakeholders. Encourage a dialogue between IT and taxonomy SMEs to ensure that they can understand each other in functional and technical discussions and eventually come to compromises that suit business and performance needs. Give content creators and editors a chance to understand the taxonomy structure and why tagging is so important. They will be your best allies in the future maintenance of the system you have worked so hard to build.

References

1. Boiko, Bob. Content Management Bible, 2nd Edition. Indianapolis, IN: Wiley Publishing, Inc., 2005.
2. Rockley, Ann. Managing Enterprise Content. Indianapolis, IN: New Riders Press, 2002.



For the last decade, KELLEY MYERS ([email protected]) has been finding creative solutions to acquiring free content. Kelley began her career as a writer and associate publisher for a small magazine in Northern Colorado. In 1999, she switched from print media to online content development when she launched an online journal dedicated to military and defense issues in modeling and simulation in Washington, D.C. Her experiences in Washington, D.C. helped her shape a thrifty approach to acquiring content for business purposes, often with no budget and limited licensing options. Today Kelley manages content development and online publishing for the Microsoft Advertising website. Kelley lives in Bellevue, WA with her two children, her husband, and an aging overweight Weimaraner named Zoe.

Using Social Content and Media to Revitalize Product Documentation

Revolutionizing Product Content

11 Kelley Myers


Abstract


The Microsoft Work Essentials website was conceived to go beyond the usual notions of product documentation, providing Microsoft Office users with custom information on how to use the Office Suite in their specific job contexts. The aim of the site was to provide occupationally-focused information on how our products could address specific pain points in industries including finance, real estate, healthcare, and law. However, internal authors lacked the credibility to convince users that we understood and could solve problems in their professions. To meet our mandate we had to completely revolutionize our notions of what kinds of information we produced and how we produced them. Our team created a "hybrid" acquisitions strategy that relied heavily on external authors who had recognized occupational expertise to complement the content created internally. Additionally, we expanded our content library to include various content types (articles, videos, multimedia demos, templates) that offered our customers multiple ways to consume the information that we provided.


Background


In the last ten years, online information consumers have gone from simple web page readers to sophisticated multichannel participants. Today online consumers:

• Visit websites to 'do something'. Time is short, and solving a problem requires a quick answer. People want navigation that helps them quickly define their problem and get to the solution. They want small targeted items of information that simply and directly present the problem and solution.
• Expect advanced media but tolerate low production values. Beginning with email and continuing to websites like Facebook and YouTube, high-volume low-quality sound, images, and video have become the norm. This has allowed the same people who read information online to also provide information online.
• Choose to participate. Users rate, review, rant, and rave about products online. They keep lists of cool sites and products and share them with friends. They use social networking sites to post links to good products, and to drive traffic away from poor products.
• Can be experts in their own right. Users have a passion for products that can be harvested. Longtime users of a product have a perspective and expertise that other users respect and can learn from.

These considerations led us to reconsider what content we create and how we create it. In the past, your team may have simply converted and posted approved print materials, but today that will not suffice. Customers will not wade through long works to find answers, learn about new features, or compare versions of products. They expect product information to come in the same new forms as the other web information they consume. Rather than an exhaustive 500-word article on a product, users expect an FAQ, a set of audio segments, or even a product highlights podcast series. You can still use a 500-word article as a reference, or for the diminishing portion of your customers who prefer the long form, but the front line of your audience seeks immediate content.

Online consumers are accustomed to participa-ting in the creation of the content that they consume. They respect content created by themselves and their peers, and expect it to appear next to the content created by the site sponsor. For example, the product highlights podcast that you present can be recorded by a customer for customers. Because expectations of quality are low, production costs can also be low. Masses of text can be replaced by simple videos, online product simulations, user- generated comments, tips, links, and reviews.

There is tremendous opportunity for content publishing teams to completely transcend old notions of technical information. Content publishers can harvest information from the very people that the information serves, and deliver that information via a large (and in-creasing) set of formats which not only commu-nicate more effectively than long-form articles, but also spur user interest, engagement, and commitment to our products. Building out a hybrid approach to content acquisition requires good project management skills, previous experience with content acquisition, and a “nose for news”. Securing content from unpaid sources also requires creative problem solving skills, the ability to negotiate, and perseverance.


The Skills Needed to Perform This Best Practice

Our team found the following roles vital:

• Content Project Manager: Drives each project by resourcing new content, negotiating contracts or agreements with authors, and working with corporate legal advisors. A content project manager also collaborates on ways to find new audiences, brainstorms ideas for useful content types, and acts as the liaison among writers, editors, and multimedia staff. The content project manager stays abreast of current industry standards and techniques to ensure the creation of effective content that achieves the organization's goals, manages content requests, partners with stakeholders to understand an initiative's goals, and creates segmentations of online customers, relying on experience and judgment to plan and accomplish goals.
• Project Editor: Facilitates the technical and content objectives for assigned programs. The project editor is responsible for scoping projects, communicating with idea originators and the development team, and managing projects from inception through completion, relying on experience and judgment to plan and accomplish goals. The project editor trains new publishers and maintains the knowledge base for global publishing processes.
• Writer: Has subject matter expertise and understands the web. A writer creates articles and contributes to webcasts, researching and writing online content for the website. Writers participate in social media activities that may include any or all of the following: answering or posting to Twitter, updating Facebook product pages, writing blog posts, participating in discussion forums, reviewing RSS feeds from industry websites, and creating scripts for podcast or video segments.
• Editor: Handles developmental, line, and copy editing. Additionally, the editor interfaces with web production to post content online; writes, edits, proofreads, and copy-edits a variety of documents; and coordinates with writers, producers, and other contributors to the website to ensure consistency in style, tone, and quality of the organization's site. The editor plans and prepares stories for online dissemination, drawing on knowledge of the topic and the editorial position of the publication, supplemented by additional study and research.
• Multimedia Developer: Develops multimedia content, taking direction from the global site manager, and working collaboratively with writers and other content developers. A multimedia developer has in-depth knowledge of standard design applications (including Adobe Photoshop/ImageReady, Illustrator, and Macromedia Flash), a working knowledge of web authoring tools, and proficiency in HTML, CSS, and XML.

Step 1: Envision the New Content Strategy

Before we embarked on our new approach, we evaluated our current website. We began by reviewing our web trends, deciding which articles or templates received high traffic and high user satisfaction. Even if your system doesn't allow for sophisticated methods of gathering user feedback, reviewing simple traffic statistics can reveal enough information about the habits and practices of your audience to help you decide which items of information are working best. In our case, each month our site manager compiled a content report for the content acquisitions team, executive management, writers, editors, product managers, and marketing communications group. Each item of content received a rating based on user satisfaction and customer comments. By reviewing this information, we were able to determine measures for success.


For example, if an article received a high number of page views but low customer ratings (shown through a customer rating section on each page), we assumed that the content offered a compelling headline or topic but was missing the mark on alleviating a customer pain point. We would pull down and re-work the content until our revisions increased customer satisfaction. Conversely, if an article received few page views but had high customer satisfaction ratings, we concluded that the content was relevant, but the article's placement was off. In these instances, we would simply program that content to appear higher in occupational areas of the site, or run it as a top story on our home page.
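The triage rule we applied can be summarized in a few lines; in the sketch below, the thresholds and the five-point rating scale are assumptions for illustration, not our actual cut-offs.

```python
def triage(page_views: int, rating: float) -> str:
    """Classify a content item by traffic and satisfaction."""
    high_traffic = page_views > 10_000   # assumed monthly baseline
    satisfied = rating >= 4.0            # assumed 5-point rating scale
    if high_traffic and not satisfied:
        return "rework content"          # compelling topic, misses pain point
    if not high_traffic and satisfied:
        return "improve placement"       # relevant content, poor visibility
    if high_traffic and satisfied:
        return "leave as-is"
    return "review or retire"

print(triage(25_000, 2.8))  # rework content
```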

We used existing feedback channels to draw significant conclusions about our content. For example, we watched spikes in traffic near product releases, and learned that the traffic plateaued more quickly than we would have liked. However, statistics like the number of unique users or page views were not sufficient metrics because they measure volume, not user engagement. To solve this problem, we contracted with a company to source and maintain a user panel. Because many of our product lines reach across different consumer segments (small business, enterprise, IT pros, etc.), we created a user panel that reflected that segmentation. We also created a method that allowed us to quickly survey any segment of the user panel for more specific questions.

Based on feedback from our user panel we segmented our product offering differently. We found that we had been too specific in our segmentation, which made it difficult for our customers to find products specifically for them, so we broadened our product buckets. We also worked on developing more interesting content that addressed how our products were being or could be used. Finally, we made sweeping changes to our content offering to align with the goals of our business. We vetted all these changes with our user panel. Table 1 summarizes the steps we took to gather feedback and their relative merits:

Table 1: Ways We Measured Feedback Online

Feedback mechanism: Contracted with an outside vendor to manage a 20,000+ person user panel that spanned our occupational practices.
Pros: The ability to quickly sample the user panel's opinions on a variety of topics and on the user experience, using a large enough audience for quantitative analysis. This allowed us to assess both the content and the usability of the website.
Cons: There was a limit to the number of surveys in a given time period, and it was a costly venture.

Feedback mechanism: Launched site-wide surveys to measure user intent: what were users trying to do with the website and whether they were successful.
Pros: Specific surveys captured the intent of the audience in the moment (task completion).
Cons: Costly technical implementation, high drop-out rates.

Feedback mechanism: Collected web metrics regularly, including content ratings, customer feedback (verbatim), and the most accessed articles, entry points, and exit points.
Pros: An in-house group could measure traffic, page views, etc. and complete the analysis.
Cons: There was a debate over the effectiveness of the data gathered.


We used the online feedback to determine how to expand our existing content types (articles, tips & tricks, and templates, for example). Our additions included print-ready tips, video demos, and user-generated content. For example, many of the products we sold were used by non-technical consumers who preferred to print out directions. Knowing this, we could provide print-ready tips that would fit onto one page, which a consumer could post in his office or workspace. We learned from our user panel that a small subset of our audience preferred sophisticated templates. As a result, we expanded our template offerings to include robust website templates that were pre-populated with information, formulas, and business intelligence.

Starting with our own website and moving on to other websites that were popular with our customers, we scanned forums, blogs, and discussion groups for mentions of our products. We looked at data about our types of consumers and developed customer personas. The personas helped us shape our notions of who our customers were, and to find authors who were able to write to the needs of those users. We also participated in online conversations, answering any blog comments we found, and responding directly to customers with feedback. Over time, we developed 1:1 relationships with several people we called “key influencers”, and later approached them to create new content.

Each time we created a channel for feedback, we strengthened the relationship between our company and our customer. Telephone surveys, managed user panels, online content ratings, and other customer feedback mechanisms were powerful tools for developing strong user relationships and finding expert users who we could call on to generate content for our site.

Step 2: Get Buy-In from Your Organization

In our new hybrid content acquisition strategy, we wanted to split content creation between our staff and our customers. We needed to get organizational agreement that this strategy was a viable option for meeting our corporate goals in a fiscally responsible manner. The idea of incorporating content from outside writers and editors was already established in other teams within Microsoft; it was common for teams across the company to employ contractors and vendors in various roles. We presented our approach to creating content with customers as a logical next step.

We would measure our success by determining the cost savings created by the hybrid content acquisition strategy. First, we assessed where we were spending the most money and set a target for savings. As shown in Table 2, we also wanted to increase user satisfaction and drive more traffic to our pages.

Table 2: The key performance indicators we used to measure success

Goal                              Possible Key Performance Indicators (KPIs)          How to measure?
Drive down operational costs.     Drop in overall call volume to customer service.    Determine baseline and measure.
Increase customer satisfaction.   Increased customer satisfaction.                    Online content ratings.
Increase website traffic.         Site metrics that indicate usage.                   Monitor unique users (UUs) or page views (PVs).


We set up a series of meetings with our leader-ship team to gather their feedback and support before we moved forward. At the meetings, we presented our findings and recommendations, and our proposed solution. We showed how other teams at Microsoft were using similar techniques to create robust content stores through a hybrid approach to acquisition and authoring. After each meeting we reviewed the feedback and comments, incorporating some while defending our approach. It was critical that we proposed our solution during the budget planning cycle for Microsoft; otherwise, we would have had to wait to implement our strategy, or to haggle for funds. We were trans-parent throughout the process, showing both our approach and the impact we could have on the Key Performance Indicators identified in Table 2.

We were also prepared to hear “No”, and to present mitigating approaches. The need to anticipate management concerns forced us to have a functional and well-thought-out plan. Because of this, our team was able to garner executive support, allowing us to ramp-up quickly, and sign up for specific results. We reported back to our executive management team through monthly content reports, quarter-ly business reviews, and regular team meetings.

Step 3: Identify and Integrate Outside Experts

Almost every product or service available today gets at least some attention on the Internet. Microsoft products get a lot. News groups, forums, review sites, and ratings abound across every aspect of our consumer lifecycle.

We began identifying outside experts by using this information to learn how our audience thinks about our products. We read reviews on product websites. We read the good and savored it, but we also paid attention to the average, fair, or poor reviews. We considered what was being said, and more importantly, what wasn't being said. We tried to get inside the minds of our customers and see where we could provide value in product information and user assistance content.

We devised the framework in Table 3 to guide us on the best use for each kind of author.


Table 3: Cost vs. customer trust relationship for different author types

User-generated Content
  Attribution on site: Provided bylines and/or attribution; highly specific topics.
  Process: Created by audience. No formal edit pass, but must be submitted for approval.
  Cost: Low to no cost.
  Trust/Satisfaction: Moderate trust by audience.

Licensed Content
  Attribution on site: Provided bylines/attribution; limited-use derivative content. Licensed content involved contractual obligations and was usually non-technical and highly task-oriented, requiring limited creativity in the subject matter.
  Process: Not edited by Microsoft staff (in order to retain the "original" voice of the author).
  Cost: Low to no cost.
  Trust/Satisfaction: Moderate to high trust by audience.

Internal Authors
  Attribution on site: No bylines or attribution; usually technical content. Internal authors use a consistent Microsoft tone/voice.
  Process: Edited by Microsoft staff.
  Cost: Low cost.
  Trust/Satisfaction: Rated high trust by audience if technical or help content; rated low trust if conceptual or task-related.

External Authors
  Attribution on site: No bylines or attribution (negotiable); may be technical or conceptual content. Microsoft retains all rights to this content.
  Process: May be edited by Microsoft staff.
  Cost: Moderate to high cost.
  Trust/Satisfaction: Moderate to high trust by audience.


As Table 3 shows, user-generated and licensed content provide the best value. To get licensed content, we contacted trade associations for the occupations we served. For example, we contacted the International Association of Administrative Professionals to help with content for the Administrative Assistants occupation section of the website. Many of the organizations we contacted publish content specific to their industry that we could syndicate on our website.

To find user-generated content, we searched for authors passionate about specific areas: productivity, financial planning or marketing. Many of these passionate people are 'vocal' online in various forms of social media, looking to further their personal brand. In searching online we read blogs, forum posts, and Twitter, Facebook, or LinkedIn profile pages. These sources abound with information. These passionate individuals are keen to enhance their online presence and have often already produced suitable content or ideas for content.

Even though licensed and user-generated content were the best value, we still used highly talented paid resources to ensure that our content was deemed professional and expertly written. To find our professionals, we used technical literary agencies that specialize in finding authors for technical projects. Most agencies are open to creative writing projects that include new authors. These agencies also helped us negotiate rates for writers, provided additional copy editing resources, and even sourced new authors for our project. Regional representatives offset the cost and opened up a pool of local talent for us to hire.

Step 4: Restructure Your Team to Include Outsiders

Our team was initially set up to author, edit, and publish with only internal resources. As our team moved toward a hybrid model, we had to dramatically restructure. Rather than being the sole experts in product knowledge, we became information gatherers. Instead of always writing, we were often reviewing. Instead of brainstorming about what our customers might or might not want to read, we facilitated the organization and distribution of what came into our new content pipeline.

The shift to outside authors was not without stress. Many of our writers and editors did not like the shift. They were concerned about having external parties write content, fearing that it would be inaccurate or low quality. In our corporate culture we had developed writers who were subject matter experts through research, product knowledge, and writing assignments. But we found out that some users had more passion for the product than even our best writers. Our customers use our products every day, and their passion is evident in their commitment to contributing content to multiple online channels; they dedicate blog posts and forum discussions to the nuances of our products.

Some of our team simply did not want to give up ownership of "their audience", or were unable to shift their work habits. We reviewed the team to see who was excited about the prospect of tighter customer interaction and selected those people to head up the initiative. We used this enthusiasm in peer-to-peer relationships, and shared customer information on work habits, product usage, and satisfaction with our content project managers. The project managers took this knowledge and developed customer scenarios for the web that spoke directly to known customer pain points.

Our editors, in particular, found the new approach challenging. Rather than having a traditional editing pipeline, as used in other Microsoft content publishing teams, we trimmed down the processes. Over time, each editor found a new freedom in our hybrid approach. Their relationships with the writers (both internal and external) motivated them to think about the online space, and enabled them to suggest content, videos, or blog posts. The new world of work offered choices.


There are some instances where it is prudent to use internal authors, but for the majority of our content, the hybrid approach proved to be of higher quality overall. For example, we agreed that highly technical content (procedure-based information that our audience required to actually use the products) would be best maintained by our internal team of writers. We didn't have internal expertise on sector-specific information (such as running a financial planning business), however. For that expertise we sought out authors who were recognized as experts in their field. We approached them about writing an article, and then negotiated costs, licensing opportunities, and even attribution via the website.

Providing attribution turned out to be an important and challenging negotiation point. We published our content through an internal content management system that was used by multiple work groups across Microsoft. Most of those work groups were not providing attri-bution to writers, especially to internal writers. Before we could even use attribution on our article pages, we had to build a template that would allow for it within our page designs. We kept it simple: author byline at the top, small blurb and photo at the bottom, with external links to a writer’s personal website. Within another six months we began creating author attribution pages that listed out all of the assets that an individual contributed. This helped with search on the website, enticed authors to create more content (we had a business rule that you needed to provide at least three articles or templates to qualify for an attribution page), and differentiated our content from other content within the Microsoft network of sites.

We found that externally produced content was often much less expensive than what we produced inside. For many authors, the free publicity was worth more than financial compensation. While we did pay for some content, we licensed much of it at little or no cost. We also learned that by providing attribution and inviting submissions from outside authors, we increased trust in our online content. Customers were more likely to believe an industry expert who discussed specific pain points and how to solve them with our products. When we wrote that type of content, it came off as marketing fluff, and while it didn't decrease trust, it sounded hollow to our customers.

Emotional issues aside, there were many logistical issues we had to confront. We structured each team to allow for quick publishing cycles. As I mentioned earlier, we were dependent upon other publishing platforms and needed to fit into their publishing rhythms. We also wanted to divide our large team into smaller functional teams that allowed us to keep editorial processes efficient and feed into the same design and art production pipeline. In order to maintain occupational expertise, each content project manager focused on specific areas. For example, one project manager may have been responsible for finance, business operations, and healthcare. This allowed that project manager to go deeper into those topic areas and establish relationships with industry experts, community influencers, and occupational associations.

Each content project manager worked with a virtual team of internal editors and technical writers. We also had graphics production and design teams to help with art production requests. We augmented these resources with contractors in the following roles: copy editors, graphic artists, production specialists, user researchers, and script writers. This did not include specific authors on contract for specialized content.

Our typical approach began with examining our planning documentation: occupational plans and personas developed to help the content project managers understand the core audience. We then pulled together a content plan that addressed a particular pain point. For example, small business financial planners who are starting a new business need to determine the legal structure they're going to use (limited liability company, corporation, etc.). The content project manager might begin by talking to external writers about this topic, and gathering ideas for content assets that address this need. After identifying the group of content assets (articles, templates, videos, etc.), the content project manager begins assigning assets to internal technical writers, external authors, or both.



Once the content was ready, the content project manager met with the project editor. The project editor was responsible for managing the content through production and publishing. We used the review cycles to keep the original intent of the content assets on track; the project editor would also request art, review scripts for demos, and work with individual authors to hone the content. After all of these steps, the project editor would hand off the content assets to production, and the material would be put in the queue for publishing.

Additional team tasks included reviewing content feedback and incorporating changes into the original assets, attending industry events to gain customer perspective and insight, and continuing to acquire content.

Step 5: Establish an Ongoing Workflow

Despite the number of content publishing teams at Microsoft, there is a lot of autonomy in establishing workflows for particular groups. While the content publishing discipline shares best practices, templates, and training with the entire employee base, when we launched Work Essentials the idea of publishing content acquired outside of the company was still new.

Since then, several content publishing teams and marketing teams have included content acquisition as a strategic element of providing content for a wide array of audiences.

For ongoing operations, we kept our team divided along occupations (finance, healthcare, legal, admin, to name a few), and kept the editorial processes simple. Each content project manager worked with a dedicated project editor, who in turn pushed content through the editorial pipeline to publishing. Our team flexed to accommodate the number of occupations we supported online. At full capacity, we had five content project managers (acquisitions editors), three project editors, two technical writers, and several contractors providing editorial support, art production, and publishing support.

We published monthly. Part of the production process included time for the content project managers to hand off content to the project editors as part of the regular rhythm. In addition to publishing new content, we also reviewed metrics on particular sets of content. Since we were part of a larger network of Microsoft websites, we received thousands of page views. We also tracked each occupation individually (through scorecards), and provided results in a monthly content report. The report went out to all members of the content publishing team, marketing team, product planning team, interested partners, and our executive leadership team. It included metrics on page views, unique users, and click-through rates, along with occasional customer verbatims captured in feedback. We used this information, along with individual research, to figure out what to do next.
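As an illustration only (the chapter does not describe the team's actual reporting tools), here is a minimal sketch in Python of the per-occupation aggregation that such a scorecard report implies; the record fields (occupation, page_views, unique_users, clicks) are hypothetical:

    # Minimal sketch of a monthly per-occupation scorecard.
    # The input schema is hypothetical, not the team's actual data model.
    from collections import defaultdict

    def monthly_scorecard(page_stats):
        """Aggregate page-level stats into per-occupation totals and CTR.

        page_stats: iterable of dicts such as
            {"occupation": "finance", "page_views": 1200,
             "unique_users": 800, "clicks": 96}
        """
        totals = defaultdict(lambda: {"page_views": 0, "unique_users": 0, "clicks": 0})
        for row in page_stats:
            bucket = totals[row["occupation"]]
            for key in ("page_views", "unique_users", "clicks"):
                bucket[key] += row[key]

        report = {}
        for occupation, t in totals.items():
            views = t["page_views"]
            report[occupation] = {
                "page_views": views,
                "unique_users": t["unique_users"],
                # Click-through rate; definitions vary, this one is clicks per view.
                "click_through_rate": (t["clicks"] / views) if views else 0.0,
            }
        return report

A report like this, broken out by occupation, is what lets a team compare content sets month over month and decide where to act next.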

In some instances, there was nothing more to do; the content, individual assets, and templates worked well together, and the traffic on the pages was sufficient. In other cases, if the product changed or if one of the templates didn't work, we would include a fix in our next publishing release. In a typical month, we published around ten new practices, each containing roughly five new pages (articles) and templates (downloads). We were also able to see the impact of new content types—especially those that were not traditionally used in consumer websites for Microsoft.

One of the newer content formats we experimented with was video. During the early iterations of our websites, we included short video demos—less than five minutes of product information, delivered in an overview format. These were generally popular with our audience—they received high traffic—but we felt that we could improve upon the process if we included an interactive element. Other groups at Microsoft have taken the video development concept further, allowing user-generated videos to be posted as part of a product's online content library. Some marketing groups have even used videos as a way to drum up excitement about a product—sponsoring contests that request videos from users.

Summary and Conclusion

By all indications, the strategy we evolved to acquire content and put the voice of our audience within the pages of Microsoft websites worked well. We had an impact on many consumers who used our products daily. One of my favorite stories came from a colleague: he met regularly with his contributors, who were part of an association of administrative assistants. When we published a new webcast on how to better use some of our productivity software, several of his contributors would bring lunch and reserve a large conference room to listen to the webcasts together. They would then discuss and put into practice what they heard that day.

Another colleague used a nugget of customer feedback to change a previously published template. During one review, he came across feedback from a doctor in the Midwest requesting changes to a template that helped a medical assistant collect pertinent medical information for a patient. My colleague contacted the doctor and worked with her to create a new form, which was then published on our website.

We enjoyed having a direct impact on how our customers were using our products. But we were spread too thin. We had to evaluate where we could have the most impact, and cull our occupational focus to the areas that saw the most traffic. We reviewed how we worked internally, and moved from traditional project management practices to Agile project management. We developed new processes to support a more agile approach to gathering, reviewing, and publishing content, and we prepared for the next version of productivity software. Ultimately, we stayed within budget and met the objectives that we set out to achieve.

We learned that we're not always the experts. In extending trust beyond our corporate boundaries and practices, we learned much more than we could quantify with revenue earnings and products sold. We orchestrated the opportunity for direct collaboration with our customers, and in doing so, allowed for a unique model of acquiring and publishing content at Microsoft. Several teams within the company continue to define and re-define how we are interacting with our customers online, in our managed websites, and on third party websites like Facebook and Twitter. We are continuing the dialogue of consumers and marketers, audiences and authors, learning more about being advocates for our customers within our product groups.


GRAHAM OAKES ([email protected]) helps people untangle complex technology, relationships, processes and governance. As an independent consultant, he helps organizations such as Sony Computer Entertainment, The Open University, the Council of Europe and the Port of Dover to define strategy, initiate projects and hence run those projects effectively. Prior to going independent, Graham held positions including Director of Techno-logy at Sapient Limited, and Head of Project Management for Psygnosis Ltd (a subsidiary of Sony). His book ‘Project Reviews, Assurance and Governance’ was published in October, 2008.

How to Enable Good Decision Making in Complex Organizations

Creating Clear Governance

Graham Oakes


Abstract


Organizations fail to manage information well for a variety of reasons – lack of skills, insufficient resources, unclear objectives, inconsistent processes, etc. However, many of these reasons stem from one root cause: different groups within the organization make conflicting decisions about what to prioritize, which standards to apply, which processes and tools to use, and so on. These conflicting decisions in turn divert our resources from the most important issues, increase costs (e.g., by creating duplication and rework), slow down execution, and reduce the quality of the final outcomes. The politicking that surrounds such decision making can also add significant costs (financial and human) and delays to the decision making process itself. Resolving these conflicts is the realm of governance.

Governance is “the process whereby societies or organizations make important decisions, determine whom they involve and how they render account” (1). Good governance allows organizations to make effective decisions, to make them in an efficient way, and to monitor and refine the outcomes of these decisions so as to improve overall organizational performance. By separating the decision making process and associated roles and responsibilities from the decision itself, and by agreeing to this process up front, clear governance allows people to focus their energy on understanding the issues and identifying good solutions. Without this focus, people often spend a lot of time deciding who needs to be consulted and defining a decision making process, detracting attention from the decision itself. At worst, decision making then degenerates into politicking and indecision. This “best practice” looks at some of the factors one might consider when establishing appropriate governance structures and processes for information management within an organization.


The Skills Needed to Perform This Best Practice

The nature of an organization is tightly linked to its governance. For example, some organizations favor highly centralized decision making, whereas others devolve most decisions to local bodies. Some organizations are autocratic, whereas others favor a more consensual style. There is no one right way to govern; rather, one should strive to define a governance framework that matches the organization's culture, its competitive environment, and the nature of its products and markets. In order to do this, one will need to possess skills in the following areas:

• Strategic insight. The governance framework should focus attention on the most important decisions. In order to identify which decisions have most impact, one will need to be able to link information management processes, policies and priorities to the organization's overall strategic goals.
• Sensitivity. People care a lot about decision rights. Awareness of the personal and cultural issues that surround decision making rights and processes will be critical to success, as will being able to match the governance framework to norms within the organization. Sensitivity to the political climate and power structures within the organization is also important, as these will determine many aspects of how governance is implemented. For example, this awareness will likely play into choosing the right battles, knowing when to push for a particular solution and when to back off.
• Change management. Establishing governance often involves making changes to other structures and processes. It may even be part of a conscious effort to change elements of the organizational culture. This requires all the skills that go with any substantive change programme: communication, listening, persuasion, enthusiasm, planning, and so on.
• Domain expertise. Many people equate defining governance with setting up information management policies and standards. It can be seen from the above that governance actually needs to address a broader, more strategic, remit if it is to succeed. Nonetheless, many elements of the governance framework may ultimately be implemented through detailed policies and such. Thus, one may also need to bring substantial domain expertise to bear, in order to define these elements effectively.

It can be seen that defining a governance framework is not a junior role. To be successful, you will need substantial experience both with information management and with general and change management.

Step 1: Treat Information as an Asset

A primary objective of governance is to ensure that assets are managed in a way that respects the wishes of their owners and other stakeholders. As such, corporate governance frameworks focus on ensuring that financial and other assets (real estate, intellectual property, etc.) are managed in accordance with the desires of the shareholders. Likewise, effective information governance is an exercise in asset management. One should do all she can to understand the value of this asset to her organization, and to help other people to appreciate its value.

Step 2: Remember That Governance Is Political

Decision rights are about power structures. When we address governance, we set up structures and processes to address concerns such as:

• Who is empowered to make which decisions?
• What process must they use in order to make those decisions? Which factors and criteria should they emphasize?
• Who must be consulted as these decisions are made?


• How will we monitor the effectiveness of the decisions, and hence hold the decision makers to account for the outcomes?

These are all inherently political questions. It is no accident that some of the best examples of clear governance can be drawn from constitutional law: to obtain a clear understanding of separation of powers and decision rights, it is worth studying documents like the US Constitution. To address governance effectively, one will need to think about how power is built and exercised within an organization. For example, some people will have power because they manage key resources (finance, real estate, etc.); some people will have power through rank and position; some people will have power because of expertise. For each decision, one will need to ensure that the appropriate mix of power is brought to bear, both to make the decision and to make it stick.

If these political issues are addressed effectively in governance structures (e.g., by ensuring that the right people are seen to be involved in steering committees and other oversight bodies), then the political issues will have less adverse effect on day-to-day operations. Conversely, organizations where political machinations dominate most actions often suffer from ill-defined or weakly-agreed governance structures.

This “best practice” aims to enable engagement in this political arena in a positive way.

Step 3: Don't Play Political Games

It is important to understand and work with the power structures in the organization, but playing political games (e.g., acting to undermine people or build secret alliances) tends to lead to continued infighting rather than clear governance. The aim in addressing governance should be to create a clean and open framework for decision making, one where issues, agendas, priorities, standards, policies, roles and responsibilities, etc., are clear and understood by all. Game playing along the way can undermine the legitimacy of the resulting framework.

Step 4: Create Clear Separation Between Governance and Management

Although many application vendors would like people to believe that governance is just a fancy word for management, the two concepts are distinct. Governance identifies who is responsible for making which decisions and what process they should use in order to make legitimate decisions (i.e., decisions which will be accepted by the rest of the organization and its stakeholders). Management is about actually making the decisions: monitoring the environment, gathering information, balancing trade-offs, and so on.

When defining a governance framework, one should focus on identifying who is responsible for key decisions and on the overall principles applying to those decisions. If a particular process must be followed (e.g., because legislative requirements or corporate policies dictate it) or has been demonstrated to be especially effective, then that should be defined also. Beyond that, let people get on with making their own decisions. If governance encroaches too heavily on defining the details of decision criteria, outcomes, etc., then it risks running into one or more of three roadblocks:

• Political conflict and resistance. If governance encroaches too heavily on people's operational flexibility and professional judgement, they will begin to resist or undermine it.
• Loss of overview. The more one focuses on the details of individual decisions, the greater the risk of losing sight of the overall priorities and perspectives.
• Lack of information. Specific decisions are often best deferred until detailed information about the decision context is available. One may not have this information when defining the overall governance framework, and so he risks making poor decisions.

Governance sets the overall boundaries and principles within which people operate; if it tries to replace their skills and judgement it will probably fail.

Of course, some decisions will fit clearly within one's own remit – many aspects of information management policies and standards, for example, are within one's own competence and sphere of influence. As one defines the governance framework, he may well begin to think about detailing these items. But even then, it's often easier to separate the creation of these assets from the initial definition of the overall framework.

Step 5: Identify and Engage with Stakeholders at All Levels

Many different types of decisions are relevant to information management: setting overall strategic priorities, defining information types and structures, defining taxonomies and other metadata schema, defining design guidelines or standards, etc. Governance can provide a coherent framework that ties all these decisions together. For example, by identifying the points where a design authority should review key decisions, good governance can ensure that individual teams don't make decisions that are at odds with overarching principles and policies. In order to define such a framework, one will need to identify all the relevant stakeholders, understand their perspectives, and gain their buy-in.

It is worth considering the following dimensions when identifying stakeholders:

• Strategic scope. Which executive-level people are involved with decisions that affect information management? Which middle managers affect or are affected by information management policies, processes and standards? Where does information most influence the actions of frontline staff?
• Organizational structure. How does information management affect key organizational units (e.g., sales, finance, operations)? Which people within these units are particularly affected by information management, or influential over the way it is performed?
• Information types. Which people are involved in the lifecycle of different information types?
• Geographical location. Do standards and information needs vary across different regions?
• External stakeholders. Which external organizations influence the way information is managed? Regulators are particularly important, but there may also be bodies such as trade associations, joint venture partners, etc. that need to be considered when defining policies and standards.
• Type of decision. It is often useful to separate decisions about objectives (“what” we want to do) from those about strategy (“how” the objectives will be achieved), as different people may need to be involved in each type of decision. Undertaking independent reviews or audits of information quality, processes, etc., may involve yet other stakeholders.

It may be worthwhile drawing out stakeholder maps and other graphical representations of the various stakeholders: these will help one to understand their relationships and spheres of influence. Experience has shown that the first and last of the above dimensions can give a lot of insight into the roles of different decision makers and the relationships between them. Identifying the type and scope of decisions that different stakeholders are responsible for can do much to ensure that accountabilities are clear and complete. The figure below shows a simple 3x3 “governance matrix” that can be used to map out these two dimensions. It divides strategic scope into three buckets:

• Steering. At this level, people are concerned with setting overall priorities and balancing trade-offs between different strategic objectives. Decisions are often about marshalling budgets and managing resource conflicts between initiatives.
• Managing. Here the concern lies with allocating assigned resources in order to achieve the prioritised objectives. Decisions are about coordinating people, budgets, etc., in order to achieve targets and deal with risks and events.
• Executing. This level is about performing the day-to-day activities involved with building systems, delivering information, and so on. While going about daily tasks, people make dozens of decisions, and one needs to ensure that these decisions are made in a way that aligns with overall objectives.

The matrix also divides decision-making into three broad types of activities:

• Setting direction. These are the “what” decisions – What are our overall objectives? What standards and policies will have general applicability across multiple initiatives?
• Implementing. This is about “how.” How will we go about achieving those objectives within the context of any specific project or process?
• Assuring. This is about keeping track of what we're doing. Are people acting in accordance with the above decisions? Do our original assumptions still hold? Are we feeding accurate information back to the decision makers, so they can learn, fine-tune, and adjust course as necessary?

Figure 1: Mapping decision type and scope onto the “governance matrix”

Each cell in the matrix illustrates the types of activities that may need to be performed to support a given decision-making activity (setting direction, implementing, assuring) at the different levels of strategic scope. In order to establish an effective governance framework, one must engage with and understand the people who are undertaking all of these types of activity.
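For readers who want to work with the matrix as more than a diagram, here is a minimal sketch (not prescribed by this practice) of the nine cells as a data structure, with a check that every cell has at least one named body; the cell and body names are illustrative:

    # Minimal sketch: the 3x3 governance matrix as data, with a gap check.
    # Cell names follow the text above; assignments are illustrative.
    from itertools import product

    SCOPES = ("steering", "managing", "executing")
    ACTIVITIES = ("setting direction", "implementing", "assuring")

    def find_unowned_cells(assignments):
        """Return the matrix cells with no body assigned.

        assignments: dict mapping (scope, activity) -> list of body names,
        e.g. {("steering", "setting direction"): ["Executive board"]}.
        """
        return [cell for cell in product(SCOPES, ACTIVITIES)
                if not assignments.get(cell)]

Running find_unowned_cells over a draft stakeholder map gives a quick list of decision-making activities that nobody currently owns.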


Like any model, this matrix is a simplification. However, it can give considerable insight when thinking about stakeholders and their relationships. Where broader questions arise, it may help to think of the matrix as a fractal, with different versions covering progressively narrower strategic scope, as illustrated in the figure below. In this diagram, it can be seen that information governance provides a more detailed view of one subset of the overall organizational governance structure.

Figure 2: Information governance relates to broader organizational governance

At any point where there are overlapping groups within a cell, it may be worth considering the other dimensions to help think through how their roles and responsibilities can be demarcated. The matrix is a tool for identifying what clashes might arise and hence where one needs to focus his attention as he delineates decision rights. Good governance is primarily about taking such action, not about writing policies and creating committees.

Step 6: Establish Clear Roles and Responsibilities

Once one understands the stakeholders and their perspectives, she can begin to map out their accountabilities and decision rights. RACI models, as illustrated in the table below, can be invaluable here. These provide a simple, clear way to identify who is:

• Responsible for a decision or outcome, i.e., she undertakes (or manages) the actions necessary to make the decision (gathering and analysing information, convening stakeholder meetings, etc.).
• Accountable for a decision or outcome, i.e., she approves the final decision and is held to account for its results. Note that she may delegate management of the decision-making process to the Responsible party, but she cannot delegate accountability for the decision itself.
• Consulted about the decision, i.e., she must be consulted in the course of the decision making process, so that her perspective is considered.
• Informed about the decision, i.e., she must be informed of the final decision outcome, so that she can take it into account for her own actions.

Body             | Define brand guidelines for page design | Define taxonomy and metadata schema | Apply keywords to individual articles
Brand council    | A/R                                     | C                                   |
Info Mgt council | C                                       | A                                   | I
Editorial Board  |                                         | R                                   | C
Editor           | I                                       | C                                   | A
Author           |                                         |                                     | R

Table 1: RACI models can be used to identify responsibility for key decisions, policies and activities

The starting point for developing such RACI models will be the process models and other assets covered by other “best practices” in this book. As one identifies key decision points in these processes, he can begin to map out the RACI for each decision.
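The chapter prescribes no tooling for this, but as a minimal sketch, a RACI table can be held as a simple mapping and checked for the most common defects: a decision with no Accountable party, more than one, or no Responsible party. The structure below is an assumption for illustration:

    # Minimal sketch: validate a RACI table held as nested dicts.
    # Role strings follow the convention above ("R", "A", "C", "I", "A/R").
    def check_raci(raci):
        """raci: dict mapping decision -> {body: role string}.

        Returns a list of human-readable warnings.
        """
        warnings = []
        for decision, roles in raci.items():
            accountable = [body for body, role in roles.items() if "A" in role]
            if not accountable:
                warnings.append(f"{decision}: no Accountable party")
            elif len(accountable) > 1:
                warnings.append(f"{decision}: multiple Accountable parties {accountable}")
            if not any("R" in role for role in roles.values()):
                warnings.append(f"{decision}: no Responsible party")
        return warnings

For example, a decision mapped to {"Info Mgt council": "A", "Editorial Board": "R"} passes cleanly, while omitting the "A" entry would be flagged; this is exactly the kind of gap that otherwise surfaces later as a boundary dispute.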

Step 7: Establish Appropriate Oversight Bodies

Many key decisions about information management will fall into the cell in the middle of the left hand column of the governance matrix described above – creation and enforcement of appropriate policies, standards, guidelines, etc. This is the detailed work that many will recognise as the bread-and-butter of information governance. It is likely that one will need to establish appropriate bodies to oversee this work.

Four sets of responsibilities are particularly relevant here:

• Who defines policies and standards? This involves work such as identifying relevant standards, balancing trade-offs, framing guidelines, developing templates and checklists, creating examples, and so on.
• Who approves these policies and standards? This establishes that the policies and standards are suitable for use across the relevant parts of the organization.
• Who enforces the policies and standards? What mechanisms are to be used for concerns such as auditing compliance with standards, handling exceptional circumstances (e.g., approving variations), or dealing with non-compliance?
• Who implements the policies and standards? For example, which groups are responsible for applying standard keywords to documents, or for handling retention and disposal of records?

There is no one-size-fits-all way to allocate these responsibilities. The table below illustrates some of the ways in which they may be handled. The appropriate oversight structure for each organization will depend on factors such as size, organizational culture, regulatory environment, and the diversity of geographical locations and professional disciplines in which it operates.

Definition
• Ad hoc – individual groups define their own policies and standards.
• Council – members of individual groups come together in order to agree on policies and standards which they will all use.
• Central – a central unit defines the policies and standards that will apply across the organization.

Approval
• Devolved – whoever defines the standards is also empowered to approve them.
• Executive – an executive body oversees and approves policies and standards.

Enforcement
• Self – people or teams are expected to adhere without the need for independent audit or policing.
• Peer review – compliance with standards and policies is checked through a process of peer review.
• Audit – an independent audit function assesses compliance, and reports to an executive function that then handles issues of non-compliance and enforcement.
• Police – an independent audit function assesses and enforces compliance.

Implementation
• Self – people or teams are expected to perform any work needed to conform to standards and policies (e.g., authors are expected to apply appropriate keywords to documents they create).
• Central – a central support function undertakes specialized functions (e.g., a central group of archivists applies keywords in accordance with the corporate taxonomy).

Table 2: Ways in which responsibilities might be allocated

It can be seen that a key consideration here is the trade-off between central and devolved oversight. As a general rule, central oversight makes it easier to ensure that policies and standards are applied in a consistent way. It may also allow one to optimise the utilisation of specialist skills and resources. On the other hand, devolved oversight can mean that decisions are made closer to the point of creation and use of information, where specific local circumstances are better understood. It can also help to build a greater sense of ownership for policy and standards across the organization.

The figure below illustrates a simple mapping of the first and last of the above concerns (definition and implementation of policy) onto this dimension of centralized versus devolved oversight. Typical oversight styles or bodies are then overlaid onto this map, as follows. (Note that the names used for these bodies can vary widely across different organizations; the figure illustrates some common names.)

Figure 3: Central versus devolved control

• Anarchy: Individual units or teams define their own policies and standards, and hence perform any activities needed to conform to these policies.
• Information Management Council: Representatives of the individual units come together periodically in order to agree on policies and standards. The units are then responsible for performing any activities needed to conform to these policies and standards.
• Information Management Centre of Excellence (CoE): A central team (the CoE) defines policies and standards. Individual units are then responsible for performing any activities needed to conform to these policies and standards.
• Information Management Centre of Excellence with audit: In addition to defining policies and standards, the CoE may also act to enforce these standards, e.g., by conducting reviews or audits. This brings it closer to implementation as well as definition of policy.
• Central Information Management Unit: A central unit is responsible both for defining and implementing policies and standards. Thus it may undertake functions such as maintenance of taxonomies, keywording of documents, records management, etc.
• Central Information Processing Unit: Individual units or teams define their own policies and standards and then contract with a central unit to undertake any activities necessary to implement these policies. (This pattern is also typical of many outsourcing contracts.)

Many organizations separate definition of policy from its implementation. Oversight may then be structured to balance central versus devolved control. When defining the structure of these oversight bodies, it will also be necessary to define their operating procedures, answering questions such as:

• How frequently will they meet?
• What constitutes a quorum for these meetings?
• How are decisions made? (e.g., by majority vote, by consensus, or by executive decision of the chair?)
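To make such operating procedures concrete, here is a minimal sketch of a charter record with a quorum check; the fields and example values are assumptions for illustration, not prescribed by this practice:

    # Minimal sketch: an oversight body's operating procedures as data.
    from dataclasses import dataclass

    @dataclass
    class Charter:
        body: str
        meeting_frequency: str   # e.g. "monthly"
        quorum: int              # minimum members present for valid decisions
        decision_rule: str       # e.g. "majority vote", "consensus", "chair decides"

    def has_quorum(charter: Charter, members_present: int) -> bool:
        """True if the meeting can make binding decisions."""
        return members_present >= charter.quorum

    # Illustrative example:
    # council = Charter("Info Mgt council", "monthly", quorum=5,
    #                   decision_rule="majority vote")
    # has_quorum(council, 4)  # -> False: defer decisions or reconvene

Writing these rules down in one agreed place, whatever the format, is what prevents later arguments about whether a decision was legitimately made.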

Step 8: Focus on What's Important

People make hundreds of decisions every day. Let them. There's a word for governance structures that focus too heavily on the minutiae of day-to-day decisions while losing sight of the bigger picture: bureaucracy. Effective governance focuses on the small set of decisions that have a big impact on overall performance.

How do you recognize what’s important? Here are some questions to ask:

• How do information assets relate to wider strategic objectives? Assets that contribute directly to strategic objectives need to be governed more carefully, e.g., with active ownership throughout their lifecycle and strong enforcement of applicable policies.
• How do information assets impact executive and management decision making? Information that directly supports decision making also needs to be governed carefully, with particular attention to managing concerns such as the accuracy and availability of relevant information.
• How do information assets relate to external stakeholders and perspectives? Information that must be provided to regulatory bodies, partners, customers and other external agencies is likely to need different treatment than that which is purely for internal use.
• What other resources are affected by this information? Information may contribute to the management of other assets. For example, architectural and engineering diagrams may be necessary for ongoing maintenance of property and plant. Again, such information needs to be governed carefully, e.g., through attention to archiving and retention policies.
• How do information management processes and policies add value to other processes? Good governance directs attention towards the assets that are most helpful to other units within the organization. Thus, the oversight groups discussed above should focus their energy on helping process owners, project teams, etc. do their jobs. Oversight groups generally do this by providing clear guidance on policy and best practices and by helping to resolve difficult issues, not by being “process police.”

Focusing on what's important also makes it easier to implement a governance framework: expending too much energy and goodwill on minor decisions leaves no time to get people to adopt the framework for major decisions.

Step 9: Ensure the Details Are Attended To

Although it's important that executives, steering bodies and oversight groups don't get caught up in the weeds, it's also true that “the devil is in the details.” Good governance will separate overall vision and strategy from the details of policy and execution and ensure that there is clear ownership of all elements. Typically, oversight bodies will take responsibility for defining the detailed elements of the framework: RACI models, principles and policies, standards, etc. Ensure that ownership of these elements is clear and that work on them is appropriately prioritised and resourced.

It is worth noting that audit is an important element of governance: its role is to give internal and external stakeholders confidence that policies and procedures are appropriate and are being followed in practice. So the governance framework needs to define the relationship between oversight groups and auditors and ensure that the details necessary for successful audit are attended to.

Step 10: Enable Action

Oversight bodies and steering groups should not be talking shops – they exist to enable rapid and purposeful decision making. In order to maintain a focus on action, consider:

• Devolving decisions to individual teams and units wherever possible. Decision makers who are close to the “front line” tend to have the most current and relevant information. Policies and standards can then be framed to support them, e.g., by defining consistent decision criteria across the organization.
• Building in central review points and gateways for high-impact decisions. These help ensure that local decisions are consistent with overall objectives.
• Creating a clear framework of priorities. Again, this supports local decision-makers to act in accordance with overall objectives. (This is also an important success factor for any initiative to set up an information governance framework. One will generally make most progress by focusing on a small number of areas at any one time, only moving to lower-priority items once the high-priority ones have been addressed. This helps one to visibly deliver value from an early point, thus building morale within the team and credibility across the wider organization.)

Step 11: Manage Relationships

People need to work together to make decisions that contribute to overall objectives. Policies and standards support this, but the key to success is to establish and maintain good working relationships between the stakeholders. As you establish the governance framework, seek to:

• Build common understanding of how roles and responsibilities are allocated. While developing models, such as the RACI tables described above, ensure that people clearly understand the boundaries of their responsibilities, the mechanisms for consultation, handover points, etc. If people have differing views of the boundaries and processes, then they will continue to make inconsistent decisions.
• Clearly separate roles where confusion often arises. For example, project management and process management are different – a project is a one-off activity while a process is ongoing. Think sprint versus marathon. It is possible for one person to manage the implementation of a new process and then to transition into the process owner role, but only if he consciously changes his emphasis as he makes the transition. Make the differences clear to people, so they know who is responsible for every decision.
• Ensure that oversight bodies are respected. These bodies may be directive, advisory or even policing in their overall approach, depending on the organization's culture and their position within it (e.g., firms with highly centralized authority will probably position their oversight bodies to create and enforce mandatory policies, while those with highly devolved lines of authority may tend to place these bodies in more of an advisory role). But to be effective with any approach, oversight bodies need to have the respect of managers, editors, authors and other stakeholders. If they build adversarial relationships, then they will probably fail, even with strong executive backing (unless the executive body is truly dictatorial). Conversely, if they earn people's respect and become the place people turn to for advice and assistance, then they can exercise effective influence even without direct authority.
• Use more carrots than sticks. Successful organizations ensure that people are motivated to do the right thing. If they do need to use a stick, they ensure it has executive backing – nothing damages credibility as quickly as people discovering ways to go around an oversight body by talking to “higher authority.”

Step 12: Build a Framework for Clear Communication

People need information to make good decisions. They need to understand overall objectives and strategy, and to be aware of the current status and context. They also need to know what work others are doing in related areas. So while building a governance framework, one will need to ensure that it is supported by appropriate information flows. Pay attention to concerns such as:

• Building a clear communication plan – who needs to be told what, and what channels will be used to communicate – from the outset. (And remember that communication should be two-way: build channels to gather feedback as well as broadcast policy.)
• Actively communicating roles and responsibilities, objectives, policies and standards, etc. as they are established and refined.
• Setting clear expectations. For example, be realistic about the scope of policies and about how exceptions will be managed.
• Building clear linkages from policies and standards to corporate goals. As well as defining policy, explain the rationale behind it. Most decisions involve some element of judgement and even politics. By setting out a clear rationale for policies and standards, one ensures that decisions can conform to the intent as well as the letter of policy. Consider using techniques such as Balanced Scorecard strategy maps to clarify the links between goals and standards.
• Managing the mechanics of meetings. It is not enough to bring people together in oversight bodies, councils, and forums. When these groups meet, they must operate effectively. So ensure that basic meeting discipline is followed – clear agendas, materials circulated in advance, defined procedures for making decisions (is it by consensus or by majority vote, for example, or does the chair decide? What constitutes a quorum?), and so on. Think carefully about the frequency of meetings. If a steering group meets too infrequently, decisions will be delayed and initiatives will be held up. On the other hand, meeting too frequently wastes resources and creates the temptation for executives to micromanage. There is no universal “correct” frequency for meetings; every organization moves at its own pace (and at the pace set by its competitive environment). It is important to become attuned to this pace.
• Providing consistent messages across multiple communication channels. Don't lock information up in documents, reports, or portals. Use a variety of tools, and pay careful attention to setting up appropriate information radiators (publicly visible, easily understood, and regularly updated displays of key messages and performance indicators).

Step 13: Accept and Address Conflict

Conflict is a natural state within organizations (and especially in complex organizations, joint ventures, alliances, etc.). People have differing goals. Resources are constrained. Different perspectives and experiences lead naturally to different opinions. Don't try to ignore or hide this conflict – it will eventually surface. And if it's not managed, it will probably surface in extremely unhelpful ways.

Aim to clarify how areas of potential conflict will be managed. Think about where conflicts typically arise – for example, contention for key specialists at certain times of the year – and begin to map out how these conflicts will be managed. How will priorities be decided? Who has the authority to decide? Who must they consult and what factors must they consider? What are the bounds to the trade-offs and compromises they can make?

Step 14: Build in Feedback and Learning

It's impossible to get everything right. And even if one were able to do so, what's right today will cease to be right as an organization and its competitive context evolve. It's important to create feedback loops to learn from experience and adapt and improve governance over time. Feedback can come from mechanisms such as:

• Reviews and retrospectives. Build in points to step back from the day-to-day operation of a project or process and think about how it can be improved.
• Well-defined metrics and performance indicators. Good performance indicators aren't just pulled out of the air: “Wouldn't it be nice if we could measure the percentage of successful searches.” They come from a model of cause and effect: “If we improve our document tagging workflow here, then people will pay more attention to entering the right keywords, and hence the percentage of successful keyword-based searches will rise.” With such a model, the performance indicators provide feedback: if they don't go in the expected direction, then either the original model needs to be rethought, or its execution needs to be improved (a minimal sketch of this indicator follows the list).
• Listening to people. It's amazing how much you can learn ...
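As a minimal sketch of the indicator named in that example (the search-log format here is hypothetical, purely for illustration):

    # Minimal sketch: the "percentage of successful keyword-based searches"
    # indicator from the cause-and-effect model above. The log format is
    # an assumption, not a real system's schema.
    def successful_search_rate(search_log):
        """search_log: iterable of dicts such as
            {"query": "retention policy", "clicked_result": True}

        Returns the fraction of searches where the user clicked a result.
        """
        events = list(search_log)
        if not events:
            return 0.0
        return sum(1 for e in events if e["clicked_result"]) / len(events)

If the tagging workflow improves but this rate does not move, it is the cause-and-effect model itself, not just its execution, that needs revisiting.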

Try to build in feedback at two levels. First, there's feedback about individual decisions – how does the decision making process feed back information about whether decisions are correct, and about how they can be improved? Second, there's feedback about the decision making process itself – how will one know if it's operating effectively, and how it might be improved?

And after obtaining feedback, one should act on it. If people see that feedback is not being responded to, they’ll stop providing it.

Now I’ll discuss some risks and pitfalls.

Pitfall 1: Getting Stuck in the Weeds

This is the number one risk for governance initiatives – people get bogged down in the details of policies and compliance.

Policies and standards are often the most visible part of an organization’s governance; so many people equate them with governance itself. But policies and standards result from and support good governance – they don’t define it. They are the tactical elements that emerge after defining the overall approach.

Likewise, compliance is the backward-looking part of governance. It helps demonstrate that decisions were taken in accordance with agreed processes and policies. The forward-looking part of governance creates structures that help people make good decisions in the first place. These good decisions are what add real value for the organization.

If one focuses too much on policies and compliance, then he risks missing this bigger picture – the overall framework of stakeholders and well-considered decision rights that is the core of good governance. One also risks losing the attention of senior management. Most executives “know” that governance is important, but they get bored by the details. They may even resent having their time “wasted” by concerns such as compliance, when they really want to be focused on things like meeting revenue targets. When discussing governance with executives, focus on the overview – how governance helps meet overall goals – not the details.

Symptoms of this pitfall include:

• Trying to cover every possibility. If one is trying to define a policy to handle every detailed case and exception, he is in the weeds. It generally makes more sense to address the overall principles and then deal with exceptions as they arise.
• Focus on policing rather than direction setting. Oversight bodies that concentrate on reviewing decisions after they've been made and telling people where they've gone wrong will find it hard to exert any real influence in most organizations.
• Oversight bodies that become decision bottlenecks. This is often a sign that these bodies are trying to make too many decisions, that they're caught in the weeds of tactical decision making rather than focused on the small percentage of strategic decisions.
• Analysis paralysis. If people spend a lot of time arguing about trivia, or thrashing backwards and forwards between two options, then they are probably missing some element of the overview.
• Internal focus. If people focus on internal benchmarks (e.g., doing a little better than last year), then they are at risk of being blindsided by external developments (e.g., missing an important industry trend).

A common thread to all these symptoms is that, by losing the overview, a team or oversight body not only fails to devote sufficient attention to strategic concerns, but also damages its own credibility and authority. As it gets enmeshed in tactical concerns, its authority to set overall direction is undermined.


Pitfall 2: Gaps in Critical Areas

Closely related to the above, it is easy to focus on some areas of governance and leave gaps in others, especially when caught in the weeds. Gaps can arise in areas such as:

• Stakeholders. People miss key stakeholders, for example, focusing too much on regulatory compliance and overlooking the need to support the information needs of key internal stakeholders.
• Ownership and accountability. It's easy to overlook key assets or process areas. If any area has unclear ownership (no owner, or multiple people who think they own the same asset), that's a recipe for conflict and wasted resources.
• Policies and standards. People often focus on the standards they know about – the industry standards that everyone's talking about, for example. But just because it's sexy in the industry right now doesn't mean it's the most pressing concern for your organization.
• Linkage from strategic objectives. It's all too easy to define a whole raft of “best practice” policies and standards without considering how they link to organizational objectives.

If key stakeholders are not considered up front, then they are likely to block decisions or require late changes to them. Likewise, unclear ownership of resources or decisions is likely to lead either to overlap and conflict, or to having important concerns “fall through the cracks.” And if the link from business objectives to operational policies and standards is missing or poorly understood, then people are probably going to make poor decisions when exceptions or complex cases arise.

Pitfall 3: Fudging It

People often leave things unclear in an attempt to avoid conflict, or to leave scope for freedom of action. This doesn't work. You need to get the overall framework as clear and complete as possible (without getting caught up in the weeds, of course). There will always be plenty of ambiguities and unexpected circumstances – there's no need to create unnecessary ones.

Pitfall 4: Failure to Build Buy-In

Even with strong executive commitment, it is all too easy for powerful stakeholders to block the establishment of clear governance structures. At an operational level, people can always find ways to bypass agreed-upon processes and policies. This may be done because they lack overview of the broader context, or because they feel they have something to gain by doing things their own way. Whatever the case, it is almost always worthwhile investing significant effort in listening to people, trying to understand their perspectives, and helping them to understand the overall objectives. Without their buy-in, it is a struggle to establish effective governance.

Pitfall 5: Failure to Deal Constructively with Conflict

If decision makers and oversight bodies spend too much time engaged in head-on conflict, they risk alienating key stakeholders and diverting energy from key decisions. Conversely, over-reliance on conflict avoidance can lead to important decisions being delayed while consensus is sought, and to meaningless compromises when decisive action is required. Effective governance is about neither avoiding nor creating conflict – it is about finding constructive ways to resolve it.

Pitfall 6: Death by Checklists

Checklists are useful tools, but followed slavishly they lead to the creation of documents and other assets simply to check the box. So this practice doesn't contain a list of policies that must be defined. What one needs will depend on specific situations – different organizations, information types, business models, regulatory environments, etc., call for focus in different areas. The other practices in this book should be used to help identify areas of importance. This practice should be used to help set up an appropriate governance framework around chosen focus areas.

Summary and Conclusion

Well-defined governance allows people to focus their energy on the decision, not on the decision-making process. It ensures:

• We know which decisions have the biggest effect on our objectives.
• We know who to involve in making decisions – there is no need to rethink this for every decision.
• We've agreed who to involve up front – people don't derail the process with boundary disputes and last minute queries.
• We've agreed to the decision making process up front – there is no politicking and arguing about decision rights.
• We've agreed on how to track outcomes.

This all increases the likelihood that good decisions will be made. Conversely, poorly-defined governance leads to:

• Wasted resources and inefficient decision-making. People investing energy in politicking and boundary disputes. Questioning of the decision making process, rather than a focus on the decision itself. Time spent on peripheral concerns. Decisions taking longer than necessary.
• Decisions that don't stick. People undermine or ignore decisions they consider to be “illegitimate.”

• Poor decisions. Key information and stakeholders are ignored or considered too late. Decisions are driven by power bases, not organizational objectives. People create conflict or unclear decisions.

Organizations that avoid discussing governance ultimately spend a lot of time on it, addressing it over and over again for each decision as they argue about accountabilities and due process. They end up with little energy left for the decision itself.

At one level, addressing this is simple. Good governance is merely a matter of defining who makes what decisions, how they make those decisions, and how to track the effectiveness of these decisions. At another level, untangling the complex web of relationships in most organizations can be incredibly complex. Doing this in the face of an accelerating pace of change can be challenging indeed. Organizations that do it well will have a distinct competitive advantage. Such organizations will use tools like the ones discussed above to:

• Set up a clear framework identifying the roles, responsibilities and relationships of decision makers and the processes and criteria they will use when making decisions.
• Support this framework with a well-defined set of policies, standards and guidelines, with suitable oversight bodies to develop and enforce them.
• Communicate this framework widely within the organization and actively build people's buy-in to its operation.
• Use the framework to address critical questions of prioritization, resource contention and other conflicts, rather than submerging these issues in politicking, bland consensus building, and conflict avoidance.
• Build in mechanisms to gather validated information about decision-making performance, and hence use that information to trigger learning and improvement.


In simple times, we may be able to live with poor governance. As the competitive environment gets more complex or as decisions get more time critical, we need to ensure that our governance structures and processes are clear.

References

1. Institute on Governance, http://www.iog.ca/page.asp?pageID=3&htmlarea=home



ELIZABETH (LIZ) ORNA ([email protected]) made her first information audit more than 30 years ago—before the term came into use. She has long practical experience as an information manager and information consultant, has written extensively on information auditing, and has told the stories of many audits in her books.

At the time of the information audits detailed in this article, JANE JENNINGS ([email protected]) was the Knowledge Management Co-ordinator for one of the largest local authorities in the UK. Essex County Council covers an area between London to the west and the North Sea to the east. She has worked for this organisation for many years and in a variety of roles, all relating to librarianship, information management, knowledge management, or publishing. Jane is now the Internet Strategy Manager responsible for developing a new, much-needed website for the county.

How to Make Information Auditing Part of Your Repertoire for Strategic Information Management

Auditing Your Information Management Practices

Liz Orna and Jane Jennings



Abstract


School governors complained that although they were inundated with information, it was difficult for them to obtain the right information when they needed it. To find out what was happening with the provision of information to schools and governors, it was agreed an information audit might provide the answers to the problem.

The audit resolved the specific problem by drawing on the relevant inside knowledge of organization staff members who acted as the audit team, supported by the expertise of an external consultant, and the experience of the users in trying to do their work. Users took an active part in shaping the recommendations for action that resulted from the audit, and in carrying them successfully into practice. Steps 1–5 describe how this was done. The results and the experience gained in this first audit led to continuing work within the organization.

This work has contributed to resolving a general problem associated with information auditing: how to get value from a first application of the process, while enhancing the process' value by incorporating the knowledge and skills gained into the organization's information-management practice. Accounts of doing this over a period of time are in very short supply; Steps 6 and 7 provide the inside story of the original audit team leader's practical work within the organization to resolve this problem over the years since its first use of information auditing.


Background


Since information auditing plays a key role in this case study, we begin with a definition of the term; it is the one used in our work, and underlies our practices. The definition was formulated by Aslib Information Resources Management Network (now NetIKX) in the mid-1990s and has been widely cited in the literature:

“A systematic examination of information use, resources and flows, with a verification by reference to both people and existing documents, in order to establish the extent to which they are contributing to an organization's objective.”

Common purposes for information audits include:

• Establishing a basis for developing information strategy or policy to meet business goals.
• Defining the role of library/information services in meeting information/knowledge requirements of corporate objectives.
• Finding out whether essential information reaches the people who need it, and what prevents its communication.
• Improving the management and use of information from the organization's outside world of customers, clients, public and government.
• Integrating the use of IT and information content.
• Finding out where information and knowledge are most critical for good performance in an organization, as a basis for making best use of all the knowledge its staff possesses, acquires, and creates. An audit also allows you to use information resources productively to support internal and external knowledge activities.

Audits have been used for specific purposes as diverse as:

• Finding out how assessment is carried out in different faculties and departments of a university, as a basis for an integrated information approach.
• Establishing the scale of the collections and the state of their documentation in a large museum, as a starting point for developing an integrated information strategy.

We offer this case study because there are few accounts of information auditing that include evaluating the method as part of an information audit, or implementing the audit's recommendations. None that we know of describe how an information manager in the organization continued to use and develop the method in other situations over a period of years. The story of the first audit is briefly told in ‘Information Strategy in Practice’ (1), and the evaluation is described in full in ‘Discovering domains: Using an information audit to nurture a community of practice’, an unpublished thesis by Jennings (2).

The Challenge

Although school governors in the UK are all unpaid volunteers, their role and duties are statutory. It is their responsibility to act as critical friends to the head teacher (head) by challenging his decisions and making sure the school meets government-set targets. Governors appoint staff, including the head, and oversee both the budget and curriculum. There are times when the role of the governors and that of the head become blurred. This can, and sometimes does, lead to conflict. It is important, therefore, that governors understand their role and their statutory duties. It is also very important that they have the means to share information and knowledge, both with their own governing body, and with other governors who would benefit from shared expertise.


A group of school governors in the county of Essex got together on their own initiative to discuss information and communication. They felt overloaded with information, yet lacking the information they really needed. They took the problem to the County Council, their Local Education Authority (LEA), and asked the help of the Governor Services Manager in setting up an electronic Community of Practice (CoP) to help them share information. One of the most quoted definitions of CoPs is given in 'Cultivating Communities of Practice' (3): "Communities of Practice are groups of people who share a concern, a set of problems, or a passion about a topic, and who deepen their knowledge and expertise in this area by interacting on an ongoing basis".

Wenger et al. go on to say that communities consist of:

• An identified domain or focus of interest.
• The community members.
• The practice that contains information resources the community shares.

A first attempt to find out by questionnaire what information the 9000 Essex school governors needed failed to deliver any meaningful results. The Governor Services Manager then turned to the Council's Knowledge Management Coordinator (KMC) for advice on other approaches.

A New Tool to Meet the Challenge—Information Auditing

Their discussion revealed that, although Governor Services provided school governors with all the information they thought governors needed, Governor Services did not know:

• What happened to the information once it reached a school.
• If or how it got to the governors.
• If or how head teachers kept their governors informed.
• If or how governing bodies managed their information.
• What other sources of information governors used or had access to.
• If governors shared information and knowledge and if so, with whom.

This scenario suggested to the Knowledge Management Coordinator that an information audit could produce answers that would give a basis for creating a Community of Practice to meet the governors’ actual needs.

The Skills Needed to Perform This Best Practice
To achieve best practice in information auditing, the team needs the following blend of skills and knowledge:

• Project management.
• Interviewing.
• Data analysis.
• Report writing.
• Proofreading and editing.
• Administration (a good administrator, with guaranteed time to devote to the task, is critically important for success).
• Presentation.

These skills were vital, but some were missing from the skill set the Project Manager and the KMC initially thought was needed for the governors' audit. The Project Manager's skill set consisted of:

• Project management.
• Information management.
• Knowledge management.
• IT.
• Information security.
• Administration.
• Corporate strategy.
• Knowledge of governor services.


Not only did this list lack key skills, it also included skills that we did not need. IT, information security, and corporate strategy were not required skills, and the people in these roles were hard to place in the audit.

Experience shows that it is best practice to define roles clearly at the start, and for all team members to take on more than one role. This helps keep the team to a manageable size, and allows the team members to become fully engaged and familiar with the process. It can also give opportunities for members to develop new skills and knowledge.

Step 1: Define Your Terms of Reference
Use this step to safeguard the audit against these typical threats to its success:

• Starting the audit without support and guaranteed resources from the top of the organization.
• Not making sure that all the key people on the client side have the same understanding of the objectives and outputs of the audit.
• Not achieving understanding with the people whose co-operation you need as information sources. A questionnaire from someone in the organization whom they have never met, or an email from their manager telling them to attend an interview with the audit team, is more likely to make sources feel threatened than cooperative.
• Handing the audit over to outside consultants without making sure they work with in-house information staff and progressively transfer responsibility to them.

Define Your Purpose
Before starting an audit, representatives of the main stakeholders must define its purpose. This might seem obvious, but it is not always done. Once the audit's purpose has been agreed, stick to it!

In the governors’ audit, the purpose was to determine and establish:

• How governors received information.
• What information they received.
• Whether this information met their needs as to content, format and presentation.
• Areas of good practice, shortcomings, and areas for action.

The Governor Services Manager and the KM Coordinator agreed with the consultant’s suggestion to develop the definition into these objectives:

1. To consider the objectives of schools and governors, assess what those objectives imply in terms of using information to support the relationship between both groups, and identify what should be happening in that respect.
2. To investigate what is happening now.
3. To compare the current situation with the objectives, and identify good and less good matches between them.
4. To propose changes which will build on examples of existing good practice, help to overcome the shortcomings found, and to recommend a programme of implementing them which will make a strong foundation for the "Community of Practice".

Start by making a basic definition of the nature and scale of the problem that the audit is to address. Define audit objectives and make sure from the start that everyone concerned has the same understanding of them. Do not let them change or drift in the course of the project, but do look at them as part of the evaluation to assess for future reference whether they were appropriate.

In making a business case for an information audit, base it on corporate objectives and your understanding of the parallel world of organizational politics and power relations. Management can hardly repudiate the first; the second helps you tailor the case you are making, and win influential support for it.

Create Governance
The Head of Information Services, a member of the corporate board, agreed to sponsor the audit, select a steering group, and provide the necessary budget. The Governor Services Manager became Project Manager, with the Knowledge Management Coordinator as deputy and lead auditor. Once these roles were established, they set about assembling an audit team.

As there was no in-house experience with information auditing, the group sought outside expertise. A consultant with experience in carrying out information audits and training auditors was appointed to train the in-house team and advise and support them as required.

The following lessons will help to build your team:

• Spot potential sponsors of auditing initiatives, build good relations with them, and—when an opportunity offers—educate them as apostles for information auditing who can enlighten senior management.
• Select a small audit team with the right skills, who understand what they are doing and fulfill multiple roles.
• Include members of the steering group in the audit activities. This gives them a better understanding of and a greater commitment to the process.
• Find a top-notch administrator who will keep everything and everyone on track. The administrator should be able to carry out other roles such as interviewing or report writing.
• Build thinking and talking time into planning, both within the audit team and between the team and the people from whom they will need information.

If you decide you need outside support, choose an appropriate consultant, and agree on their brief. Never allow your audit to be run entirely by outside consultants, with in-house people used as servants rather than equal partners. Be prepared to fight this in reasonable terms, and make sure the case against it is on record!

Define Stakeholders and Scope
Once the audit's purpose is clear, you can agree who should be involved—who your stakeholders are—and any constraints on those stakeholders. Our audit's purpose indicated that those who created and supplied information—the local education authority staff—as well as those who received it—governors and head teachers—should be involved in the audit. However, the size of the LEA and the number of schools involved (more than 500) meant that we had to choose several schools to work with, rather than include all of them. Parents were also stakeholders. They supplied governors with information and had a decided interest in the running of the school, but because of the constraints determined by the school year, they had to be left out of the scope of the audit.

The following lessons will help you define your stakeholders and your scope:

• Identify all the stakeholders (especially any who may have reasons to oppose the audit), and win their support. Use the most powerful supporters to contain any risks from potential opponents.
• Long experience in the organization, and knowing a lot of people in it, is key to success. If you do not have this experience yet, meet as many stakeholders as you can, and build relations of mutual help and trust. Being able to give your stakeholders useful information is a great calling card.

Account for Constraints
The school year was a determining factor in planning our audit; it had to be completed by the end of July 2003, when the school year finished, so that we could implement its recommendations after the main summer holiday. All audits should have a time limit just long enough to do the job properly, but tight enough to maintain the pace. That way, nobody risks getting bored, and pressure adds to enjoyment.

tip: Identify constraints to be met, and decide how to deal with them; make sure everyone concerned knows the decision; prioritize to do what is most essential, and sacrifice what is least critical.

Match Methods to the Situation and Objectives
Key features in the situation that led to the audit, and the objectives proposed, were:

• The information users and their needs are the heart of the initiative.
• The initiative had come from a group of potential users.
• The relation between Essex County Council and governors was not employer/employee, but the two groups had critical mutual responsibilities.
• Information products (not just information content, but the format and presentation of the 'containers' embodying it) have a central part in the relationship, and a new information product (an electronic CoP) is part of the desired outcome.
• Both parties want to find out where the products match users' needs well, and where they fall short, and to do something to improve the match.

The situation pointed to the specific information auditing method referred to in the Background section. While the definition from which it is derived originated in the UK in the mid-1990s, it has interesting echoes of the less often cited but powerful ideas of the respected US thinker R. S. Taylor (4).

If the audit is based on that definition, it will have a focus for its questions and observations, and a point of reference for determining how well information contributes to achieving objectives. Beginning with the question "What should be happening?" ensures that the effort invested in the audit is purposeful and economic, and that auditors do not find themselves with a load of data but no idea of how to derive value from it. As Charles Darwin remarked: "All observation must be for or against some view if it is to be of any service" (5).

tip: Make sure the audit team and steering group understand the method and why it is being applied in this instance. Encourage them to question, object, and discuss until they are satisfied that they have had a fair hearing and their reservations have been noted.

The information audit began on 2 April, 2003 with a two-day training workshop for the audit team. It ended on 28 July, 2003 at the audit team’s final meeting. An audit team successor group, composed of school governors and LEA staff, continues the work of carrying out the actions from the audit and establishing the CoP.

Step 2: Educate the Auditors and Map "What Should Be"
An audit team cannot do its job unless its members understand what they are going to do, how it will be done, and why this particular way of doing it is the right one for this situation. Understanding is most likely to take root and grow if team members have an immediate chance to apply what they have been told. Early workshop sessions gave the team members this chance. In the first session, the consultant introduced the basic ideas underlying the auditing method, and in the next session he involved the team in applying that method to create the audit's first vital document—a map of What should be happening.

Figure 1: 'What should be' map, as used in our initial workshop

tip: Do not talk to the group for too long! Interleave telling with doing.

Team members were asked to draw on their individual experience to set out their own understanding of how the County Council should provide information for the governors, and how the governors should be using this information in their work.

The individual team members built up a collective "map" of their answers to questions on the key themes, shown in Figure 1. The questions and themes evolved in the course of the consultant's work over several years. They are designed to help auditors to work out what the organization's overall objectives—and its objectives for the audit—imply for what it should be doing with knowledge and information.

The use of a strongly visual map format, displaying the questions on different themes on separate branches and clustering brief answers around them, also came through experience. It developed partly from working with design students and learning from them the value of visualizing ideas, and partly from seeing the method in use in an audit in a UK public regulatory body.

Each member started with a copy of the map and a supply of Post-it notes, on which they were invited to write their answers to the map's questions. As they wrote, the project managers and the consultant collected the Post-its and put them up on a large version of the map on the wall. This displayed a collective picture of the team's ideas, built up from their individual answers.

This low-tech method was chosen because writing their own individual and anonymous answers means that each member of the team makes an equal contribution. This equality allows for balanced discussion when the whole range of answers, grouped under main themes, is visible to everyone.

The workshop ended with the audit team answering the question "What is actually happening?" in the same way—in the light of their own experience. They used their answers to create a parallel map to set alongside the one of what should be. This "What is" map proved invaluable when it came to evaluating the audit's impact (see Step 6).

tip: Keep to the process and the sequence; it has been tested and it is best practice. Do not be tempted to leave out the questions in workshops. Make everyone answering the questions work alone, and follow this exercise with discussion, rather than doing it the other way around.

After the workshop, the Project Manager and KMC analyzed the maps that the team had created. They entered the answers to each of the questions on both the What should be and What is maps into a spreadsheet and scored them against the stakeholders’ answers.

Figure 2: 'What is' map, as used in initial workshop

The Project Manager and KMC then used these answers to create a "template" of criteria for successful use of information. The team applied this template against the audit's findings on what is actually happening, showing them where reality fits with what should be happening, and where it does not (6).
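For illustration, the sketch below shows, in Python, the kind of criteria-versus-findings matching that such a template supports. It is a minimal sketch only: the themes, criteria, and scores are hypothetical, and the audit team did this work in an ordinary spreadsheet rather than in code.

    # A minimal sketch of matching "What should be" criteria against
    # "What is" findings. All themes, criteria, and scores below are
    # hypothetical; the real audit used a spreadsheet for this step.

    should_be = {
        "confidentiality": "Governors know what information they may share, and with whom.",
        "distribution": "Information addressed to governors reaches them directly.",
        "contacts": "Governors know who to contact in the local authority.",
    }

    # Findings scored from 0 (criterion not met) to 2 (fully met).
    what_is = {
        "confidentiality": 0,
        "distribution": 1,
        "contacts": 0,
    }

    def gap_report(criteria, scores, target=2):
        """Print the criteria where reality falls short of 'what should be'."""
        for theme, criterion in criteria.items():
            score = scores.get(theme, 0)
            if score < target:
                print(f"[gap] {theme} ({score}/{target}): {criterion}")

    gap_report(should_be, what_is)

Whatever the tooling, the point is the same: every finding is judged against an explicit criterion, so the output is a list of gaps rather than an undigested pile of data.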

Step 3: Plan to Collect Information and Follow Through
It is important to keep the momentum going so that auditors move quickly from laying the groundwork to collecting the information that builds on it.

The Project Manager and the KMC produced their analysis within a few days and used it to:

1. Create a statement of What should be: the criteria for success under each of the five key themes of the audit. These criteria would act as a template against which they could match the governors' responses on the actual situation as they experienced it.
2. Identify themes and patterns in the answers to each question.

For example, one of the criteria for success was whether the governors knew how confidentiality and other related issues could affect the sharing of knowledge and information.

The Project Manager and the KMC used the main issues that emerged to help devise appropriate questions for the governors (see Step 3). On the consultant's advice, they sought more insight by interviewing key stakeholders and using that experience to refine the questions. Interviews with experienced local-authority staff, school governors, and head teachers identified new areas for exploration. For example, this question was suggested by a head teacher: Who manages the information that comes to schools and is intended for the governors?

Figure 3: 'What is' map, as used in the Governors' workshop

The KMC and Project Manager tested the questions at a pilot workshop for LA staff who were also school governors, and for clerks to governing bodies. The members of the audit team and the consultant ran the workshop. It followed the same format as the map-building part of the training workshop (see Step 2), but the governors were asked only to apply their experience to the "What is" questions, because this was a totally unknown area. The results of the pilot were used to refine the questions into their final form. The team used this refined question set at a series of governor workshops that it went on to run.

The team asked the chairs of governors at representative schools if they or a member from their governing body would be willing to attend one of the workshops. There was a slight worry that the majority of participants would be chairs, but in fact the attendance included all types of governors, from new parent-governors, to those with many years of experience.

In all, four workshops were held—three in localities close to where participants lived, and one at County Hall. While those in the localities were well attended, County Hall turned out to be the favored venue, possibly because occasions for governors to visit the headquarters of the Council were rare. Indeed, one governor said in a later interview (see Step 6) that visiting County Hall had been the best part of the audit. We applied this lesson in future audits.

tip: Get the exchanges with stakeholders right from the start, and keep them going all the way to the end.

Step 4: Analyze and Interpret What You Collect and Share That Data with Stakeholders
The most important lesson of this step is to keep your communication with stakeholders going from start to finish. You depend on the stakeholders for the quality of information you gather, and even if you have treated them politely to get it, experience shows (1) that you will endanger the audit if you do not show them how you have used what they told you.

Once again, the Project Manager and the KMC analyzed workshop outputs. They compared the actual situation as governor stakeholders experienced it with the What should be template, revealing risks, opportunities, and situations needing urgent action. It became clear, for example, that many governors were unsure about issues of confidentiality. As they often deal with personal information and sensitive issues, they needed to know that they could share information safely without breaching confidentiality. Training and processes to reduce the risk of inappropriate information sharing were an urgent need.

The Project Manager and the KMC were now able to start report writing, with advice and support from the consultant. They presented an interim report to the steering group, who approved it for presentation to the governors. The governors were invited to County Hall for a meeting run by the Project Manager and the KMC, and observed by the consultant. The turnout was very good, far larger than expected. But the governors did not like the report as presented by the Project Manager. They felt it was too long, too academic, and not always in plain English. They said it was not what they had asked for.

The KMC took notes which could be used to make amendments to the report. She also noted issues that were emerging for the first time, such as the observation that 25 percent of governors did all the work, while others did not take their share. The problems faced by new governors were also voiced here. Both issues proved significant and had to be addressed.

Step 5: Report and Recommend to Make the Audit the Start of Action
Of all the steps in information auditing this is perhaps the most critical, as it is the one with most influence on whether there will be any return on the effort and resources invested in the audit.

Revise the Report
The rejection of the interim report came just two weeks before the date booked for presenting the final report to the sponsor and the governors. The audit team had to work quickly. At the team meeting that followed the meeting with governors, the team decided to do the editing then and there. Each team member took a section of the report and significantly reduced it. After this exercise, the Project Manager and KMC re-read the whole report to ensure it made sense, made necessary corrections, and added the appendices. These contained material that had been cut from the main body of the report, so that the information was still there, but presented in a format that did not hold up the main flow of the report.

Details from the findings that had been removed from the main report were also used to create an information product specifically for governors: a culture check (7), based on the types of situations the governors had experienced.

Create Action-Focused Recommendations
A long list of detailed recommendations is liable to delay decisions. In some situations, it can be an excuse for inaction; in others, it makes those on the receiving end feel they cannot find a starting point and will never manage all of the recommendations. On the consultant's advice, the number of recommendations was kept to a minimum, with emphasis on the action needed to achieve the changes that the audit showed were required.

When the final report was presented to the sponsor, the Head of Information Services, he approved the recommendations for the LA, and made resources available to implement them. When we presented the report to the governors, they acclaimed it as a true account of the current situation (which confirms the value of courtesy to those who provide information for your audit).

One of the most significant proposals in the report was to create a “successor group” to implement the recommendations. A group of governors agreed to form the group, to carry out the recommendations, and to report back on the outcomes (see Step 7).

Follow Recommendations with Action
Within a year, the LA launched an eGovernors Website to support school-based Communities of Practice, and made some drastic changes to the information products it provided (1) (6). The new formats and content were now in line with what the governors said they required.

Lessons:

• Keep recommendations to a minimum and make them action-focused.
• Include the resources required for action in your recommendations.
• Nominate people who will be responsible for taking the actions that the recommendations require.
• Start action immediately, keep track of it, and communicate the results.

Six years later the system is still in use, and forms the heart of a network of Communities of Practice. Governing bodies throughout the authority continue to use the findings from the audit to improve the way they work.

This looked like a successful audit that had achieved the desired outcomes. But, in practice, how well had it really worked and how much had it told the LA that it did not already know?

Step 6: Evaluate and Learn
Information audits seldom include a formal evaluation at the end of the process, which looks at the experiences of both the auditors and the information providers and elicits lessons for the future. They should. This one did, as a result of a fortunate coincidence; at the time of the audit the KMC was studying for a higher degree, and she was able to make the evaluation the subject of her dissertation.

Evaluate Purposes, Questions and Methods
The purpose of the evaluation was to judge the success of the audit, assess the suitability of the process for carrying it out, and learn lessons for use in future audits.

It sought to answer these questions:

• How well had the process worked from the point of view of the audit team and the school governors?
• Had it delivered valuable information about school governors and their information requirements?

The methods used to answer the questions were:

• An after-action review with the audit team, followed up with individual interviews with team members and a member of the steering group.
• A comparison of what the audit team thought the situation was before the audit began (the What is map they produced, see Step 2) with the audit findings on what it actually was.
• Individual interviews with a sample of school governors.

Lessons emerged which could be used in the future, on quality of work, on criteria for team selection, and on defining the focus and purpose of the audit. Much of this information came only from the interviews, and there is a lesson in that too: do not rely solely on what people say in groups; listen to individuals on their own as well.

Teamwork
Teamwork brought both negative and positive experiences. On the minus side:

• Time for teamwork was not safeguarded from other demands.
• Some members lacked clarity on roles and responsibilities and on audit focus.
• Some members lacked knowledge of the subject of the audit.

Endeavors to keep the whole team in the loop led to over-communication. In particular, the deluge of emails was seen as exasperating rather than informative. An associated problem: the team’s experiences revealed the importance of skilled administrative support, provided by staff with guaranteed time for the task. This support was recognized and provided for in the course of the work, but not from the start.


Interviews revealed that the team had not allowed enough time and attention for reporting on the audit. The final version of the report, in particular, was seen as the victim of the hasty editing.

But there were positive experiences too:

• Working with others from different backgrounds.
• Learning new skills and new ways of working.
• Developing an appreciation of cultural openness and non-hierarchical relations, and the wish to share that with senior management and Human Resources.

Overall, team members enjoyed their work on the audit. The general view was that the chosen audit process gave results that could not have been gained any other way.

consultant’s contributionThe interviews showed appreciation of the consultant’s knowledge, but varying expecta-tions of the audit among the team had not been addressed at the start, and there was a perception that the lead had been handed over to the Project Manager and the team expected to fend for themselves a bit too early. There are lessons for the consultant in those responses; she should have inquired into the initial ex-pectations, and explained why this particular approach was being applied in this situation. She should have assured the team that the early handover was because she was confident in their abilities and wanted them to gain maxi-mum experience for the future, and have made it clear that she would not desert them when they needed her.

Before and After Analysis
The interviews suggested that the team thought the results simply confirmed their suppositions before the audit. After the interviews, the team compared those suppositions, as documented in the What is map from the initial workshop, with the audit's actual findings.

Significant unanticipated findings:

• There were serious shortcomings of head teachers and governing bodies in "areas not usually discussed".
• Some governors had difficulty in getting access to information intended for them: some head teachers kept it locked away.
• Many governors kept personal collections of information—for which they needed some IM training.
• Governors' preferences for medium revealed an electronic/paper divide that was totally unknown.
• Using "link governors" to distribute information had been thought good practice; it turned out that governors wanted summaries with links or references to documents instead, so they could judge for themselves.
• Nothing had been known about exchange of information within schools; the audit showed great variations.
• Governors' knowledge of what information they should/could exchange: the team thought it was well understood, but there was actually serious ignorance about confidentiality.
• Governors did not know who to contact in the local authority, a discovery that revealed their need for a directory or intranet access.

Lesson: Make sure the audit team sets out what they think the situation is before they start auditing; it is the evidence for evaluating learning from the audit.

governors’ experienceThese were the unknown stakeholders who provided critical information for the audit. The interviews show the value of treating these

23513 Liz Orna, Jane Jennings

stakeholders fairly from start to finish; their responses added new knowledge that has been used to advantage in other audits. Experienced governors volunteered mainly due to interest in their governance system; new governors were motivated by feeling excluded in their own governing bodies. Most enjoyed the workshops and being able to answer questions individual-ly—new governors because they could say what they felt in a way that would not have been possible in a group discussion, experienced ones because the themes gave a focus for thought. And they valued recognition by the education authority—invitations to meetings at County Hall had symbolic importance.

The audit promoted empathy: experienced governors admitted not delegating enough; new ones explained how they felt excluded and unable to contribute as they wished—a situation that partly explained the perceived division between those who worked hard and the “passengers”. Governors realized that others suffered the same insecurities as they did, and saw the need for more communication among themselves.

Evaluation of the Evaluation Process
The evaluation showed the importance of investigating the value of the auditing process before bringing it in-house as part of an information management repertoire. Using a variety of methods to access the experience of key stakeholders was essential for bringing to light issues and lessons that could be applied with positive effects in future. The full range of issues would not have been exposed without applying all the techniques.

Lessons:

• Always evaluate the audit process as experienced by both the audit team and the stakeholders who provided the information.
• Document everything, especially lessons learned, and make the documentation part of the organization's information resources.

Step 7: Make Auditing Part of the IM Repertoire
This step is the one that seems to be rarely taken, but it is one that information auditors should aim for. Senior management can think it has ticked that box and "done" information auditing. This is liable to happen when senior management has not grasped what information auditing is about, when no efforts have been made to educate them, and when there is a lack of management support for the process. Other problems can arise from:

• Deciding to hand over the whole process to outside consultants. An outsider is often more interested in getting more work for himself than in engaging the staff audit team in the process and ensuring that they take ownership of following up on the audit's recommendations.
• Making inadequate provision for who will be responsible for follow-through actions, or for resources to ensure those actions' success.
• Not having long-term recommendations/resources for bringing information auditing in-house as part of the organization's information competencies.

Become the In-House Expert
Prior to the information audit for school governors, there was neither in-house expertise in this method nor any real understanding of what an audit could deliver. Following the training provided by the consultant, however, the KMC got a taste for this approach and has used it to conduct four major audits. She has also used the principles of contrasting What is against What should be to develop information systems such as the one used by the Council's contact centre.


Take Audits into New Areas
Subsequent audits have, like the first, disclosed invaluable information and highlighted risks that the organization had not recognized. An audit for parish councils (the smallest unit of local government in the UK) revealed that the organization was maintaining three separate databases of parish councillors' details. None of the staff maintaining them knew who to notify when they received information about a change of circumstances. This meant that one database might be updated while the other two were not. As a result, letters were being sent to councillors after the parish council had told the organization that the councillors in question were now dead. After the audit, one database replaced the three, and a parish council coordinator was appointed to maintain it and to manage all communications with parish councils.

In the same audit, parish councils were asked if all correspondence from the County Council could be sent by email. Responses revealed that some parish councils did not have a computer and were too poor to buy one, so the County Council provided all parish councils with funds to purchase the necessary electronic equipment.

An outstanding finding from two other audits—one which looked at all systems used by the County Council, and the other an investigation of its use of geographical information—was determining the number of staff who represent potential "single points of failure" for the services the Council provides. These are people who have been in their jobs for years, have a wealth of knowledge, and carry out tasks on which the Council and the public it serves depend—and who are the only ones who know how to do these tasks. They constitute "invisible dependencies"; the organization depends on them, but they and their work become invisible because they have been there a long time, do their jobs well, and keep things running smoothly. The audits showed that taking them for granted was a risk that the Council needed to address, especially as some were nearing retirement age.

Make Auditing Cyclical
Any information audit is just a snapshot in time. To find out what changes it has brought about, it is necessary to conduct another audit. That is rarely done, but in best practice it should be. The rewards will be there, either to highlight areas where improvements can be made, or to show how much was actually achieved by the first audit—probably a mixture of both.

In the case of the school governors’ audit, we know that the successor group made use ofthe audit findings because they published their report, with case studies on what had been done, as a follow up to the audit report (8). We also know that the flow and quality of information has been improved. But we do not know whether or not the objectives towards improving the performance of governing bodies and of the schools they run have been achieved. We do not know if relations between governors and head teachers have improved, or if new governors are having an easier induction into the governing body. The only way to answer such questions is to carry out another audit.

Accept That Some Audits Won't Deliver for Reasons Beyond Your Control, but Don't Give Up!
Some audits do not deliver. The one on geographical information had all the hallmarks of success—all the stakeholders were engaged, the final report was well received, there was a will to ensure the actions were carried out, and then ... nothing. Why? Because the most important aspect of any audit had been ignored. The original sponsor had left, and despite constant nagging by the KMC for the project manager to secure a new one, it was never done. That work was useless, at least for the moment. However, the findings are still valid and the opportunity will arise for their actions to be carried out. The project just needs someone who will take the opportunity to improve the current system by implementing the report's recommendations. This case illustrates why it is good to have an in-house auditor; a consultant would have been long gone.

Benefits for All
An in-house information auditor has obvious benefits for the organization that go beyond just saving money. It means there is a known expert to turn to for problems where other methods of finding out what is happening or why things are going wrong, such as the questionnaire approach, have failed. An in-house information auditor who knows the organization and the people in it can use networks, knowledge of how things work, and of who does what, to make the job easier and quicker. It certainly helps in gaining buy-in from those who have already undergone a failed approach, and are suffering from being over-consulted. As the preceding story shows, an auditor's continued presence means that he is on the spot to recognize second chances and take them.

The best benefits, however, are reserved for the auditor and the audit team. The audit exposes you to parts of the organization that you have never explored. It provides an insight into the work of others. It increases your understanding of the organization, and in doing so increases your value to it. The more you do the more expert you become; the more expert you become, the better the results and the actions that follow. People come to believe in you and trust you to deliver. And in any job, what could be better than that?

Lessons:

• Learn and apply lessons; use them and the in-house expertise gained on further audits.
• Make sure senior management recognizes information auditing as a core competency, and acknowledges the value of the staff who possess it.
• Look beyond the audit. Plan to future-proof the gains against organizational hazards of change at the top: loss of sponsor, changing external pressures, new fads among decision makers, or organizational oblivion. For "how-to" tips, see 'Information Strategy in Practice', Chapter 8 (1).

Summary and Conclusion
We would not claim that this six-year case study about a single organization represents all there is to be said about best practice in information auditing. But it is certainly an example of actually doing, over an extended period, what many authors agree in theory should be done. The experience of actually doing it, and the results, confirm that it constitutes good practice.

We recommend it, not as a rigid model whose every detail is to be religiously observed, but as a flexible and adaptable approach that pays off by:

• Minimizing the risk of zero return on the resources invested, and maximizing the chance of getting good value.
• Revealing a range of different perspectives on the organization—how people interpret it and feel towards it, and how that affects their actions—in relation to the information problems that the audit investigates.
• Giving the auditors the potential for opening up new ways of relating to the organization, creating new channels for information exchange, and experiencing new kinds of relationships within the organization.
• Offering the organization's management not only effective solutions to the immediate information problems, but also alternatives to existing organizational structures and culture that can release new and productive energy in the use of knowledge and information.

While the approach is flexible and open to development, the key feature that gives it its value, and should be maintained in every application, is its essential focus on how human beings really think, feel, and behave in relation to information in the context of organizations, and the effects of this relationship—on their work, and on the success of the organization in meeting its aims.

The number of steps in the process is less important than doing the activities in the right order, making sure that the people involved in the audit start with the same understanding of objectives, outputs, and methods, evaluating as you go, recording the lessons you learn, and using those lessons the next time round.

The method used in the case study was deliberately low-tech and the evaluation primarily qualitative—even though a major part of the outcome was in electronic form—both because of the nature of the problem it was meant to solve, and because of the characteristics of the population of school governors who raised the problem in the first place.

There is certainly scope for developing methods of quantitative measurement of the results that are appropriate and cost-effective, and finding uses of current ICT developments that have potential for supporting the essential focus on human interactions. It would be good to see other accounts of trying them in practice, over a period of time and in a variety of contexts.

References

1. Orna, Elizabeth. Information Strategy in Practice. Aldershot, UK: Gower, 2004.
2. Jennings, Jane. Discovering Domains: Using an Information Audit to Nurture a Community of Practice. Unpublished thesis, 2004.
3. Wenger, Etienne, Richard A. McDermott, and William Snyder. Cultivating Communities of Practice. Boston: Harvard Business School Press, 2002.
4. Taylor, Robert S. "Organizational Information Environments." Information and the Transformation of Society. Ed. Sweeney, G.P. North-Holland Publishing Company, 1982.
5. Gould, Stephen Jay. Dinosaur in a Haystack. London: Penguin Books, 1997.
6. Orna, Elizabeth. Making Knowledge Visible. Aldershot, UK: Gower, 2005.
7. Essex County Council. Governing for Success: School Governors' Information Audit. 2003. http://www.essexcc.gov.uk/vip8/ecc/EccWebsite/content/binaries/documents/Governing_for_success.pdf?channelOid=null.
8. Essex County Council. The Next Chapter of Governing for Success. 2004. http://www.essexcc.gov.uk/vip8/ecc/EccWebsite/content/binaries/documents/Governors150/the_Next_chapter.pdf.


ALAN PELZ-SHARPE ([email protected]) is a Principal at Real Story Group, covering a range of information management technologies and practices. Formerly, he was a Strategist at Wipro and VP North America for industry analyst firm Ovum. A 20-year veteran of the information management industry, Alan has written extensively on document, web, and records management topics, and delivered keynotes at events around the world.

14

How to Successfully Procure Information Management Technology
(And Not End Up With a Lemon)

Alan Pelz-Sharpe


Abstract

According to research, the vast majority of IT projects fail or fall short of expectations. One recent survey (1) stated that 24% of all projects are cancelled prior to completion or delivered and never used, and a further 44% of projects are late, over budget, or fail to deliver expected features. In many cases, this high failure rate is due to selecting the wrong technology or technology supplier, ensuring failure from the start.

Information management technology often impacts a much broader array of processes and people within an organization than other types of technology. Information management technology typically changes the way people work, so the process for selecting technology and technology partners needs to be rigorous. A rigorous approach does not have to be overly complex, however. You can apply best practices that will likely lead to a much higher chance of selecting the right supplier than if a more generic procurement approach is used.


Background


Generic procurement processes are often inadequate when it comes to procuring information management technologies. Best practices gleaned from working with clients around the globe can provide refinements to the standard practice. These refinements result in better technology selection, and a reduction in overall risk.

The Skills Needed to Perform This Best Practice
The most important skill required to implement this best practice is communication. Though there are important legal and procurement elements that should be addressed by qualified professionals, the selection process and team will be made up of a broad range of stakeholders. Ideally this should be seen as a conversation, or as an interview process that works both ways. It is just as important for the supplier to understand your needs and limitations as it is for you, the buyer, to understand the abilities and limitations of your supplier. Somewhere there is an ideal match—not a perfect match—but an ideal whereby both the supplier and buyer can work openly, honestly, and effectively.

Communication is important, as it is not simply the technology you are assessing, but also the supplier and the relationship you will have with them over time. Additionally, you will need to conduct different conversations at technical, commercial, and business process levels. These conversations need to feed back into a central decision making environment.

Step 1: Understand RFIs and RFPs
A substantial part of my work involves working with Requests for Information (RFIs) and Requests for Proposals (RFPs) on our clients' behalf (these are also called "tenders" in many parts of the world). We have written these requests, we have assessed responses to them, and we regularly help our clients make decisions based on them. We will look at RFIs and RFPs in detail elsewhere in this article, but for now we define the RFI to be an initial communication to a wide range of suppliers, inviting them to express an interest in bidding for your business. The RFP is a follow-up communication that is typically much more detailed, and contains a list of technical and commercial requirements. In an RFI, a supplier typically responds simply by indicating their interest in proceeding; in an RFP, a supplier provides an analytical breakdown of their offering, along with a commercial proposal.

Based upon my experience, it is no exaggeration to say that the whole process of issuing such requests is one of the clumsiest and most ineffective processes in the IT business, and that these processes are often the root cause of many projects failing or falling short of expectations. Though costly and often painstakingly undertaken, the traditional RFP-based procurement process is ill-suited to the task of selecting Information Management technology.

Rather than providing a critique of existing practices, this article describes a streamlined and more effective best practice approach. This approach begins with a research period, and ends with the buyer selecting the right supplier to meet his specific needs.

Step 2: Research Your Options
The research period is critical to success, as the buyer needs to understand his potential options. Buyers also need to have some grasp of the marketplace dynamics within which the potential suppliers operate. This can be a very long and tiresome research project, but efforts here are seldom without reward. It is not necessary to undertake all your own research—indeed doing so is an expensive and inefficient way to work—it is much better to buy research from respected and independent sources. There are many providers of detailed competitive technology research.

At this early stage in the project, buyers should research what we might call "viable options". For example, if I am tasked with selecting new software to build a corporate internet site, I might immediately research Web Content Management (WCM) software suppliers, but they are not my only option. I could research portal suppliers or Enterprise Content Management (ECM) suppliers that might be a better fit, depending on my specific needs. If I have particularly unique needs, I could consider building my own solution or bringing in a firm to build a solution for me. I might consider open source options or a hosted service. I might even audit my existing software investments to see if I have options that I could leverage, rather than buying more software. The goal here is to assess a broad array of options and understand which of those are a good fit for my own specific needs. At the end of this stage, I may end up with a shortlist that consists of very different options that can each potentially achieve the same goal, rather than a shortlist of nearly identical suppliers.

We find that many buyers pigeonhole themselves and their requirements far too early in the buying process. Sometimes this is due to the use—or rather misuse—of analyst research tools such as the Gartner Magic Quadrant and the Forrester Wave. It is not uncommon to find that buyers have created short and long lists of potential suppliers based on little to no research, simply by selecting the suppliers that sit in a particular quadrant or area of one of these analysts' charts. To work this way is to set yourself up for failure. These charts do have value, and a lot of work and analysis goes into building them; however, they are not tools for selecting suppliers—they graphically depict "market leaders".

At some level, you may feel that you want to work with market leaders, as there appears to be a comfort in working with perceived leaders. But the reality is that much of the leadership status of these suppliers is based on their sales record and market visibility. For example, IBM, Microsoft, or Oracle will almost certainly appear on any of these charts if they have a product in that particular niche, simply because they have the market presence and the sales ability to sell many licenses. They can become "leaders" very quickly, even if their products are not particularly good. And even if the product is good quality, what is good for one buyer may be a disaster for another. Even within markets there are many variances. These differences allow a market to be vibrant and for innovation to thrive. Take the ECM (Enterprise Content Management) market, for example. Industry leaders in this sector include IBM, Microsoft, EMC, and Oracle. The segment also includes lesser known companies such as Open Text, Hyland, Nuxeo and Alfresco. Every one of these suppliers has a different solution to sell you. If you are a mid-sized European bank looking for a document and workflow solution, you might favor one or two of these suppliers over the others; if you are a large global bank you would likely favor different suppliers. If you are a pharmaceutical firm looking to increase knowledge transfer and collaboration, you might look at quite different suppliers again. The best solution is the solution that is best for you—not the solution sold by a market leader. My advice is that you get some outside help in the form of published reports or one-to-one advice from independent experts. Use this information to select some broad and feasible options to pursue further. This weeds out options that are unlikely to succeed within your organization at the earliest possible point in the process.

In a simple project, you can complete this stage of the procurement process in as little as a week or two. In more complex projects, the work may last one or two months, assuming you make use of expert research sources. If you were to undertake detailed research by yourself, it is not uncommon for this process to last between three and four months (in some cases longer). The simple point is that by accessing expert research, you can dramatically reduce the length of the procurement cycle.

Step 3: Make Use of an RFI
At this point, your project is well under way—you have looked at the options available to you and have an understanding of your budget, requirements, and timelines. You are in a position to move to the next stage of the process: the Request For Information (RFI). If not, you need to take a step back and consider the missing elements further. Let's look at these elements in turn.

Budgets
Before you contact suppliers, you need to have a rough idea of how much you can spend. This is an important qualifier as to which options you can consider and which you cannot. It is also an important element as to whether a particular supplier will consider you a suitable client to pursue or not. Put more bluntly, there are suppliers that consider any deal under $500,000 hardly worth the time to bother, and others that would consider a $500,000 project their Christmas and birthday combined. To have a constructive dialog, you need to match your budget and your supplier's expectations. For example, a supplier that typically works on multi-million dollar deals is unlikely to invest time and expertise in deals that fall way below this level. Conversely, a small supplier may be overwhelmed by a project that runs into the millions, and unable to cope with the scale and complexity of your needs.

You should also consider that in most information management projects the cost of the software is typically only a fraction of the overall project cost. For example, a project that spends one million dollars on information management software may spend another $2-3 million on development and consulting services, plus $200-300 thousand on related hardware and software upgrades, etc. Even though you will not discuss specific budgets with a potential supplier at such an early stage, suppliers are experienced enough to know how much your project will likely cost you. You need to be aware of this general price range as well; otherwise you may waste time pursuing options that are not a good fit for you.
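To make that arithmetic concrete, here is a minimal sketch using the ball-park ratios above. The helper function and the ratios baked into it are illustrative assumptions only, not a formal costing model:

    # A rough total-cost estimate from the ball-park ratios quoted above:
    # services at 2-3x the software cost, plus $200-300K of related
    # hardware and software upgrades. Illustrative only.

    def project_cost_range(software_cost):
        """Return (low, high) total project cost estimates in dollars."""
        low = software_cost + 2 * software_cost + 200_000
        high = software_cost + 3 * software_cost + 300_000
        return low, high

    low, high = project_cost_range(1_000_000)
    print(f"${low:,} to ${high:,}")  # $3,200,000 to $4,300,000

In other words, a one-million-dollar software purchase on these ratios implies a total project in the range of roughly $3.2 to $4.3 million.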

Requirements
In some respects, understanding your specific requirements is even more important than knowing your budget limitations. One of the first questions I ask my customers is "Can we see your process charts?" If a client produces detailed process charts of the existing situation and the proposed new process (the 'As Is' and the 'To Be'), then I know that they have a good grasp of their requirements. If they do not, they are unlikely to have studied the current situation in enough detail, and will struggle to articulate their specific requirements to a potential supplier.

Timelines
Timelines are crucial both for you and your supplier. You may want to be up and running with your new system in six months, but a supplier may advise you that this is not possible. The most important timeline to a supplier is when precisely you intend to purchase. This is information that you need to know as a buyer but that you must hold back from a supplier, at least in the early stages of negotiation. By being too open about your timeline, you can easily set yourself up for an unnecessarily unpleasant and overly forceful procurement cycle. If you have an aggressive timeline and announce this to potential suppliers, do not be surprised if you are inundated with equally aggressive sales approaches. If you intend to select and close a deal within the current quarter, you may well represent a sales executive's best chance of meeting their quarterly revenue goals. If your project is large and is likely to go on for some time, you may want to stagger the procurement of a supplier's goods so that you only pay for what you need, when you need it. There are many permutations here that play out in the real world; the key thing is for you to have a realistic set of timelines to work with at this stage.

At this point, most prospective buyers expect to enter the actual buying process by creating and issuing an RFP. But in any large or more complex project, or indeed any project where you have a large number of options, be prepared to issue an RFI first. Those of us who have evaluated RFP responses have many stories about completely incompatible suppliers who somehow end up on the shortlist. Your shortlist should only contain those who can meet your needs and no one else; a good RFI provides a pre-filter that determines a shortlist, as the sketch below illustrates. The RFI allows you to gather information with which you and your suppliers can make better decisions, and is a key tool in eliminating those suppliers who cannot, should not, or do not want to be working with you. Once you have an RFI, any subsequent RFP becomes a key mechanism for a scoped and well-thought-out project which only involves suppliers that have been pre-qualified and can do the work that is being requested of them.
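As a rough illustration of that pre-filtering idea, the sketch below qualifies suppliers in or out against a few yes/no criteria. The supplier names and criteria are hypothetical; a real RFI assessment would weigh many more factors, including references and the creativity of the response.

    # A minimal sketch of an RFI pre-filter. Supplier data and criteria
    # are hypothetical; real assessments are richer and partly qualitative.

    suppliers = [
        {"name": "Supplier A", "responded": True, "fits_budget": True, "meets_goals": True},
        {"name": "Supplier B", "responded": True, "fits_budget": False, "meets_goals": True},
        {"name": "Supplier C", "responded": False, "fits_budget": True, "meets_goals": True},
    ]

    def shortlist(candidates, target_size=6):
        """Keep suppliers who qualified themselves in on every criterion."""
        checks = ("responded", "fits_budget", "meets_goals")
        qualified = [s["name"] for s in candidates if all(s[c] for c in checks)]
        return qualified[:target_size]

    print(shortlist(suppliers))  # ['Supplier A']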

RFIs are not complex to issue, and by following some simple rules they will be of considerable value. For example:

• The goals of the project should come first and foremost in any RFI document. What are you hoping to achieve? State this honestly up front and invite suppliers to respond creatively and cost effectively. Content Management is as much about business processes and user acceptance as it is about speeds and feeds. Allow potential suppliers to address both issues, and suggest viable partnerships.

• Similarly, to the extent that there will always be consulting work involved in any type of information management deployment (and in a big ECM deployment this will surely be the bulk of the cost), consider sending the RFI to systems integrators or other consultancies in addition to software suppliers. You will get different sorts of responses from services firms, and though they may not help you to narrow your supplier shortlist, you will learn from their suggestions. This can help crystallize for your selection team whether the RFP should ultimately go out to technology suppliers or integrators.

• Make it clear up front that you will require references at an early point, and that you will vigorously pursue them. When you do pursue them, make sure that the supplier is not involved on the call in any way.

This is all common sense of course, but common sense is not all that common. The bottom line is that your procurement process should result in the best possible solution to your problem, not simply selecting a piece of software. That solution will certainly involve technology, but it will also involve services and a relationship with people who may spend a lot of time with you over the coming months and years. Your ideal solution may include many elements or only one. Being clear about what your needs are, and equally clear that you have not pre-determined the outcome, allows you to keep your options open, letting suppliers suggest creative solutions to meet your needs. A good RFI opens the door for a constructive conversation to begin. It also provides an early exit route for the buyer to reject clearly unsuitable options, and for suppliers to walk away from bad-fit situations before they become too deeply invested.

The RFI stage does not need to be long and labored. It can be a fairly swift activity, executed over a couple of weeks. The goal is simply to invite responses, allow suppliers to qualify themselves in or out of the process, and for you, the buyer, to get an initial feel for who should be on your shortlist of four to six suppliers.

Six is a good number of suppliers for a shortlist, but four is also acceptable if you are sure that all four will eventually bid. Remember that procurement is a two-way process, and that suppliers walk away just as often as buyers when it becomes clear that there is a poor fit, insufficient budget, or that the supplier has little chance of winning the project. This last point is worth some further consideration.

In many information management procurement activities there is a preferred supplier, whether or not this is explicitly acknowledged. For example, some firms will openly state that if IBM or Microsoft have a product that might fit their needs, they will consider that product first. I once worked with a Fortune 500 firm that is information intensive and spends tens of millions of dollars every year on its information management systems. They asked me to work with them to review RFP responses from a range of suppliers for a high-value and very specialized requirement. To their dismay, only one supplier from the shortlist actually responded. I was asked to contact the other suppliers and find out why they did not respond to such a lucrative opportunity. When I spoke to them, they both told me they had bid for work at this firm on many occasions, but that the projects always went to the same supplier (the one that had responded). Over time, I discovered that these allegations were in fact true, and that the only reason the project had gone to tender was that internal procurement regulations demanded that a competitive bid be undertaken. The buyer was only getting the other suppliers to jump through these expensive hoops to keep the procurement and legal teams happy; they had already decided who they were buying from. Sadly, this is so common that it is expected, and increasingly many suppliers refuse to go along for the ride; they know full well they have no hope of winning, regardless of their solution. It is a rotten situation that is unethical and damaging to the industry as a whole. Smarter buyers recognize the value of standardization, but keep the incumbent on their toes by regularly awarding contracts to competing suppliers.

Assuming you are not going down this unethical route, and that you have a fair expectation that the chosen suppliers will actively engage in the RFP process, it is time to create and issue the RFP to the chosen subset, based on their responses to the RFI.

Step 4: Follow Up with a Scenario-Based RFP

It is worth reiterating something we mentioned earlier in this article: there is no "best" system. We do not believe in magic quadrants, leaders-and-laggards, and other horse-race-style evaluation approaches. Neither should you. The best solution for you is the one that best matches your needs (your budget, scope, and type of project); in short, the one that fits best for your scenarios. When you start to write your RFP you need to ensure that the document focuses on matching your needs against common patterns. This is a radically different approach to creating a purely "functional" RFP, one that lists needs and expects a supplier to check a box one way or another. It is not that you should ignore this approach altogether; you should consider it incomplete, because it does not tell you the deeper story behind the product or the supplier. Be sparing with the functional requirements list and focus a good deal of effort on functional scenarios instead.

Many suppliers do not like to focus on scenarios. Their sales and marketing people do not want to be put into a market niche. Larger suppliers can show you customers across broad industry types and scenarios, but the fact remains that all products have profiles that constitute their "sweet spot". Research suggests it is possible to identify those profiles. More pointedly, if a supplier tells you they are good at nearly everything, beware. It may well mean that they are good at nothing, and significant implementation effort and ongoing support liabilities will fall to your team. In our experience, information management solutions almost never fit plausibly across more than a few of the many potential scenarios they face. To be clear about what we mean by "scenario", let's look at several examples:

Example A

Peter receives many thousands of multi-page documents each year; these need to be scanned and converted into electronic format. He receives content in the form of PDFs, XML documents, HTML, CD-ROMs, and an ever-increasing number of file formats. The content covers 15 foreign languages. It is critical that each document is captured and indexed as soon as possible, typically within hours of receipt.

Currently Peter achieves this by scanning identified content through three production scanning stations and converting the captures to TIFF. These TIFF images are later viewed by workers who extract key information for indexing.

Peter needs to be able to use Optical Character Recognition (OCR), to create PDF files in addition to the TIFF images, and to do all of this automatically. The OCR system needs to handle all or many of the various languages and character sets. Additionally, Peter would like to be able to recognize and extract images from the files, and to configure the system to recognize key words and collections of words and characters, to enhance the future production process.

Example B

Currently the work processes and associated tasks at ACME are driven via reliance on a range of legacy systems, and on expert knowledge within the company. ACME wants to move to a more rules-driven set of processes, enabling workers to concentrate on value-added production. Hence ACME will be looking for a Business Process Management (BPM) system that is easy to configure, model, and manage in real time, and that includes a rules process (or engine) that can be centrally or locally managed, allowing them to make best use of existing knowledge and information regarding analysis and indexing of chemical information.

Example C

ACME receives a very large amount of content each year. Only a small percentage of it is actually of high value and will complete a full indexing and analysis process. ACME wants to archive all content on receipt, and to cull this content based on retention schedules. Likewise, they desire to archive all content of some value, and either retrieve this content at a future date and/or mine it for trends and further information. Archiving the content needs to be seamless and transparent to end users, active and readily accessible, and low cost to both run and maintain.

Each of these scenarios explicitly details a current situation, and a desired improvement to that situation. This gives an astute supplier an understanding of what is important to you and what is not, enabling the supplier to construct a response to your RFP that specifically meets your requirements.

This is the opposite of what normally happens in an RFP process, where buyers list reams of requirements. Suppliers have little choice but to simply go through, check the relevant boxes, and return their response. I'd like to argue that this is a largely pointless exercise, as almost all suppliers are very experienced in responding to such RFPs, and typically decide whether to respond or not based on a combination of the expected value of the project and their likelihood of winning. Whether their product or service is the right fit for your specific needs does not enter the equation, in large part because reviewing hundreds of generic functional requirements does not tell them what your specific needs are.

RFP Structure

A good RFP typically runs between 50 and 80 pages. The table of contents runs along the following lines:

• Communications
  - Contact Information
  - Confidentiality Notice
  - Use of Terminology
• Introduction and Background
  - About Your Organization
  - Purpose of the Request for Proposal
• Process
  - Due Dates
  - Proposal Format
  - Narrative Proposal
  - Pricing Proposal
  - Proposed Prices
  - Timeline and Review Process
  - Respondent's Q&A
  - Respondent's Technical Demonstrations
  - Due Diligence
  - Schedule
  - Implementation Model
• User Scenarios (for example)
  - BPM and Workflow
  - Dynamic Publishing
  - Document Tracking and Reporting
  - Legacy System Integration
• Other Requirements (for example)
  - Dynamic Archiving
  - Standards
  - Rendering
• Project Management
  - Schedule and Plan
  - Staffing and Project Management
  - Risks
• Supplier Background
  - Respondent's Financial Condition
  - Qualifications
  - Product History
  - References
  - Service and Support
  - Training and Documentation
• Glossary of Terms Used

This RFP structure explains your working requirements, your limitations, and your expectations to the potential supplier, but it is in assessing the suppliers' responses to the RFP that this approach really comes into its own. It provides a structure for the supplier to respond to you in a narrative format that, in addition to providing you with key technical information, allows the supplier to explain how they would approach your specific requirements, via the scenarios. In short, scenarios elicit a deeper and richer response from the supplier than a 400-page series of checklists could ever deliver.

Note: Never get a supplier to help you draft an RFP! This may sound like common sense, but I am amazed by the number of RFPs I receive that have the clear mark of one particular supplier or another on them. If this is the case, you cannot expect rival suppliers to make much effort in responding. Remember, suppliers live in the information management world full time, while this is (hopefully) a one-off deployment for you. Suppliers can spot the influence of a rival a mile off.

Step 5: Meet the Vendor and the Product

Once you receive the vendors' responses, you will typically see that some are considerably better than others. This makes creating a shortlist of suppliers that you will take to the next stage, demonstration, a generally straightforward process.

Note that we use the term demonstration, not presentation. At this stage, you invite each of the shortlisted suppliers to demonstrate their solutions to you. For a large project, you will need to allocate a full working day to each supplier; in smaller projects a half day will suffice. The structure of the demonstration day should run along the following lines:

- 9:00 AM to 9:30 AM: Introductions and General Orientation
- 9:30 AM to 10:00 AM: Offering Overview
- 10:00 AM to 12:00 Noon: Product Demonstration
- 12:00 Noon to 1:00 PM: Lunch
- 1:00 PM to 2:00 PM: Product Demonstration
- 2:00 PM to 3:00 PM: Q&A Session
- 3:00 PM to 4:00 PM: Technical Discussion
- 4:00 PM to 5:00 PM: Pricing Discussion

The key to this simple structure is that once introductions are complete, the supplier has a 30-minute opportunity to go through their standard PowerPoint slide deck, and then put it away. The rest of the day is devoted to truly demonstrating the product. To be clear, this is not the same as a proof of concept; all you are asking the supplier to do is show you the product in a working format.

This may seem an obvious thing to do, but in fact very few buyers do this. Most are content to watch "canned" presentations consisting of screenshots in PowerPoint, with a few pre-configured demonstrations of the actual product thrown in for illustration. In our alternative approach, prior to demonstration day, you ask the supplier to be ready to demonstrate various capabilities of the product, for example BPM (Business Process Management) and GUI (Graphical User Interface) builds. You are not asking the supplier to pre-build anything for you, though of course a smart supplier will at least do some homework based on the information you provided them in the RFP to ensure they demonstrate capabilities in a manner that is consistent with your needs. Remember, you are not placing an onerous burden on the information management supplier; rather, you are simplifying and improving what is in our opinion a broken process.

To work through this particular demonstration example, you first invite the selected suppliers and schedule dates for them to demonstrate to you. Typically you will schedule consecutive days, as it is easier to compare competing demonstrations if they are seen back to back. Once dates are agreed to, you send the suppliers a short document that provides them with guidance for their visit. Structure the document along the following lines:

• Timetable.
• Venue.
• Audience.
• Guided Demonstration Structure (based on the scenarios detailed in the RFP).

In this last section, you explain that you want to see the product used, either by running it on the supplier’s laptop, or by connecting remotely. Here you should specify exactly what you want to see. For example:

• Create a new metadata category within a document type.
• Show us how you would map, create, and generate a simple workflow.
• Build us a simple generic GUI using your GUI design tool.

Within the document, you also need to state who you want in attendance at the demonstration. Typically you want the people who will actually work with you if they win the deal. You should insist on this, even though most suppliers will push back hard at the request. A supplier wants to provide their best presenters for the demonstration, but of course a buyer will never see those presenters again, should the supplier get the contract. You need to know who you will be working with, and be sure that you want to work with those people.

Clearly it is right and proper that a supplier will send a salesperson and potentially a partner-level consultant to lead the day, but again, you as a buyer need to understand that the job of the salesperson and partner is simply to seal the deal with you; they will then move on to the next deal.

Even if a supplier cannot or will not provide the team members that would work with you, they must provide people with the right technical skills to answer all your questions and drive the demonstration of the product that day. Although you can over-emphasize "chemistry" in the sales phase (after all, copacetic salespeople will go away when the real work starts), the supplier's preparation and willingness to engage is an important indicator of how good a match they think your project makes.

I have participated in many of these demonstrations over the years, providing the buyer with independent advice and helping structure the questions and discussion that the supplier will be involved in. I can tell you unequivocally that the demonstrations are the single most important activity in the entire procurement process. Suppliers, particularly large ones, are simply not used to the buyer being in the driver's seat. This is a new experience for them, and it often exposes their product in a level of detail that may not always be flattering. Moreover, larger suppliers have a tendency to focus less on your needs and more on their own product capabilities, particularly as they consider their product to be "market leading". This is something else that you can explore during demonstration days. If the supplier with the best technology cannot grasp your needs in a full-day pitch demo, they are unlikely to make a good match over a multi-year relationship.

Remember that the purpose of a demonstration day is not to skewer a supplier; it is to get to see the product and the supplier for real: to see if there is a fit, if there is a potential relationship that could work, and if the inevitable compromises that will need to be made on both sides of the equation will result in an acceptable outcome.

In our best practice demonstration day schedule outlined in the preceding paragraphs, note that I left the pricing/commercial discussion until the end of the day. This is not accidental. As a demonstration day goes on, you will often see information management suppliers using modules and functions in their demonstrations that have not been listed in their pricing quote. You may also see products used in the demonstration that did not work particularly well, or that you felt would be better replaced by a best-of-breed element. Because of this, you need to review the pricing quote by the end of the day; you will be in a better position to discuss and question it based on the day's presentation. It is also fair to say that, psychologically, you are in a stronger position to negotiate or set pricing expectations at the end of the day, as it is highly unlikely that the supplier's day of demonstrations went without a hitch. In our experience, some demonstrations are nothing short of a disaster for the supplier, which can be seen as a close escape for the buyer.


Step 6: Select the Right Supplier for Your Needs

Generally speaking, the demonstration days alone will make clear which supplier you prefer. But that is not the end of the process; you also need to follow up in detail with references. In most cases, we recommend that you talk to a number of references who have requirements similar to your own. There is no point at all in talking to Fortune 500 customers in a different industry whose requirements differ from your own. You need to stress to the supplier that you want somebody who shares similar experiences with you.

When you contact the references, you should never allow a supplier to be present, either in person or on the phone. Be aware that many reference firms are compensated for providing positive references. I have spoken to many reference customers over the years, some of whom are referees simply because they are happy customers, but many because they were given special pricing, special access to support, free travel and accommodation at exotic resorts (read: conferences), or were simply given the product for free. It is vital to check references, but they are always far more revealing and valuable when the supplier is not involved. You can get references from sources other than the supplier, via user groups and online communities, or from consulting and analyst firms that cover the supplier's products and services.

Once you have selected a supplier, it always makes sense to start with a Proof of Concept (POC) exercise to be sure that you have made the right decision. The POC can effectively be stage one of your project. The supplier should be compensated for their work, even if the POC was unsuccessful and you decide that this is not the supplier for you. It was your mistake to choose them, not theirs to pick you. We mention the POC here only to illustrate that the selection process continues, as does the relationship. Presuming you have followed these best practices, there is a very good chance that the POC will be a success, and a learning experience for both you and the supplier to glean information about one another and move onward successfully.

Summary and Conclusion

Based on my experience and that of my clients, modifying the standard RFP approach to procurement to emphasize the value of RFIs, scenarios, narrative-based responses, and structured product demonstrations helps a buyer of information management technology not only to simplify the procurement process, but also to have a higher likelihood of selecting the "right" supplier.

This approach is more interactive than traditional methods, and recognizes the true complexity of information management. Information management technologies impact a particularly wide range of stakeholders and will undoubtedly change the way you work. As such, you need to thoroughly understand the options you have, and recognize that selecting a supplier is not as simple as selecting the "market leading" technology. Rather, it is about selecting the best fit for your purposes: a technology and a supplier that can come to understand your specific needs and can add value to your organization. The selection process is as much one of matchmaking as anything else; indeed, "matchmaking" is a good metaphor to carry with you throughout the process.

The best practices shared here have been used many times by both midsize and large organizations across the globe, and have proven to be highly effective. Nevertheless, this approach to procuring information management technology remains distinct and in the minority as of today. It is my hope that by sharing the approach more widely I can encourage others to adopt it, ensure that more buyers select the right supplier for their needs, and reduce the high failure rates for projects like those discussed at the start. Projects that start with poor supplier selection seldom succeed, yet selecting the right supplier for you is not a matter of luck; it is a matter of following the right procurement process.


CHARLOTTE ROBIDOUX ([email protected]) is a Content Strategy Manager at Hewlett-Packard Company (HP) and has over 17 years of experience in technical communication. At HP, she oversees the single-sourcing strategy and implementation for the StorageWorks Division. Charlotte holds a Ph.D. in rhetoric and technical communication from the Catholic University of America. She is the author of 'Rhetorically Structured Content: Developing a Collaborative Single-Sourcing Curriculum', published in Technical Communication Quarterly. She is co-editor of 'Collaborative Writing in Virtual Workplaces: Computer-Mediated Communication Technologies and Tools'.

STACEY SWART ([email protected]) is the Content Management System Administrator and Strategist at the StorageWorks Division of Hewlett-Packard Company (HP). She has over 16 years of experience in the tech industry, in areas ranging from technical support to technical communication, and is certified by HP as a Lean Sigma Green Belt. Stacey holds a B.S. in Education and English from the University of Kansas.

Develop a Metadata Strategy in Eight Steps

Streamlining Your Path to Metadata

Charlotte Robidoux Stacey Swart



Abstract

A Content Management System (CMS) allows a business to streamline its content development processes, using and reusing content from a single source for multiple purposes. Fully leveraging this capability requires the ability to access and manage your content, and managing your content efficiently necessitates a robust metadata strategy. However, developing a metadata strategy can be intimidating, onerous, and costly. The sheer amount of time needed (time to research, evaluate, synthesize, implement, and maintain a viable solution) can prompt even the most dedicated among us to abandon a strategic effort altogether. For this reason, it is essential to find a streamlined approach to metadata strategy development. This case study explores how groups can streamline their metadata development without cutting corners and without undermining the purpose of having a CMS.

Establishing a metadata strategy using a gradual approach makes the process more streamlined and manageable. The key to this solution is to define metadata components that are meaningful in your environment. After creating these components, you can determine the optimal configuration for your business and customize a taxonomy that makes your content easier to find. This means that a comprehensive metadata solution is both system-driven and directly managed by users, who assign predefined values from controlled vocabularies. The solution also depends on input from all team members involved in content development, from content developers to editors and administrators. This article discusses the essential steps needed to streamline your metadata strategy.


Background

At times, research on metadata can make the concept seem more like a metaphysical journey than one related to any practical outcomes. Yet as long as there has been a need to categorize objects and the information describing them, metadata has been the essential means for managing information collections or repositories. In our modern age, the need to manage and access data on a large scale in a global economy is no less important. Metadata is central to modern authoring environments.

For example, it is an integral part of automating technical documentation development; documentation that enables users to operate the complex technologies that help to drive business transactions. More generally, it is vital to administer metadata efficiently, as metadata expert Murtha Baca indicates: "Institutions must streamline metadata production and replace manual methods of metadata creation with 'industrial production whenever possible and appropriate'." (1, page 72)

The Skills Needed to Perform This Best Practice

Successfully implementing this strategy requires one or more people in each of the following roles:

• Content Librarian: Oversees the quality of modular content in the CMS and assists writers with opportunities for reuse across the database.
• Editor: Manages edits at the sentence level and reviews content against style guidelines.
• Content Developer: Uses content mapping to define and create reusable content modules.
• Tools Administrator: Configures and manages the tool set.

Including these roles as a part of your strategy is key to your success. Without them, you will find holes in what could be a more streamlined approach. Step 7, Assign Metadata Tasks to Roles, describes this in detail.

Step 1: Define What Metadata Means to Your Organization and Why It Is Important

If you have found that some simply stated definitions of metadata are hard to make use of, and other highly technical ones are hard to understand, you are not alone. Metadata is an intricate subject that has become increasingly technologized. While stripping the term to its bare essence (such as "data about data") helps demystify it, such definitions leave us with few clues about how to move forward. Finding comparative definitions that make sense in your organization can serve as a useful starting point for understanding the concept: card catalogs in a library, directories in a grocery store or mall, playlists on an iPod, for example.

For our team, the most compelling comparison was right in front of us: metadata as an index in a book or a help system. The index comparison enabled our team to appreciate why metadata is important: it helps us organize and access content. We also related the concept of metadata to our own environment by reviewing established metadata standards to see if, and how, they would fit our needs. Standards, or schemas, are rules for uniformly managing information within and across repositories. They fall into various types:

• Structure;
• Value;
• Content;
• Format.


For example, the Dublin Core Metadata Element Set (DCMES) (2) is a general standard that provides guidelines for structuring information into categories. It was formulated to describe documents on the Web. DCMES defines 15 metadata elements, including:

• Title;
• Creator;
• Subject;
• Description;
• Date;
• Type.
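To make these elements concrete, here is a minimal sketch of a DCMES record in the common XML serialization of the element set. The dc namespace URI is the standard one; the wrapper element and all field values are invented for illustration:

  <!-- Minimal Dublin Core record; wrapper element and values are illustrative -->
  <record xmlns:dc="http://purl.org/dc/elements/1.1/">
    <dc:title>Storage Array Replication Guide</dc:title>
    <dc:creator>Technical Publications Team</dc:creator>
    <dc:subject>storage; replication</dc:subject>
    <dc:description>Procedures for configuring remote replication.</dc:description>
    <dc:date>2010-11-01</dc:date>
    <dc:type>Text</dc:type>
  </record>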

This standard features a succinct set of elements or categories and has been endorsed in other standards like ISO Standard 15836-2009 (3) and ANSI/NISO Standard Z39.85-2007 (4). The general nature of DCMES elements makes them applicable to many organizations.

However, if you want to streamline your path to metadata, avoid getting lost in the sea of standards available. Because "[t]here is no 'one-size-fits-all' metadata schema or controlled vocabulary or data content (cataloging) standard", consider drawing on aspects of various standards that will fit your organization (1, page 72).

No specific standards seemed to target computer documentation, but our team did consider standards related to structure in order to verify that we were targeting all of the key elements. We only evaluated other standards, including one from an HP marketing group, if they seemed pertinent to our environment. For example, we drew on a value standard, ANSI/NISO Z39.19-2005 (5), to find guidelines for developing controlled vocabularies, as discussed below.

The challenge in defining metadata was learning to appreciate the power inherent in distinguishing content from the descriptors used to access and manage that content effectively.

Step 2: Determine the Goals That Drive Your Metadata Strategy

Knowing what you want metadata to achieve is fundamental to developing a sound strategy. Once your team agrees on a definition of metadata, turn their attention to identifying the primary goals that will drive the strategy. Experts suggest "working backwards" from your goals to the metadata needed to reach them. "Deciding which aspects of metadata are essential for the desired goal and how granular each type of metadata needs to be" is essential to the process of formulating a strategy (6, page 193) (1, page 19).

We began by listing the various kinds of information that would be useful to us: tracking types of content and components, content status (new, a draft, approved), who originally created the content, who revised it and when, what content is reused, where the content is reused, workflow tasks, multimedia objects available, version details, profiled content, system performance, and reports related to these items. Next we compared this list with several types of metadata: descriptive, administrative, and structural. While experts refer to the number and names of these types differently, our team drew on the types identified by NISO (7). These types are described in Table 1.


When looking at these types of metadata, we saw that items on our list could be understood in terms of these categories. From this view, we began formulating and prioritizing our goals, short-term vs. long-term. Given our focus on gaining efficiency, we determined that being able to retrieve and reuse content was a paired goal. Another important goal was to minimize the risk of content being reused prematurely. Longer term goals included tracking the percentage of content we reuse, determining what reuse opportunities are still untapped, ensuring the quality of our deliverables, and identifying what content is being localized.

Through this exercise, we could see that all these types of metadata could help us achieve our goals. The focus on metadata types helped to streamline how we thought about our goals. Our next step was to understand what specific metadata components would help us attain our short-term goals.

Step 3: Identify the Metadata Components That Help You Obtain Your Goals

Selecting metadata components is extremely important in the process of establishing a metadata strategy, and deciding on the optimal number of components is not easy. How do you pick just the right number, not too many or too few? Which ones will have the biggest impact and help to minimize risk? Here are some sample questions you should consider (5, pages 193-194 and 196):

• What type of content is it?
• What else do you need to know about the content to ensure the correct piece of content is retrieved?
• In what form will users want to retrieve content?
• How will users specifically identify the desired content?

Table 1: Metadata Types

Descriptive
Purpose: Identifies and describes collection resources.
Relevance to our environment: Assists with queries and the ability to locate types of content that can be reused. This includes:
• Content type and status
• Tracking types of content/components
• Profiled content
• Multimedia objects available

Administrative
Purpose: Used in managing and administering collections, versioning, and retention.
Relevance to our environment: Enables creation and management of collections and configuration of tasks, permissions, status, and history:
• Who created content and when
• Workflow tasks
• Version details
• Reuse statistics
• System performance

Structural
Purpose: Delineates the organization and relationship of content units within a system.
Relevance to our environment: Supports navigation and means of combining components into deliverables:
• What content is reused
• Where content is reused
• Where multimedia objects are referenced
• Reporting


Research into structure standards showed our team that we should focus on components that describe the subject of our content (one of the Dublin Core elements). These components would be the basis of user queries. The best way to streamline this step is to look at your own content for the answers. Once again, the index serves as a valuable tool for understanding what terms might be queried, along with the table of contents, which provides clues about the hierarchy of terms as they relate to the subject. Linking the index concept to metadata was useful in helping team members understand metadata hierarchies and how the components related to each other.

Following Ann Rockley's advice to select three to five components, we chose four that were subject-related and two that were user-related, as shown in Table 2.

Table 2: Metadata Elements and Attributes

ContentType
Occurrence rule: Exactly 1
Purpose: The largest "container", used to describe major topics that make up our documentation. "ContentType" describes the subject matter of the content.

Product
Occurrence rule: 1 or more
Purpose: A smaller "container", used to qualify how a topic applies to various products. "Product" designates the name of the product for which the content was written, including the model and/or version.

Keyword
Occurrence rule: At least 2
Purpose: The smallest category, which further limits the relevance of a topic. "Keyword" helps to further narrow search results.

Abstract
Occurrence rule: At least 1
Purpose: Provides a synopsis of content that authors can use to determine if reuse is appropriate, describing the subject of the content, why it is relevant, and guidelines for using the content.

Originator
Occurrence rule: Exactly 1
Purpose: Who originally created a reusable piece of content.

Reuser
Occurrence rule: Exactly 1
Purpose: The author who is reusing a piece of content.


After choosing our components, we had to consider how to manage them in our CMS. We streamlined the process by drawing on options that our CMS already supported. Our CMS allowed searching on fields such as “Status,” “Create date,” “Edit date,” and “username,” but we needed to search on more specific subject-related content as well. Our DTD, which is a subset of DocBook, only contains “keywordset/keyword.” To fill the gaps, we developed custom elements and attributes, adding custom elements for “ContentType” and “Product,” and two attributes for “Originator” and “Reuser.” We chose elements when we might need to use multiple values, and attributes when we wanted to enforce only one value.
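As an illustration of how these customizations might come together in a content module, consider the sketch below. The element and attribute names are the ones just described, but the exact markup (and the usernames and product values) is our own illustration, not the team's actual DTD:

  <!-- Sketch only: custom elements and attributes alongside DocBook's keywordset -->
  <section Originator="jdoe" Reuser="asmith">
    <ContentType>procedure</ContentType>
    <Product>Example Array 4400 v2.0</Product>
    <keywordset>
      <keyword>replication</keyword>
      <keyword>failover</keyword>
    </keywordset>
    <abstract>
      <para>How to configure remote replication; suitable for reuse in
      installation and administration guides.</para>
    </abstract>
    <!-- ... module content ... -->
  </section>

The custom elements ("ContentType" and "Product") can carry multiple values where the occurrence rules allow it; the attributes ("Originator" and "Reuser") enforce a single value each, matching the rationale described above.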

While it was clear that the goals of retrieval and reuse could be achieved by building related metadata into our content, we felt that the goal of minimizing the risk of premature reuse needed additional CMS support. To achieve this, we organized our content into two collections: "Working" and "Approved." Working collections would contain work in progress; only the "Originator" content developer could reference this content, and reuse by others was not supported.

In contrast, "Approved" collections would contain finalized content that had been reviewed by an editor as well as subject matter experts and could not be changed; any author could be a "Reuser" of the content contained here. Separate collections ensure that original content will not change if reused. Instead, "Reusers" must copy content from an Approved collection to a Working collection to propose changes. After those changes are made, the author initiates a Change Proposal workflow via the CMS. The workflow automatically notifies the assigned stakeholders that a change to content is being proposed. Some of this automation is possible because of the metadata attributes "Reuser" and "Originator": the CMS is able to determine who initiated the change proposal and who the changes will affect. The workflow also supports the content librarian with automated email notifications and task assignments. Two outcomes are possible: either the approved content is updated to reflect the changes agreed upon by the review team, or new content is added to the Approved collection because the original content is still needed as first written. By organizing the CMS collections this way, and by creating a workflow that leverages user-related metadata, we effectively streamlined our use of metadata and found a way to leverage elements to minimize risk when reusing content. That is, a strategic approach to metadata from the outset triggers additional efficiencies; streamlining metadata cascades into workflow and CMS implementation.

Step 4: Identify Metadata Values

Without question, identifying metadata values to create a stable list of terms (a controlled vocabulary) is the most time-consuming and contentious step of the process. Deliberating over synonyms and laboring over documents to test the appropriateness of the values can seem endless. The best way to streamline this part of the process is to form a small workgroup of three or more members who can begin to evaluate document conventions and create lists of terms related to the components selected. (A workbook works well for managing the terms on separate spreadsheets.) As mentioned earlier, our team drew extensively from ANSI/NISO Z39.19-2005, Guidelines for the Construction, Format, and Management of Monolingual Controlled Vocabularies.

This standard helped the workgroup and users appreciate why a controlled vocabulary is so important, given that "[t]wo or more words or terms can be used to represent a single concept" and that "[t]wo or more words that have the same spelling can represent different concepts" (5). When creating the lists, the workgroup relied on the Standard's recommendations for conducting "top-down" and "bottom-up" assessments, determining "the correct form of each term," and following key principles such as "eliminating ambiguity," "controlling synonyms," "establishing relationships among terms where appropriate," and "testing and validation of the terms" (5).
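Purely as an illustration of the kind of entry such a vocabulary might contain (the markup below is our own sketch; neither the standard nor the team prescribes this format):

  <!-- One concept: a preferred term and the synonyms it controls -->
  <term preferred="United States">
    <synonym>America</synonym>
    <synonym>U.S.A.</synonym>
  </term>

Controlling synonyms this way means that however a user is inclined to phrase a query, it resolves to the single preferred term.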

Once the lists were created, workgroup members began vetting them with seasoned authors, many of whom were not co-located. Engaging teams across the organization when the workgroup had little authority was especially challenging; we relied on many virtual collaboration techniques to streamline our efforts so that we could complete the work. Do not overlook the importance of showing the value of metadata to the users. They need to understand and believe in the purpose of their work, and realize that metadata:

• Enhances query capabilities in the CMS by enabling "effective retrieval" (6, page 18).
• Allows users to locate their own content, as well as other content that they could reuse or leverage.
• Reduces "redundant content" (6, page 185), making content developers more productive.
• Reduces costs. (Management may care more about this, but in today's work environment, a content developer who is saving the company money is a content developer worth keeping.)

Additionally, employing a controlled vocabulary saves the content developers time by increasing the amount of content that can be successfully retrieved.

The ANSI/NISO Z39.19-2005 standard provided essential principles for maintaining a controlled vocabulary, especially how best to manage additions and modifications, as well as a history of changes (5, page 97). The change history was especially critical when updating the values in our tools. These processes are contained within a single resource that we refer to as our metadata guidelines.

Documenting the metadata process is a must. Bob Boiko discusses the idea of a metatorial guide containing “a set of rigorous metadata guidelines,” similar to an editorial guide (8, page 495). Boiko goes on to say that the metadata process must ensure (8, page 508):

• Metadata completeness and consistency.
• Content manageability, accessibility, and targetability (intended for the appropriate audience).

A thoroughly documented set of rules and procedures helps take the guesswork out of metadata application. As Boiko explains, "in a sizable system with numerous contributors you can almost guarantee that you will find wide variation in the ways that people interpret and apply even the most precisely stated tagging rules" (8, page 509). Providing a link from the tool's support menu to the metatorial guide puts the information at the content developers' fingertips, giving users easy access to the metadata processes and guidelines. As previously discussed, proper application of metadata is critical to ensure quality search results. Making the guidelines as accessible as possible will help ensure that they are followed.

Once guidelines are documented, you need to determine what type of user will apply the metadata. Should content developers add all user-driven metadata, or should a content librarian assist them? What are the roles regarding metadata application? Boiko contends that "a different set of skills is necessary to deal with this metadata" (8, page 495). Some users can be trained to apply metadata. However, as he goes on to say, users "rarely have the wherewithal to discover what content others are submitting and exactly how to relate that material to what they submit." Someone on the team with an eye for detail, like a content librarian, is more appropriate for this role.

While content developers understand their content and usage better than anyone else, as noted by Peter Emonds-Banfield, they might not have the "expertise necessary for metadata creation, nor the time to keep up with it"; whereas "... metators (editors who manage metadata) can play a role by educating content teams on metadata creation" (8, page 509). As previously discussed, some tools can be configured to enforce certain rules; however, some standards require the human eye. In those cases, the content librarian can audit metadata application before content is approved, ensuring the metadata values chosen by the content developer meet quality standards. You can liken the role of a content librarian to that of an editor. Instead of reviewing content against structure and style rules, the content librarian reviews metadata against metatorial guidelines, ensuring that metadata application is consistent throughout all content in the CMS. Boiko refers to this as "a central point of decision" (8, page 511). The more complex the metadata and content, and the more users who access it, the more critical such a point of decision becomes.

On the other hand, is having the content librarian audit metadata application by content developers sufficient, or should the content librarian apply all metadata to content, completely releasing the content developer from such a burden? According to Boiko, "the task of looking after the metadata health of a CMS is crucial if you want to be confident that the content in the system will show up when you expect it to" (8, page 511). This is the point: content must be retrievable so that you can then reuse it. If you want to be completely sure that metadata is applied consistently across all content, regardless of who originated it, then having a content librarian perform this task is as close to a guarantee as you might get.

However, some organizations do not have the resources to staff a content librarian. In that case, an editor might take this on as a new role. If resource constraints are an issue, some organizations must rely on content developers to apply user-driven metadata. In this case, the metatorial guide is what you are betting on, and it must be rock solid.

In our case, we rely on content developers to apply user-driven metadata. The editors are charged with reviewing metadata as they would any other content. The content librarian is consulted when questions arise, and also audits content in the CMS for consistency. Ultimately, the content librarian is the point of decision and is responsible for educating others and maintaining the metatorial guide. We have also staffed a trainer who works with the content librarian to develop metadata training for all (content developers and editors). The primary reason we have this model is to share the workload; we do not have the resources to assign such a role in a full-time capacity. Regardless of who is doing it, applying metadata well "requires a lot of human energy" (8, page 495).

Step 5: Determine What Metadata Components Can Be Automated

Determining which metadata components, if any, can be automated is important at this stage in developing a metadata strategy. Some components need the human touch for quality purposes, or because tools such as the CMS are not able to automate the application of such metadata. However, when possible, utilize automation. The options for this will vary depending on the tool. In our case, we looked to the CMS for automating the application of metadata.

Why automate? Automating metadata application lessens the burden on the content developers and helps avoid inconsistency. In addition, if it is "up to the author to remember to add the metadata in all the relevant places", it is a "recipe for missed metadata" (6, page 200). As Boiko writes, "without a rigorous consistency and careful attention, metadata becomes useless" (8, page 495). He goes on to say that "someone or, better, some system is necessary to ensure that people handle metadata thoroughly and consistently" (8, page 495). So if the CMS can handle it, automate it!

What metadata makes a good candidate for automation? From our experience, metadata with a yes or no value should be automated if the question can be answered by data that is accessible to the CMS. For example, to answer the question "Is the content being reused?", populate the reuse attribute with either "yes" or "no." In our case, if content lives within a specific CMS collection, then it is reused; otherwise, it is not. Our CMS is smart enough to answer this question based on the location of the content (whether it resides in a certain collection), so we let it answer that question for us.

Metadata containing a value that is definite should also be automated. For example, the originator attribute can be populated with the username of the person who created the content, because the CMS knows who that person is. Likewise, the CMS knows who is reusing content because it can follow the reference to the content back to the username that created the reference. As a result, we let the CMS capture the username for us by adding it to the reuser attribute.
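For illustration, content stored in the CMS might carry system-driven values like the sketch below. The attribute names and values echo Table 3, but the markup itself is our own sketch and the usernames are invented:

  <!-- System-driven values: populated by the CMS, never typed by authors -->
  <section status="Approved" collection="Section" reuse="yes"
           originator="jdoe" reuser="asmith">
    <!-- ... module content ... -->
  </section>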

On the other hand, what metadata should not be automated? Metadata requiring a discerning human eye should not be automated. For example, a person is needed to determine the subject of the content. One could argue that if the content contains a title, the subject could be leveraged from the title. However, not all content chunks include a title. As a result, we do not automate the ContentType metadata element.

A gray area might be keywords. In our case, we depend on a person to assign keywords. This person is typically the content developer, with some assistance from the content librarian if required. As content grows, new keywords might be necessary. If they are not part of the controlled vocabulary, the content librarian can make note of that and modify the list as needed. From our experience, controlled vocabularies are certainly living lists, as previously discussed.

Table 3 shows our system-driven metadata, including metadata used to manage the status of content (whether or not it can be reused).

Be sure to also consider the risks of automation. Boiko states that "the problem isn't to find the blanks, but to correctly fill them" (8, page 509). The key word here is "correctly". Similarly, Rockley explains that "[i]mproperly identified metadata ... can cause problems ranging from misfiled and therefore inaccessible content to even more serious problems ..." (6, page 185). In our case, inaccessible content would be a deal breaker, since our primary goal is retrieval for reuse. It is critical that metadata applied automatically by the CMS meets the highest quality standards. There can be no room for incorrectly applied metadata or for the possibility of inaccessible content.

Consequently, if you rely on the CMS to automate the application of metadata, make sure it is fool-proof (tool-proof).

Step 6: Ensure That Users Will Apply the Metadata

Once you have determined which metadata components can be automated, the remaining components will be user-driven. The next step is to ensure that users will apply them. As Rockley notes, metadata is "only valuable if it gets used" (6, page 200).


Table 3: System-Driven Attributes

Status
Values: Working, Approved
Goal: External use in the authoring environment. Used upon extract, together with style sheets, to lock content from changes if approved.

Collection
Values: Section, Chapter, Glossentry
Goal: Used to properly reload content to the correct collection.

Reuse
Values: Yes, No
Goal: If yes, content is from an approved collection. Used to color-code approved content so that reviewers and editors know it has already been approved.

Originator
Values: Username
Goal: Who created the content; used by CMS workflow.

Reuser
Values: Username
Goal: Who is reusing the content; used by CMS workflow.

One method to ensure users apply metadata is to configure your tools with metadata requirements. The DTD behind an authoring tool can utilize occurrence rules to require specific metadata components to be added (such as requiring that at least two keywords be present). A CMS can be configured to enforce the same rules. In our case, we have rules established in both tools. Regardless of which tool the metadata is applied in, the user must meet certain requirements. The tools alert the user when those requirements are not met.
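As a sketch of such an occurrence rule (the DTD syntax is standard, but the content model below is our illustration rather than the team's actual declaration), a DTD can require at least two keywords like this:

  <!-- One keyword followed by one or more: at least two in total -->
  <!ELEMENT keywordset (keyword, keyword+)>
  <!ELEMENT keyword (#PCDATA)>

Note that a rule like this can only require the elements to be present; as discussed next, it cannot guarantee that they contain values.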

We have found that the CMS provides greater specificity than our authoring tool in enforcing such requirements. While the DTD behind the authoring tool can require that metadata components be present, it cannot enforce that values be added to those components. For example, in the authoring environment, a user could add two keyword elements but leave them empty, with no values assigned. Technically, they would meet the DTD rules. The CMS provides the additional reinforcement. In our case, content shows as incomplete unless all metadata requirements are met: the components must be present (for example, two keywords) and each must contain a value. Because all content is ultimately managed in the CMS, it becomes the final checkpoint.

To fully utilize the benefits of metadata, however, users must do more than just apply metadata to their content; they must apply the appropriate metadata. A well-designed metadata strategy ensures that the metadata components and values are tailored to the needs of the user; metadata guidelines assist users with the tasks they need to accomplish and include terms they will use when retrieving content. But as previously discussed, users do not all think the same way. This is where having a controlled vocabulary is a must. Even though one user might be inclined to search on "America" and another might search on "U.S.A.", they will both work off of the same list of terms, which in this example could include "United States". Such search standards can be taught, and will ensure effective search results rather than wasting the user's valuable time.

There are other ways to assist users in metadata application. One is to provide templates that are pre-populated with the required metadata components. We have done this in our authoring toolset; the content developer only needs to assign values to the components. Another method, which both assists users and provides a level of control that complements occurrence rules, is described by Boiko as "categorizing metadata by its allowed values" (8, page 506). For example, we use a "closed list," which allows users to select a value from a predefined set of terms, or a controlled vocabulary (8, page 507). In our case, the controlled vocabulary is built into the authoring tool and the CMS. The user cannot type in metadata values; the only option is to select them from a list.
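At the DTD level, one way a closed list can be enforced for attribute values is with an enumerated attribute type. The declaration below is illustrative only (our invented example, not the team's DTD); values for elements such as keywords are validated against the controlled vocabulary by the authoring tool and CMS instead:

  <!-- Enumerated attribute: the user must pick one of the listed values -->
  <!ATTLIST section
            status (Working | Approved) "Working">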

Step 7: Assign Metadata Tasks to Roles

To ensure your metadata goals become part of your business processes and tool environment, assign roles to team members who can implement the metadata strategy. These assignments streamline the implementation effort. Table 4 describes each of these roles.

Metadata is dependent on many contributors. While tool administrators can ensure that system-driven metadata and automation are set up behind the scenes, they are not the sole contributors. The realization of your strategy becomes much easier with all team members involved.


Table 4: Roles and Responsibilities

Content Librarian
Responsible for the quality of modular content in the CMS as well as for flagging opportunities for reuse across the database. The content librarian’s tasks include:
* Assisting content developers, when needed, in understanding the metadata guidelines.
* Maintaining the metadata guidelines document and the master list of values.
* Reviewing and accepting or rejecting new metadata value requests.
* Notifying tool developers when new metadata values need to be added to the tool set.
* Auditing the quality of the metadata values that content developers apply before the content can be made available for reuse.
* Overseeing content change proposals for reused modules and validating the requests for changes.
* Facilitating the review process to ensure all Reusers participate by either accepting or rejecting their changes.
* Implementing the final result by either overwriting the original “approved” content in the CMS, or by creating a variant of the original “approved” content.
* Populating the CMS with common queries to assist content developers with locating content to be reused or leveraged.
* Assisting content developers when more specific search criteria are needed for database queries to locate content to be reused or leveraged.

Editor
Manages edits at the sentence level and reviews content against style guidelines. The editor’s responsibilities include:
* Reviewing metadata values as part of the literary edit to ensure consistent usage.
* Maintaining an eye toward content that can be leveraged or reused when a content developer opts to create new content.

Originator (Content Developer)
Identifies the need for and creates reusable modules of content using content mapping. The originator’s responsibilities include:
* Identifying unique, similar, and identical content across the deliverables set.
* Capturing metadata values for identical content.
* Analyzing similar content for opportunities to make it identical.
* Creating reusable topics of information.
* Requesting new metadata values as needed via the CMS workflow.

Reuser (Content Developer)
A content developer who uses metadata to query the CMS for reusable content. The reuser’s responsibilities include:
* Reusing “approved” content by referencing it in deliverables.
* Initiating the change proposal workflow as needed to request changes to “approved” content.
* Reviewing change proposals from other reusers.


Table 4: Roles and Responsibilities (continued)

Tool Administrators
Responsible for configuring and managing the tool set. Examples of tool administrators include the DTD Developer, Authoring Tool Developer, CMS Administrator, and Publishing Tool Developer. The tool administrators’ shared responsibilities include:
* Addressing requirements versus options.
* Automating processes where possible.
* Ensuring that the tools support the reuse and metadata strategy.

DTD Developer
The DTD developer’s responsibilities include:
* Managing DTD elements, attributes, and occurrence rules.
* Communicating with the CMS Administrator when DTD changes are needed.

Authoring Tool Developer
The authoring tool developer’s responsibilities include:
* Making templates available for new content creation.
* Automating the addition of required child elements when a parent element is selected.
* Maintaining pre-populated menus with required user-driven metadata values.
* Providing links to support documentation from the authoring tool menu.

CMS Administrator
The CMS administrator’s responsibilities include:
* Managing content collections for editing, loading, and extracting behaviors, including extracts directly to the publishing tool.
* Maintaining user roles and privileges.
* Configuring the CMS to ensure alignment with DTD rules.
* Setting up CMS-specific elements and attributes as needed.
* Tightening structure rules by requiring text to be present and/or valid values to be used for applicable elements and attributes.
* Making components, properties, and operators available to help ensure effective query options.
* Maintaining pre-populated menus with required user-driven metadata values.
* Implementing visual aids to assist users when viewing content in the CMS.
* Automating the capture of system-driven attributes.
* Creating workflow configurations to support CMS-assisted procedures.

Publishing Tool Developer
The publishing tool developer’s responsibilities include:
* Creating and maintaining style sheets for use in the authoring and publishing tools.
* Developing authoring tool scripts to provide visual cues for reused content to content developers, editors, and Subject Matter Experts.


It is critical that team members have a clear definition of their roles and understand how each role contributes to the overall success of the strategy.

Step 8: Prove That Your Strategy Is Sound

Readiness to release metadata to production can take months or years, depending on the complexity of your strategy. Because our organization did not always have dedicated resources to devote to this implementation, tracking our progress via a schedule was absolutely essential. As priorities shifted in our organization, deliverables were either pulled in or pushed out as needed. Shifting priorities and balancing resources may ultimately determine the time needed to develop and implement your metadata strategy. While it took our team a number of years, we understood the return on investment. Had we given in to the pressure to release any sooner, we would have had a less effective, less efficient, and less robust metadata strategy. And because metadata is truly the backbone of our reuse strategy, skimping was not an option.

It is also necessary to come to an agreement with management as to what qualifies the strategy to be ready for release. For example, we negotiated to have a full quarter of simulation testing, and agreed that we would only release to production if simulation testing resulted in zero process or tool issues. Test scenarios should be as realistic as possible. In our case, we used actual customer content, assigned roles, and created scenarios to put our business and tool processes to the test. The testers received new test scenarios each week so that they couldn’t see what was coming next. Our support staff, including the editor and the content librarian, were also given test scenarios. In some cases, we set up intentional conflicts to ensure users knew how to handle them.

Before you can release your metadata strategy to production, you must ensure that:

• Your tools are functioning as expected to support the strategy.
• Roles and expectations are clear.
• The metadata guide is available.
• Training has occurred, including making all users aware of the importance of metadata (resulting in a willingness to use it).
• No gaps have been identified.
• There are no technical issues with any of the processes supported by the tools.

After all of the “human energy” (8, page 495) spent on creating your metadata strategy, don’t short-change yourself by rushing through the testing process. When you do release your metadata strategy, you want to know it is rock solid.

Metadata in Action

As previously discussed, our goal is to enable effective reuse by making content easy to find. Because the originating content developer (Originator) added metadata for reuse, other content developers (Reusers) can query on those values. Figure 1 shows some of an Originator’s content.


In some cases, a content developer might know the content exists, and is already familiar with it. In that case, she would have knowledge of the metadata values that are likely to be associated with the content.

In other cases, the content developer has a need for content, but is not sure it exists. Rather than creating it from scratch, she searches the CMS to see if content exists that she can reuse or at least leverage. For example, a user needs content specific to installing NAS products onto servers. Because our CMS is configured with drop-down lists of valid values (controlled vocabulary metadata values), the user selects the appropriate metadata elements and values from the list.

In this example, the content developer would query on:

• Content Type = Installing;
• Product = NAS;
• Keyword = servers.

The content developer can search on one or more metadata elements, as shown in Figure 2. Combining multiple metadata elements provides narrower results. In the preceding example, over 1,200 section modules were queried, resulting in one section that met the content developer’s query (shown in Figure 3). At this point, the content developer can review the content in more detail and decide if she can use it as is, or leverage it.
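As a rough illustration of why combining elements narrows results, consider this Python sketch. The field names mirror the example above, but the module records and the query function are invented; a real CMS query would go through the system’s own interface:

    # Hypothetical module records; a real CMS stores these with the content.
    modules = [
        {"id": "sec-0001", "ContentType": "Installing",  "Product": "NAS", "Keyword": "servers"},
        {"id": "sec-0002", "ContentType": "Installing",  "Product": "SAN", "Keyword": "switches"},
        {"id": "sec-0003", "ContentType": "Configuring", "Product": "NAS", "Keyword": "servers"},
    ]

    def query(modules, **criteria):
        # AND logic: a module must match every criterion, so each added
        # element narrows the result set.
        return [m for m in modules
                if all(m.get(field) == value for field, value in criteria.items())]

    print(query(modules, ContentType="Installing", Product="NAS", Keyword="servers"))
    # -> only sec-0001 survives all three filters

Querying on ContentType alone would return two of the three modules; adding Product and Keyword leaves exactly one, which is the narrowing effect described above.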

Figure 2: Searching Using Metadata

Figure 1: Originator Content


It is easy to see that without the metadata to support the query, the content developer would likely never have located the content she needed. She probably would have just created a new section, duplicating existing data. She would have spent time doing this, taking away from her other work. In addition, the CMS would have become populated with redundant content.

Even when a content developer locates content that already exists, it might not fully meet her needs. In that case, she can propose changes to the content. We use the Change Proposal Workflow feature in our CMS to manage this process. The workflow has the following steps, shown in Figure 4:

• Proposal: The content developer copies content to a working collection, makes changes as needed, and initiates the workflow.
• Review: The content librarian validates the request. System-generated email notifications are sent to all Reusers. The content librarian facilitates an offline review and mediates any counter-proposals.

• Outcome: If all Reusers accept the change proposal, the CMS automatically overwrites the original content in the approved collection with the changed content. A system-generated email notification is sent to all content developers, letting them know that the workflow has been completed.

• Relink: If only some Reusers accept the change proposal, the content librarian assigns a unique ID to the content, and the CMS automatically moves the variant to the approved collection. A system-generated email notification is sent to all content developers, reminding them to relink to the variant as needed.
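Reduced to its essentials, the outcome logic of this workflow might look like the following sketch. The function and field names are ours, not the CMS vendor’s; this is a hypothetical model of the behavior described above, not the actual workflow API:

    import uuid

    def resolve_change_proposal(original_id, changed_content, votes):
        # votes maps each Reuser to True (accept) or False (reject).
        if all(votes.values()):
            # Unanimous acceptance: overwrite the approved original.
            return {"action": "overwrite", "id": original_id, "content": changed_content}
        if any(votes.values()):
            # Split decision: the variant gets its own unique ID and is
            # moved to the approved collection alongside the original.
            return {"action": "create_variant", "id": str(uuid.uuid4()), "content": changed_content}
        # Nobody accepted: drop the proposal; the original stays untouched.
        return {"action": "reject", "id": original_id}

    print(resolve_change_proposal("sec-0001", "<revised text>", {"alice": True, "bob": False}))
    # -> a variant is created, and Reusers are notified to relink as needed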

Figure 3: Reused Content

Figure 4: Change Proposal Workflow


If, on the other hand, query results show that new content needs to be created, the content developer can do so. Adding the metadata elements to the new content will help ensure that other content developers can locate the content for future usage.

Summary and Conclusion

Proving the soundness of metadata in our case entailed extensive collaboration and testing among team members. Key areas of focus included:

• Checking and rechecking that our metadata values were entered into the tools correctly.
• Ensuring high usability in the tools and in written processes so that team members could add metadata easily.
• Configuring our tools to easily locate metadata and to indicate if values and elements were missing.
• Proving the concept that our metadata would enable us to locate content effectively for the purpose of reuse.

The ultimate test of success is verifying that implementing metadata allows your organization to achieve the goals you identified at the outset.

There is little guidance available on how to develop a metadata strategy. While some industries have developed specifications tailored to their content, others seem to be starting at square one. Technical communication, as it relates to the computer industry, could benefit from more substantial models to follow. A specification to use as a starting point, providing a list of components, possible values, and the pros and cons of building on it as a foundation, would help companies get started with their metadata strategy.

References

1. Baca, Murtha, ed. Introduction to Metadata, 2nd Edition. Los Angeles, CA: Getty Publications, 2008.

2. Dublin Core Metadata Initiative. “Dublin Core Metadata Element Set, Version 1.1.” 2008. Retrieved on 20 Mar. 2009. http://dublincore.org/documents/dces/.

3. International Organization for Standardization. “ISO 15836:2009, Information and documentation - The Dublin Core metadata element set.” 2009. Retrieved on 21 Feb. 2009. http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=52142.

4. American National Standards Institute. “The Dublin Core Metadata Element Set.” ANSI/NISO Z39.85. Bethesda, MD: NISO Press, 2007. Retrieved on 8 Jun. 2008. http://www.niso.org/kst/reports/standards/kfile_download?id%3Austring%3Aiso-8859-1=Z39-85-2007.pdf&pt=rkGKiXzW643YeUaYUqZ1bFwDhIG4-24rJbcZb-Wg8uE4vWdpZsJDs4rjLz0t90_d5_ymGsj_IKVa86hjP37r_hFEijh12LhLqJw52b-5udAaMy22WJJl0y5GhhtjwcI3V.

5. American National Standards Institute. “Guidelines for the Construction, Format, and Management of Monolingual Controlled Vocabularies.” ANSI/NISO Z39.19-2005. Bethesda, MD: NISO Press, 2005. Retrieved on 9 Jun. 2008. http://www.niso.org/kst/reports/standards?step=2&gid=None&project_key=7cc9b583cb5a62e8c15d3099e0bb46bbae9cf38a.

6. Rockley, Ann. Managing Enterprise Content: A Unified Content Strategy. Indianapolis, IN: New Riders Publishing, 2002.

7. National Information Standards Organization. “Understanding Metadata.” Bethesda, MD: NISO Press. Retrieved on 21 Feb. 2009. http://www.niso.org/publications/press/UnderstandingMetadata.pdf.

8. Boiko, Bob. The Content Management Bible, 2nd Edition. Indianapolis, IN: Wiley Publishers, 2005.


Ann ROCKLEY ([email protected]) is President of The Rockley Group, Inc. She has an international reputation for developing intelligent content management strategies and their underlying information architecture. She has been instrumental in establishing the field in online documentation, single sourcing (content reuse), unified content strategies, and content management best practices. Rockley is the primary author of ‘DITA 101: Fundamentals of DITA for Authors and Managers’ and ‘Managing Enterprise Content: A Unified Content Strategy’.

16 An Intelligent Content Strategy

Managing Content

Ann Rockley

Abstract

Today’s businesses are overwhelmed with the need to create more content, more quickly, customized for more customers and for more media than ever before. Combine this with decreasing resources, time, and budgets, and you have a stressful situation for organizations and their content creators. Content is usually managed as documents, but when we manage documents we manage “black boxes” of content. We cannot easily get at the information in the document to use elsewhere in another way. We need the ability to easily find, repurpose, and deliver to multiple channels on the fly. Intelligent content is structurally rich and semantically aware, and is therefore automatically discoverable, reusable, reconfigurable, and adaptable. An intelligent content strategy bridges organizational content silos and delivery channels. Traditional content management typically manages documents or web pages. Intelligent content is typically managed as content components that are developed using XML. It makes sense to implement an intelligent content strategy if you have a lot of opportunities to reuse content in many different content deliverables, to translate your content, or to deliver content in multiple channels.


Background


A medical device company struggled to create marketing content for multiple channels and ensure that their messaging would meet regulatory requirements. Regulatory approval often came just before launch, and it was extremely difficult to make changes across multiple channels at the last minute without missing anything. To make things more difficult, the content was translated into 40 different languages!

Marketing content included brochures for use in doctors’ offices and pharmacies, web-based product pages, a newsletter, letters to customers encouraging them to upgrade their device, press releases, presentations, and product packaging. There were four variations of the product.

The medical device company decided they needed a Content Management System (CMS), and started to look at tools. We pointed out to them that tools are not the whole answer; they could have the best tool in the world, but if they did not start with a clear understanding of their needs and a hard look at their content, the tool would provide some productivity gains, but little in improved customer satisfaction and real return on investment. We analyzed a small sample of their content and developed a preliminary return on investment (ROI) to illustrate the value of analyzing their real requirements and modifying their content before they selected a tool.

The development of their intelligent content strategy involved four phases:

• Analysis of their business needs, current content lifecycle, and existing content.
• Development of an information architecture to support information products, structured content, reuse, and metadata.
• Creation of intelligent processes to create and manage content.
• Adaptation and implementation of new methodologies and processes.

The Skills Needed to Perform This Best Practice

To do the procedure detailed in this work, you need staff with the following skills:

• Business Analyst: Someone with strong business analysis skills to analyze the needs of the organization and the current content lifecycle.
• Information Architect: Someone with a good understanding of the content in all its forms and strong content analysis and design skills. This person will build the information product models, element models, metadata and reuse strategies; they need to understand the requirements of multichannel content delivery in addition to the web. This skill set can be taught to a senior writer or communication analyst.
• Technology Strategist: Someone with a strong understanding of technology. This person will work with both the business analyst and the information architect to define the technology requirements and assess and select an appropriate solution that meets organizational needs.
• Content Coordinator: The content coordinator is responsible for working across the content silos to get buy-in for the intelligent content strategy, and for ensuring that the intelligent content strategy is being effectively addressed. In particular, the content coordinator needs to be able to oversee many projects and determine the intelligent content strategy required to address the needs of all the project owners, the needs of the organization as a whole, and the needs of the customers.
• Change Manager: With the adoption of content management and the move to an intelligent content strategy, you need to address the issue of change management. The change manager is someone who is able to identify the potential issues a given change will bring and can determine effective ways to help people adapt.
• Trainer: A good trainer or series of trainers is required to teach employees not only how to use the new technology, but also to help them understand the intelligent content strategy and adapt to the paradigm shift it brings.

Step 1: Analyze the Information

Analysis is key to a successful intelligent content strategy. Before we can develop this strategy, we need to figure out what is going on with the content, how it is being used, and how it is being managed, as well as the processes used to create, publish, and store it. During the analysis phase, we determine the major pain points, identify the content lifecycle, and perform a content audit.

Every organization (and every department within an organization) faces challenges in successfully meeting goals. This company was no exception. To implement an intelligent content strategy, it is good to start “where it really hurts”—where there are significant content management and authoring issues. To discover where the organization was hurting the most, we needed to understand the dangers and challenges facing them, the opportunities that could be realized if change occurred successfully, and the strengths we had to build on to implement these changes. We conducted an audit to determine this information, and found the following pain points:

• Subject matter experts (engineers, medical advisors) were frustrated with the amount of time required of them to educate all the content creators about the product and then review the content. They never seemed to be able to get their “real” work done.
• No one knew where the most current version of content resided. Sometimes it was on locked servers, and sometimes it was simply attached to an email.
• Reviewers hated the deluge of emails they received, and were confused about what they should be reviewing. They were also frustrated because they seemed to be reviewing the same content over and over.
• When a regulatory agency requested a change, authors found it difficult to find all occurrences of the content to change, and changes across channels (e.g., web and print) were often implemented differently.
• Translation was tired of having everyone accuse them of unreasonable costs and time delays. They knew there was a better way to prepare content for translation, but no one would listen to them.
• Management felt that the level of quality was inadequate, and was dismayed by rising costs.

To implement an intelligent content strategy, you need intelligent processes so that everyone involved in developing, storing, and publishing content does it the same way, or—at minimum—is able to interact effectively with each other and share content. Within the organization, content was developed in many different ways, by many different people, and by many different departments. Some groups followed a predefined process, while others did not. Even when we found a “common” process, it often differed from department to department. To understand where to focus our efforts, we examined the content lifecycle and any issues associated with it. The company’s content lifecycle involved these steps:

• The team responsible for the brochures created the content and handed it off to an external vendor for layout in a high end desktop publishing tool, where the content was further refined to fit the space.
• The web team developed the content for the web; it too was crafted to fit.
• The packaging team asked for marketing content for the box. The key features and benefits information was written in a form that fit into the box real estate, and then handed off for integration with the legal and regulatory information. Using a high end page layout tool, they laid out the box “wrapper”.
• Yet another person crafted a letter and content for a coupon that encouraged existing customers to upgrade.
• The letter writer also created the press release.
• Yet another writer created sales training materials.
• All content was circulated via email for review and approval.

Before you can model your content—and subsequently add intelligence to it—you need to gain an intimate understanding of its nature and structure. During the content audit, we looked at content analytically and critically, allowing us to identify opportunities to structure content, design content for reuse, and add intelligence to the content. The analysis of the content lifecycle identified the following issues:

• Content was written and rewritten for multiple documents, refined and styled for the particular channel. At each iteration of the content, the words were changed slightly to accommodate the media and the preferences of each individual author. This meant that there were numerous variations in the content, resulting in inconsistent messaging and large translation costs.
• Circulating content by email was a real problem, as it was often difficult to tell if one had the latest version. Mistakes had been made in the past, when content was posted only to find out that it was not the latest version. There was a particularly expensive mistake when wrong content went out in a brochure and tens of thousands of brochures had to be scrapped.
• Translation costs were exorbitant, as content was translated and retranslated and retranslated for each channel and deliverable.
• Everyone was overworked and frustrated with the content lifecycle process. There was no such thing as a normal work week. When one product was finished, there was already another one coming up behind.

Step 2: Design an Information Architecture

In the design information architecture step, we build information models and define the metadata that describes the content. Information models and their supporting metadata are critical to the success of an intelligent content strategy; they become the roadmap the intelligent content strategy follows.

Create an Information Model

Models formalize the structure of content in guidelines, templates, and structured frameworks such as DTDs or schemas. Through information modeling, we identified and documented the framework upon which the intelligent content strategy would be based.

The information modeling process considered all information requirements (either for a specific project or within an entire organization) and assessed what information was available to fulfill those requirements. The information model became the “catalog” of all the information products, and outlined the necessary information elements for each.

During the information modeling phase, we determined the elements required for each information product (or output) and described how each information product would be designed for optimum usability and reuse.


The information models were used to develop templates that authors filled in, providing guidance on what type of information should be included in a particular structural element.

Use DITA for Structure

Many organizations are beginning to adopt Darwin Information Typing Architecture (DITA) as a structure for their content. DITA is an open content standard that defines a common content structure that promotes the consistent creation, sharing, and reuse of content. DITA is supported by the Organization for the Advancement of Structured Information Standards (OASIS) and is the fastest growing XML content standard (1). It was originally developed for technical documentation, but is now being adopted for business documents and pharmaceutical materials.

It is useful to look at DITA’s name in reverse order to understand its meaning.

• Architecture: IBM was the creator of DITA, but gave it to OASIS to make it available as a standard. IBM started off with the goal of creating commonly structured content for sharing and reuse. To do this, they developed an architecture to provide a common set of guidelines for how to build content. When we have guidelines and a common way of structuring our information, it makes that information much easier to create and much easier to share.
• Information Typing: When we analyze content, we see a number of common kinds (types) of information. For example, on average, technical documentation consists of 60% procedural (task-oriented) content. Since DITA was originally developed to support technical publications—in particular software-oriented Help materials—it provides built-in structures for the following information types:
  - Concept: Conceptual (descriptive) information that is used to provide context or understanding.
  - Task: Task-oriented (procedural) information that is used to define “how to” information.
  - Reference: Reference (look-up) oriented information that is used to define specific, often detailed information or to bring together at-a-glance information (e.g., in tables) that is referred to (consulted) on an as-required basis.
  You can use these standard information types to build unique sets of information. Typing information makes us think more clearly about what kind (or “type”) of information we are authoring so that it is consistent and coherent.
• Darwin: Charles Darwin is often described as the “father” of evolution. He was a naturalist who proposed the idea that organisms have evolved over time from common ancestors. IBM made the assumption that DITA would evolve over time to support a broad range of information types. In other words, it is not static—if you need a non-standard information type, such as a troubleshooting topic, you can develop that “specialized” topic and still ensure that it is consistent with the standard information types.
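To make the “task” information type concrete, the following Python sketch assembles a minimal DITA task topic. The element names (task, title, taskbody, steps, step, cmd) follow the DITA standard, but the topic content is invented:

    import xml.etree.ElementTree as ET

    # A minimal DITA task topic, built programmatically for illustration.
    task = ET.Element("task", id="installing_nas")
    ET.SubElement(task, "title").text = "Installing the NAS product"
    steps = ET.SubElement(ET.SubElement(task, "taskbody"), "steps")
    for text in ("Unpack the device.", "Connect it to the server.", "Power it on."):
        ET.SubElement(ET.SubElement(steps, "step"), "cmd").text = text

    print(ET.tostring(task, encoding="unicode"))

Because every task shares this skeleton, tools and authors can rely on the same structure whether the steps describe installing a device or configuring software.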

Since we had identified a lot of opportunities to reuse content and were delivering to multiple channels, DITA made a lot of sense. We mapped the content models to the standard DITA structures and developed writing guidelines to help authors write their content to fit the DITA structures.

Develop a Reuse Strategy

A reuse strategy identifies what types of content will be reused, the level of granularity, how content will be reused, and how to support authors in easily and effectively reusing content. Reuse is controlled through business rules or governance. For example, what happens when an author creates content which is approved?


This content then becomes the source. Another author chooses to reuse the source content. They are in the process of modifying it when a third author decides to reuse the source, and begins making changes to it. What do you do? Will there be two derivatives (variations)? What if a change is made to the source while the secondary authors are working on their content? Decisions on how to address these reuse issues become the business rules that govern reuse. If the reuse strategy and governance are built into the intelligent content strategy from the beginning, it is just one more change that authors adjust to, rather than an imposition at a later date.

We developed a strategy that addressed the goals of the customer content, the way that authors worked, and the governance rules to control reuse, and we made sure that the reuse strategy could be supported by the selected technology.

Define Metadata Needs

Metadata aids in information retrieval, both for content authors retrieving content from the CMS, and for customers retrieving content from a website or within a document. In addition to using traditional metadata for information storage and retrieval, we also needed to define metadata for every level of content in the system (for example, document, section, subsection, component, and content fragment). Component-based content metadata was created to identify content for:

• Channel (print, web, wireless).
• Filtering (product, customer segment/audience, region, product version).
• Information Product (brochure, packaging, letter, web).
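A component’s metadata record under such a scheme could be as simple as the following sketch; the values and the helper function are illustrative only, not the schema we actually deployed:

    # Hypothetical metadata for one reusable component; each axis below
    # mirrors the categories listed above.
    component_metadata = {
        "component_id": "product-description-001",
        "channel": ["print", "web"],
        "filtering": {"product": "Model B", "audience": "patient", "region": "EMEA"},
        "information_product": ["brochure", "letter"],
    }

    def usable_in(metadata, channel, information_product):
        # A delivery pipeline would select components whose axes intersect
        # the requirements of the deliverable being assembled.
        return (channel in metadata["channel"]
                and information_product in metadata["information_product"])

    print(usable_in(component_metadata, "web", "brochure"))   # True

Delivery tooling then assembles each deliverable by intersecting these axes with the deliverable’s own requirements, rather than by hand-picking files.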

Step 3: Create Intelligent Processes

An intelligent content strategy involves people and intelligent (collaborative) processes, as well as information. The intelligent processes must create a collaborative environment in which authors throughout the organization can contribute to and draw content from a definitive source of information. Collaboration ensures that the content components, such as product descriptions, are consistent and can be reused wherever they are required—in a printed brochure, on the web, in letters, and so on.

We worked closely with the authors to assist them in collaborating, moving away from individuals writing content to a team approach for content development. We defined an organizational structure where a small group of writers was responsible for writing the core information (the information that is reused), while others were responsible for identifying how the information set needed for a specific channel differed from the core and adding information to cover those differences.

We also needed to redesign the company’s processes to match the intelligent content strategy and support the way the authors worked. We used workflow to formalize and support the new processes. We redesigned processes for:

• Review (content components, sections, and whole documents).
• Ownership and reuse rights.
• Translation (content components, sections, and whole documents).
• Handoff of XML to the creative department and outside vendors.


Step 4: Implement the Strategy

An intelligent content strategy needs much more than just a CMS; an effective strategy begins at the authoring stage and ends at the delivery stage. We recommended an authoring, content management, and delivery tool solution.

Begin at the Authoring Stage

Before content can be managed, manipulated, or reused, it must be created. To support an intelligent content strategy, content must be written so that it can be structured and reused according to the content strategy. This means using a structured authoring tool. Structured authoring tools can be upgrades to existing tools, or XML native. Based on the audience, which was largely a Marketing Communications team, we recommended an upgrade to their existing tool, allowing them to continue working in a familiar environment.

Choose the Right Type of CMS

The most common CMSs are web CMSs. While they support web and often wireless content very well, they usually do not support non-web content. Enterprise Content Management (ECM) systems tend to manage documents, not components. Intelligent content really needs an XML-aware system like a Component Content Management (CCM) system. CCM systems manage content at a granular (component) level, rather than at the page or document level. Each component represents a single topic, concept, or asset (such as an image or table). Components are incorporated into multiple content assemblies (content types) and can be viewed as components or as traditional pages or documents. Each component has its own lifecycle (owner, version, approval, use) and can be tracked individually, or as part of an assembly. CCM is typically used for multi-channel customer-facing content (marketing, usage, learning, support). CCM can be a separate system, or a functionality of another content management type, such as ECM. We were happy to learn that they already had an ECM system that could support CCM, and we only had to recommend an upgrade to the latest version of their existing system, rather than the purchase of a new system.

End at the Delivery Stage

Delivery systems have many different capabilities. The CMS may have built-in facilities for delivering content, or you may have to integrate a delivery system with your CMS. Some delivery systems will enable you to deliver to a variety of outputs (web, HTML, PDF), while others may be restricted to a single output. Due to the heavy reliance on visual layout for their materials, the company wanted to keep their existing layout tools. Unfortunately, the layout tools did not support DITA-based content, so we recommended building a tool that transformed the content from DITA to the layout tools. We normally like to use off-the-shelf tools, but that was not possible here.
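Conceptually, such a transform flattens structured XML into whatever the layout tool can import. The toy Python version below shows the shape of the idea; the ParaStyle tag syntax is only loosely modeled on desktop-publishing tagged-text formats and is not any specific vendor’s import format, nor the actual tool we built:

    import xml.etree.ElementTree as ET

    DITA_TASK = """<task id="installing_nas">
      <title>Installing the NAS product</title>
      <taskbody><steps>
        <step><cmd>Unpack the device.</cmd></step>
        <step><cmd>Power it on.</cmd></step>
      </steps></taskbody>
    </task>"""

    def to_layout_tags(dita_xml):
        # Flatten a DITA task into tagged text that a layout tool could import.
        topic = ET.fromstring(dita_xml)
        lines = ["<ParaStyle:Heading>" + topic.findtext("title", "")]
        for cmd in topic.iter("cmd"):
            lines.append("<ParaStyle:Step>" + (cmd.text or ""))
        return "\n".join(lines)

    print(to_layout_tags(DITA_TASK))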

Summary and Conclusion

The content authors took to structured writing and reuse with no problem, but creative services found working with XML somewhat of a challenge. It took a while before they felt comfortable with the new processes.

The content management vendor provided good integration with the authoring tool, but declined to provide support for the publishing processes. We smoothed the transition to structured writing, content conversion, and new processes, and jumped in to find a systems integrator for the publishing issues.


The Intelligent Content Strategy was a success. The medical device company realized their ROI in the first year. Translation costs were significantly reduced through reuse and the elimination of desktop publishing. Productivity increased by 40%, and errors decreased by 10%.

Changing the way you do business is not without its challenges, and implementing an intelligent content strategy is no exception. Challenges can include the complex nature of the content, the technology (or lack thereof), and the effects of organizational change.

While the concept of reuse is not new, developing an information architecture to support reuse is. There is a history of information architecture design for SGML—and now XML—but much of the work does not address the issues of designing for reuse. This means that there are few standards and guidelines for doing it well. Constructing reusable components and frameworks is inherently abstract, making it hard for most people to envision how a reuse environment could work. Through thorough analysis and careful design, followed by prototyping and piloting your content in phases, you can reduce the risks of developing an ineffective architecture.

Technology is changing rapidly. Selecting a vendor with a good track record, demonstrated financial stability, and a non-proprietary format will position your organization to change tools in the future if necessary, and to adapt to other tools used in your organization.

One of the major challenges may be cultural, not technological. Content authors are used to working in specific areas creating one type of content. The concept of sharing content effectively between seemingly disparate areas (e.g., marketing and technical publications) or across media (e.g., paper and web) without compromising quality and usability may be difficult for authors to accept.

Addressing the issues of organizational change from the beginning of a project can ensure a smooth implementation. Looking at the lessons learned in industries that faced similar challenges, such as the software industry’s move to reusable software, can assist you in meeting the challenge.

References

1. Doe, John. Midnight at the OASIS. Dubai, UAE: Standardista Press, 2010.



GANESH VEDNERE ([email protected]) works with a global financial services company based out of New York, and is experienced in implementing enterprise wide content and records management programs. He has been involved in all aspects of content and records management programs including program strategy, policy and procedures, compliance research, and program implementation. He has over 15 years of industry experience in various business and technology verticals. Ganesh has authored a number of articles related to information management in leading local and international journals. He is the recipient of AIIM International’s 2009 Distinguished Service Award.

17 How to Get Users to Use an Information Management System

Managing Information: From Reluctance to Acceptance

Ganesh Vednere

Abstract

Many organizations have deployed, or are in the process of developing, enterprise level information management programs. These information management programs address a variety of business, operational, legal, and regulatory requirements, and are meant to enhance and augment existing business processes and better manage the flow and storage of information across the enterprise. However, several of these organizations are finding that the end user adoption and usage rates of these information management systems have been woefully low. End users either do not use the information management systems as often as intended, or simply bypass the information management system altogether. This results in inefficiencies around information management, negative return on investment, and poses potential legal and business risks to the organization. What can be done to increase the success of the program with the end users? How can the implementation team increase the level of interest and guarantee a wide scale acceptance of the system?

The measurement of a successful information management program is based, very simply, on the number of users who use it as part of their day to day business activities. But getting to a large user base is not trivial, and requires a great deal of thought and planning, coupled with a solid understanding of business processes and—most importantly—how users interact with them. This article provides key items for information managers to consider when planning and deploying their own programs.


Background


Organizations are increasingly paying significant attention to the management of information. Information has not only business, operational, and functional characteristics; it is also deemed a critical component in successful litigation and discovery processes. Information management is the science—and in some cases the “art”—of identifying, gathering, categorizing, managing, securing, and distributing information. With the ever increasing amount of information that is produced and stored in organizational systems, information management is becoming critical to business. The ability to search, locate, and produce information in a meaningful form is key to a successful business operation.

The Skills Needed to Perform This Best Practice

• Information Management Program Sponsor:
  - Sponsors information management initiatives within the organization;
  - Is typically a C-level executive within the organization;
  - Takes executive responsibility for return on project funding.
• Program Advisory Committee:
  - Is made up of senior level officers within the organization;
  - Reviews program status and monitors the overall direction and status of the program;
  - Provides executive level guidance to the program.
• Program Manager:
  - Establishes the program and governance structure;
  - Establishes the overall approach for the program;
  - Manages the execution of the information management program;
  - Takes responsibility for successful deployment;
  - Develops post-production support processes.
• Information Policy Manager and Analysts:
  - Develop and maintain information policies;
  - Develop procedures and guidelines;
  - Develop controls for policies and procedures;
  - Monitor compliance with policies.
• Technology Manager, Analysts, and Developers:
  - Select technology to support information management requirements;
  - Develop technology architecture and design artifacts;
  - Implement the design in the selected technology platform;
  - Ensure adequate system performance;
  - Provide ongoing support for system enhancements;
  - Provide ongoing end user support.
• Program Office:
  - Provides a central program management office;
  - Manages and monitors project plans;
  - Tracks budget and resource allocations;
  - Develops and coordinates issue and problem logs;
  - Handles program reporting;
  - Serves as point of contact for all program related needs.
• Legal:
  - Provides legal support;
  - Ensures the information management program is in compliance with applicable laws and regulations;
  - Supports development of information policy and procedures;
  - Manages eDiscovery and litigation matters.
• Business Lines:
  - Are the end users of the information management program;
  - Provide requirements to the program manager and technology teams;
  - Participate in end user acceptance testing;
  - Ensure business users are in compliance with policies;
  - Participate in information audits and eDiscovery processes.

Step 1: Meet the Information Management Challenge

Organizations across all types of industries and geographies are in the process of planning or implementing large scale enterprise level information management programs to manage their company’s information. In some cases, millions of dollars are being spent, large teams are being assembled, and complex technologies are being implemented. It is hoped that the system will be successful and be widely used across the company. However, the average industry statistics are sobering in this regard, and many users remain dissatisfied with their ECM implementations. One can only imagine the millions of dollars that are being spent on developing and implementing ECM systems, only to have them be rejected by the end users.

Some organizations expend a large amount of effort on creating information management approaches, current state analysis, future state design, and implementation and deployment strategies, but when it comes to end users, they get a cursory mention in the section titled “Training”. It is as if training will somehow convince everyone to use the system—somehow the end users will be so eager to use ECM that they will change their ways of working to conform to the new policies and system. As evidenced by multiple industry statistics, this is often not the case, and most users either completely push back on changing their current processes, or, after the initial trial period, end up not using the system at all.

Adoption and usage of Enterprise Information Management (ECM/EIM) systems is generally not “love at first sight”; rather, it is a gradual process that requires patience, determination, and persistence. During the implementation process, the organization plans and conducts many activities; it develops use cases to document the requirements and vets the system’s design to ensure that it conforms to industry standards. It conducts pilots to validate the system functionality, asks user focus groups to provide feedback on the usability aspects, and establishes a production support team to provide post implementation support.

The information management team develops processes and procedures to document the various steps necessary to secure company information. They put a lot of thought into developing the information management system’s requirements and design. Having planned for and accomplished so many milestones, it is hard to imagine that the biggest milestone may be lurking around the corner, and surprisingly, it may be the one that is the least planned for. This milestone is planning for adoption and usage of the ECM/EIM system. Most implementations involve the user community and generally solicit their opinions throughout the implementation cycle, so it may be taken for granted that users would widely use the system once it is in production.

Unlike other IT applications, however, information management systems are generally not driven by the lines of business—it is more likely that the application is being implemented directly by IT at the behest of the CIO, a governance committee, or operations, so a slightly different approach needs to be taken during the rollout process. In some cases, organizations apply the same approach to an information management system rollout as they would for other routine IT application rollouts. Unfortunately, while this approach may work fine from an IT standpoint, it may not always result in an optimal outcome from a business standpoint. One example that we came across recently was when certain users in an organization refused to use the information management system because they perceived it as a cumbersome, burdensome application that did not address their information needs or solve their business problems.

Step 2: Understand the Characteristics of Information Management Programs

While each information management implementation will vary in terms of the individual business problem being addressed, the types of technologies leveraged, and the teams being used to deploy the solution, there are certain key characteristics that are common to all information management systems.

Take Stock of the Situation

Typically, information management programs are just one of many corporate initiatives going on. Usually an organization has many conflicting priorities, the result being that some departments may not consider the information management program as part of their core business goals. This means that not all business units and their management are going to be very committed to the success of the information management program. Additionally, the perceived benefits of correctly managing information vary from business unit to business unit, so the outcome of the information management program does not appear very tangible to users.

Understand the Reality

Information management programs are built on top of existing platforms, frameworks, systems, and business processes. This means that no matter how good the program strategy is, the implementation is contingent upon the practical realities on the ground. For example, the implementation strategy may call for the use of a services model to federate systems together, but if existing systems do not support services or make it costly to do so, the implementation teams have limited choices, and may be forced to search for less attractive alternatives—or worse, not be able to integrate at all.

Make a Case for IM

Measuring return on investment (ROI) for information management programs is not very straightforward. There are a number of hard and soft factors that determine the ROI, and the typical time frame before tangible benefits appear can be as long as several years. At the same time, management expects a relatively quick ROI. All of this makes demonstrating information management’s benefits to senior management very challenging. In fact, most information management programs end up having to justify their existence at each annual budgeting discussion.

Know What You Are Buying

There are no “silver bullets” when it comes to technology solutions. While impressive strides have been made in packaged out of the box solutions, most commercial-grade information management platforms require significant customization and configuration before they can become a viable solution for an organization’s unique needs. Organizations cannot simply expect to buy a packaged solution to meet their information management needs; instead, they must go through a long technology implementation cycle before the solution is ready to be deployed.

Take Time to Assess

Every information management strategy needs to be based upon a deep understanding of the company’s business processes and their underlying steps. Information managers need to understand the information inputs and outputs of processes: how they are generated, stored, retrieved, managed, and disposed of. This makes information programs all the more complex, as it takes a significant amount of time to assess and understand business processes, what systems are involved, and how they interact to manage information.

Anticipate Change

Another key aspect of information management programs is that each implementation faces an ever changing business, regulatory, and technical landscape; information is changing at the same time that controls are being implemented to manage it. New regulations require new ways of securing information. The pace of this change has increased exponentially in the past few years, and information experts are already predicting a doubling (if not more) of the amount of information produced and used in a few short years.

Given these realities, how is it that mature technologies like ECM have challenges with user adoption? Is this an issue with the technology, the strategy, or the implementation approach? The answer lies with the user. In the end, no matter how sophisticated an information management system is, it is only good if it is actually used. Therein lies the challenge: how to get users to use a system they find no particular incentive to use. Let us look at some of the key factors that create reluctance and user adoption issues.

Step 3: Examine the Reluctance Factors

Low usage is the bane of information management implementations. There are multiple factors that create end user reluctance or apathy, and end up affecting the usability of the system. This foot-dragging on the part of end users presents a formidable challenge for information managers to resolve. This section addresses some of the key causes for low usage of IM systems.

User Perception

Some business users assume that managing information is someone else’s problem—they cannot be bothered with keeping track of and following an ECM/EIM system’s activities and associated steps. More often than not, this perception is stronger in organizations that have had no prior content or information management programs, and that suddenly implement controls within their business processes. Many users already have to deal with disparate IT applications, navigate various approval steps, and generate an assortment of documentation in order to complete a single business process. Adding another system to the mix, especially one that is not perceived as being core to the process, will certainly not be welcomed with open arms. While companies expect that providing training is the way to obtain user buy-in, the reality is that end user training can end up being little more than a “check-the-box” exercise. Clearly, this does not help with usage and system adoption. Users are trained on how to use the system, but not on how to incorporate the information management procedures and the system into their day to day business processes and ways of working.

Organizational Culture

Most large organizations work within lines that are drawn between departments and regions. Information is managed in silos, and infrequently shared. System developers would rather add additional columns to their database than write a web service to pull existing information from another division’s system. While some organizations have taken commendable steps to focus communication and training around the importance of information sharing, the reality is that many business processes still operate without basic information management tenets. In addition, many senior managers are wary of sharing too much information with other divisions, for a variety of organizational and political reasons. The result is an organizational culture that implicitly does not promote information sharing or consistent reuse of content.

Lack of Focus on Usability

Usability can make or break the entire implementation program. While the user interface may appear to be very intuitive to the design team, the average user may have a very different opinion. Traditionally, information systems and their underlying technologies have been focused on core information management functions—ingesting, collating, organizing, and securing information, applying policies, and disposing of information at the end of its retention period. One only has to look at some of the applications available in the marketplace to understand that many are not the most business user friendly applications out of the box. The primary focus has always been on meeting information management standards, not creating an intuitive, easy-to-use product for the end users. In some cases, the actual end user interface has been left to be customized by the customers, while budget constraints often make it difficult to spend significant amounts of time on designing a best in class user interface. This results in a system that meets the information management requirements, but falls woefully short on the usability aspect. Given the initial reluctance of users to use yet another system, the lack of an intuitive user interface makes it that much harder to promote the system.

Disparate Systems

Some implementations of information management are mostly standalone—the system does not necessarily interface or integrate with other business applications. This means that there is no automated bridge between the business applications that produce the information and the information management system that manages that information. The end result is that users are forced to take extra steps to promote the information from the source system to the information repository. Users may also be asked to re-enter metadata that was already available in the source system, but could not be auto-populated by the destination system. This means that the information may have lost its original context, and now requires a manual step to recreate it. With some business applications, the actual information is not easy to extract, and even more steps are required to move it to the information repository and link it back to the source system. If the link from the source system to the information repository does not work or is not available, users will simply start to create copies of the original information and keep them on a local drive or file share for easier access, which defeats the very purpose of a centralized information management system.
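Even a thin integration layer can remove the re-keying step. The Python sketch below, with invented field names, shows the general idea of mapping a source system’s fields onto the repository’s metadata so that context travels with the content; a real bridge would call both systems’ APIs rather than work on plain dictionaries:

    # Hypothetical field mapping from a business application's record to
    # the information repository's metadata.
    FIELD_MAP = {
        "acct_no": "AccountNumber",
        "doc_type": "DocumentType",
        "created_by": "Author",
    }

    def to_repository_metadata(source_record):
        # Carry source-system context into the repository automatically,
        # so users never have to re-key it.
        metadata = {repo_key: source_record[src_key]
                    for src_key, repo_key in FIELD_MAP.items()
                    if src_key in source_record}
        # Keep a link back to the source so the original context survives.
        metadata["SourceSystemId"] = source_record.get("id")
        return metadata

    print(to_repository_metadata(
        {"id": "42", "acct_no": "A-1001", "doc_type": "statement", "created_by": "jdoe"}))

The back-link field matters as much as the mapping itself: it is what lets the repository point users to the original record instead of encouraging local copies.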

Inadequate Post Production Support

One of the most important aspects of a successful information management system is the solid support provided to end users. This includes not only system support, but also business, legal, and compliance support. There are always users who do not know—or do not want to know—the principles of a good information management approach. It is therefore critical that a strong foundation of post production support be provided once users start to use the system. Lack of budget, limited staff, and limited management commitment make it very hard for information managers to provide the level of support necessary; it is not surprising to hear of two or three information managers supporting thousands of end users. Also, information management is typically not part of the normal help desk repertoire; users need to call or e-mail the team separately to obtain support. This lack of dedicated support results in user apathy, as users do not get the fast turnaround they need.

Frequent Changes

With the ever changing business and regulatory environment, the advent of newer technologies such as those of Web 2.0 (social networks, software as a service, etc.), changing laws and regulations, and the increasing pressure on organizations to be 100% compliant, we live and work in a very challenging environment for information management systems. Just think: as an industry, we are just now catching on to automating the management of e-mail, and in the meantime users in our organizations are already installing wikis, blogging, and building social networks. At some point, companies will need to evaluate all of these formats for useful information, research their storage requirements, and manage them appropriately. The pace of business, coupled with constant change, means that no information management system is going to be able to keep up with the demands of managing all of the information. Additionally, some users may resort to making up their own rules and self-governing their information, which is not always the optimal scenario.

Step 4: Move from Reluctance to Agreement

Transforming user reluctance into acceptance is not easy, but the path can be eased by considering the following factors:

Maximize User Involvement

It is really important to involve the system's end users at every step of the information program: from developing scope and requirements, to laying out policies and procedures, to instituting program governance, to system design, development, and testing. Users also need to be involved in developing the post production processes. User involvement can take various shapes and forms. Start by determining a core set of users, perhaps across the various departments that will be using the system, who will act as a "reality check" during the implementation lifecycle. This core set of users will also provide guidance on all aspects of the system: how it should function, what interfaces need to be developed, and what tasks need to be automated. In essence, end users provide key input, not just during the requirements phase, but also during the design and development phases. With the right level of involvement, users have a vested interest in the success of the program.

In some organizations, users may be unwilling, or may not have the time, to support a reasonable level of involvement with the program. It then becomes incumbent upon the program managers and sponsors to obtain the necessary resource and time commitment from both senior and middle level management of the various lines of business and departments. It may also be worthwhile to establish some ground rules around user involvement; for example, that each user will spend a set number of hours per week supporting the program. If support is still not forthcoming, involvement from a "C" level person or other senior managers may be required to push the business units into providing resource and time commitments.

Given the extended timeframe for implementation, the program managers must also ensure that these users remain committed for the duration of the implementation; otherwise, once the initial period of meetings is over, users may slowly start to drift away, miss important milestones, and eventually stop providing the requisite level of support. In order to mitigate some of these concerns, program managers should focus on what exactly is needed from each user. Instead of holding routine (and, dare we say, boring) weekly meetings, plan on involving the users in specific and manageable tasks. Do not expect users to just give you the data you need; instead plan on providing templates, questionnaires, and design options, holding facilitated sessions, and involving their management. At the end of the day, a little recognition of their valuable contribution goes a long way in getting their long term support.

Control Scope

While this may appear to contradict the earlier point about ensuring that user needs are met, it is still critical for the implementation team to manage scope right from the beginning. A typical EIM program implementation starts off with a somewhat less than solid scope; after all, there are rarely any boundaries established in the initial stages around information types, media, formats, locations, and so on. In fact, the first step usually is to assess the information landscape and then figure out all of the details around implementation steps, milestones, and timelines. When the overall scope of the program does get established, the implementation team must ensure that the scope is well managed and that the appropriate change management processes are instituted and followed. Controlling scope does not mean that one should cut corners and eliminate items that may impact timelines. Rather, program managers should keep scope creep in the back of their minds whenever discussions on features and functions are being conducted. Once again, involve the users in discussions on changes in scope, and help them to understand the costs and benefits involved in putting in or taking out specific features.

Establish Robust Post Production Support

There is nothing that will make or break the information management system as much as post production support, or the lack thereof. Post production support is not just setting up a help desk number to route user queries; you should conduct a detailed analysis of the types of support requirements. This includes an understanding of the types of questions posed by users: those related to business, workflows, systems, user interface, training, defects, enhancements, and so on. It is best to have a dedicated phone number or e-mail mailbox, augmented with FAQs, how-tos, and system documentation posted on a readily accessible website. In addition to supporting the technical aspects of the system, the production support team should be ready to respond to non-technical questions such as:

• How do I store a particular document through a business workflow?
• Can I destroy this expired contract?
• I got an e-mail approval from another agency. How do I store it?

As defects are found or enhancements requested, it is important to provide users with a clear understanding of how these will be addressed. For example, if a user requests an enhancement to a particular screen, he should be able to understand how the item will be prioritized, how the cost/benefit analysis will be conducted, who the deciding authority is on go/no-go for the item, and the timeline for developing it in the system. A good enhancement and change management model should include a select group of senior users who participate in the enhancement prioritization process. After the system goes into production, there is generally a spike in "one-off" user requests, which may range from the trivial ('Can we make the font larger?') to the major ('My business process needs to be integrated into the system workflow.'). The implementation team must make appropriate provisions, such as dedicating additional resources to the production support team in the beginning, to provide adequate service levels.

Another important aspect of post production support is user communication. Almost all systems need to be brought down for routine maintenance activities and software or hardware upgrades. Wherever possible, establish a "downtime schedule" at least two to three months in advance so that users know when the system will be down. It is also helpful to communicate any workarounds, if available. Users must also understand the potential impact of any disaster recovery timeframes on their business processes: if it takes six hours for the ECM/EIM system to come back up, what is the impact, if any, on business transactions during that period? These items must be documented and approved by the business and technology leadership to avoid any confusion when such an event does occur.

Plan for Changes but Minimize End User Impact

Do not spook the users with frequent application releases, constantly updated procedures, or constant program changes. Establish an effective change management process that reviews every change affecting the end users, and attempt to minimize user impact; for example, by bundling several system changes into one production push. Once again, an effective support team will provide assistance to the users and minimize the inevitable frustration that appears when processes or systems are changed. Users frequently complain that they get little, and in some cases no, notice of system changes, and do not get a chance to evaluate how the changes impact them.

In addition to system changes, there are also updates required to information management policies and procedures. Typically these updates are done on an annual basis, or more frequently if there are new regulations or laws that impact how the organization manages its information. As part of each update, a comprehensive assessment should be conducted of how the update impacts the underlying business processes, procedures, and systems. Simply posting policy updates on the company's intranet is not the best way to ensure that everyone understands the updates or their impact.

Step 5: Move from Agreement to Acceptance

The next step in the user adoption cycle is to get to a state where users accept the system, understand its features, functions, and limitations, and start to adopt it. Acceptance of the system can be facilitated by the following key factors:

Institute a Governance Model

In order to manage the roles and responsibilities of all parties in the information management program, whether business, legal, compliance, or IT, you need to develop a good governance model. It behooves the information managers to focus on each of the operational aspects of the program and develop clear-cut governance activities and procedures. These include policies for the creation and maintenance of information, permissible formats, access and security controls, search, e-discovery, and information disposition. The governance model should incorporate information managers within the lines of business as well. As an example, the organization could implement a governance model where a corporate level centralized program team establishes policies and procedures, while localized information managers within the various divisions and departments are responsible for implementing those policies and procedures within their respective areas.

The governance model should also involve participation from mid and senior level management, including a matrixed relationship to the compliance, risk, and IT oversight committees. The thought here is to reinforce that information management is not just an IT responsibility; rather, everyone within the organization must come together to collectively solve the information puzzle. By having representatives from each of the lines of business, we can also ensure that the program will get support from all areas within the organization. In addition, these senior management points of contact act as escalation points when required.

Focus on Usability

Right from the start of the program, focus on usability and developing good user interfaces. Information management packages generally provide an extensive library of widgets with customization capabilities. It is essential to spend a sufficient amount of time thinking about and designing an interface that appeals to the end users. No amount of bells and whistles in the core information management functions is enough if the user experience is lacking. Even when there is a limited budget, information managers must address usability to the best of their capabilities. It helps to develop a focus group of users who review and provide feedback on the usability of the system. In general, the system should minimize the number of "clicks" a user needs to perform in order to store anything in the information repository.

Another aspect of usability is giving due consideration to automating as many of the information management workflows as possible. For example, wherever appropriate, get users to use templates when creating documents, and enforce this policy across the board. By ensuring your documents' consistency, it may become feasible to organize, classify, and structure them in an automated fashion. Provide naming conventions and standardized metadata taxonomies for records so that storing, retrieving, and retaining them becomes that much easier. During the information analysis, figure out which areas are good candidates for automation, and then develop cost/benefit analyses to aid in the planning process. A good example of this is the scanning of paper information. Some systems require users to perform several steps: loading the paper, creating batches, entering metadata, doing quality control, importing batches into the repository, and moving information from scan queues to appropriate folders. You might be able to use currently available products with intelligent automation to reduce steps and minimize the effort of scanning.
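As a small illustration of convention-driven automation, the sketch below (Python) validates a file name against a naming convention and derives repository metadata from it. The convention itself, DEPT_DOCTYPE_YYYY-MM-DD_Title.pdf, is purely hypothetical.

    import re

    # Hypothetical house convention: DEPT_DOCTYPE_YYYY-MM-DD_Title.pdf
    NAME_PATTERN = re.compile(
        r"^(?P<dept>[A-Z]{2,5})_(?P<doctype>[A-Z]+)_"
        r"(?P<date>\d{4}-\d{2}-\d{2})_(?P<title>.+)\.pdf$"
    )

    def derive_metadata(filename: str) -> dict:
        """Check a file name against the convention and turn it into
        metadata, so classification and retention can be automated."""
        match = NAME_PATTERN.match(filename)
        if not match:
            raise ValueError(f"{filename!r} does not follow the naming convention")
        return match.groupdict()

    print(derive_metadata("FIN_INVOICE_2010-06-30_Q2-print-services.pdf"))
    # {'dept': 'FIN', 'doctype': 'INVOICE', 'date': '2010-06-30',
    #  'title': 'Q2-print-services'}

The design choice to reject non-conforming names, rather than guess, is what keeps downstream classification trustworthy.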

System stability, ease of use, and optimal performance are the hallmarks of a good information system. Any system, whether an information management system or otherwise, will not win users over if it is slow and cumbersome to use. Ideally the technology team will have planned an adequate amount of server, storage, and network capacity, but it does not hurt to conduct periodic system performance evaluations and implement corrective measures as appropriate. Validating ongoing system performance is even more important when the program is implemented in a phased approach and more and more users are brought on incrementally.

Integrate and Federate Systems

Where feasible, review the integration of the information management system with other corporate business and IT applications to minimize the number of user actions required to move information into the information repository. This is much easier said than done. Integrating disparate systems is no easy task, and most organizations that have tried enterprise level federated architecture models have not had much success. Part of the problem is that there is no standardization of data models, application platforms, or development widgets, and attempting to suddenly tie them together becomes very difficult. However, organizations are beginning to realize the practical benefits of a services or federation model in specific areas of information management. Enterprise search is one such example: the vendor market is rapidly maturing, and many organizations have started implementing enterprise level search tools to enable information search for litigation or business requests. As more vendors include a federation model that is practical, it will be increasingly possible to manage information as-is, rather than extracting and importing it into the information repository.

As new technologies appear in the marketplace and existing technologies mature, organizations that have not already done so must start on the path of information federation. There is a lot of groundwork that needs to happen before vendors come into the picture, and it behooves information managers to analyze and understand what systems can be federated, what the impact on business processes will be, and how information federation will change existing ways of working and business operations. There are common taxonomies to be developed, metadata to be determined, and detailed systems analysis to be conducted. Organizations must make haste and start identifying candidates for federation.
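The fan-out at the heart of a federated model can be sketched in a few lines. In this hypothetical Python example, each system to be federated is represented by an adapter function that takes a query and returns a list of result dictionaries; the adapters themselves would have to be written per system.

    from concurrent.futures import ThreadPoolExecutor

    def federated_search(query: str, adapters: dict) -> list:
        """Send one query to every federated system in parallel and merge
        the results, tagging each hit with the system it came from."""
        merged = []
        with ThreadPoolExecutor() as pool:
            futures = {pool.submit(search, query): name
                       for name, search in adapters.items()}
            for future, name in futures.items():
                for hit in future.result():
                    hit["source_system"] = name
                    merged.append(hit)
        # Naive relevance merge; real federation needs score normalization.
        return sorted(merged, key=lambda hit: hit.get("score", 0), reverse=True)

Note that the information stays in place; only the query and the result list travel, which is exactly the "manage information as-is" idea described above.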

Develop Simpler Information Management Procedures

Most organizations have business procedures documenting the steps that need to be undertaken to complete a business function. It is highly recommended that you update these procedures with specific steps to address information management. If procedures do not exist or are not easily available, information managers should consider creating simple-to-use guidelines for the end users. This may include a cheat sheet or a summary guide covering key and commonly used functions or shortcuts. You can distribute this documentation during brown bags, presentations, and town hall events.

Information management procedures that are straightforward and easy to follow are more likely to be adopted as a common practice; users will not adopt complex procedures that involve multiple steps and force them to go to different systems or update data on multiple screens. After all, the basis of a sound procedure model is to ensure that all information is searchable, locatable, and, most importantly, producible in a format and manner that is operationally efficient and effective. In order for this to happen, procedures must be practical and understandable by the average non-technical user. A typical information management procedure must address the following key points:

• How will the information be created? (Do we use a template? Is there a standard format?)
• What metadata is required to categorize this information?
• What is the format for the information?
• Where will the information be stored and secured?
• How will the information be retrieved?
• How and when will the information be disposed of?

These points form part of the information management workflow that should be incorporated within the various business procedures and process steps.
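One way to keep such a procedure practical is to capture its key points in machine-readable form, so that systems can enforce what the written procedure says. The sketch below (Python, with entirely hypothetical values) expresses a contract procedure as a policy object plus a simple completeness check.

    # Hypothetical policy answering the key points above for one record type.
    CONTRACT_POLICY = {
        "created_from": "contract_template_v3",            # how is it created?
        "required_metadata": ["counterparty", "owner", "effective_date"],
        "format": "PDF/A",                                 # long-term format
        "store_in": "contracts_repository",                # where stored/secured?
        "access": ["legal", "procurement"],                # who may retrieve it?
        "retention_years": 7,                              # when disposed of?
        "disposition": "secure_delete_after_review",
    }

    def missing_metadata(document_metadata: dict, policy: dict) -> list:
        """Return the required metadata fields the document still lacks."""
        return [field for field in policy["required_metadata"]
                if field not in document_metadata]

    print(missing_metadata({"counterparty": "ABC Corp"}, CONTRACT_POLICY))
    # ['owner', 'effective_date']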

Summary and Conclusion

Enterprise Information Management systems need not be "yet another system we are forced to use". ECM/EIM programs are generally driven by technology rather than by the lines of business, and hence there is a certain resistance to full-scale adoption of the system. Proper up-front planning, keeping the focus on the end user, and providing an adequate level of support go a long way in turning user reluctance into acceptance, and getting the user to say the elusive "I Do".

From Acceptance to Partnership: Making Your Information Management Program Successful

Once users have accepted the system, have started to incorporate it into their day to day processes, and there is no longer any need to "force feed" information management, the program should move to a collaborative partnership model. Users are no longer an afterthought; they have in fact become valuable partners with a vested interest and stake in the success of the information management program. Here are some key considerations for information managers in making the partnership model a success story:

Invest in Program Planning and Setup

Information management programs are seldom quick hits; they take money, time, and resources to be successful. As organizations look to curtail costs and reduce overhead, justifying large scale information programs can be hugely challenging. At the same time, organizations continue to spend significant amounts of money on searching for and retrieving information. If information managers are able to demonstrate cost savings, a good case can be made to fund the information management program. This is a key part of the overall planning process. Information managers must balance organizational needs against the organizational budget. It is all well and good to think in terms of industry best practices and other such metrics, but if the organization is unwilling or unable to spend large sums of money on information management, then information managers must step back, focus on key risk items, and expend their energies on those items alone. There is no point in "boiling the ocean" and attempting to solve each and every information management problem within the organization. Keep the following points in mind as you plan your information management program:

• Go slow but steady: the implementation cycle should rarely take a big bang approach; rather, a methodical rollout is called for. This gives both the implementation team and the users a chance to identify, review, and fix issues before they turn into problems.
• Focus on the big picture: provide a big picture view of the program to users so that they appreciate the why, how, and what of the program. Help the users to champion the program on their own and to appreciate the benefits of an information management system for their day to day activities.
• Conduct pilots: leverage smaller pilots in selected areas to validate the implementation approach and to use as a sounding board for feedback from end users and all other parties involved (e.g. Legal, Compliance, Audit, IT).

Plan for the Future... Now!

Information management programs are notorious for quickly outliving the controls they are based upon. Technology cycles are fast-paced, and the even faster pace at which information workers adopt new technologies makes it imperative that any information management implementation provide the appropriate level of flexibility to augment the existing infrastructure and to add controls and components as the demands of the program grow. Ultimately it is the level of openness and flexibility, not the quality of the design or the sophistication of the technology architecture, that determines how long an information program lasts within the organization. In every case technology gets outdated, systems grow obsolete, business processes get updated, and people and organizations change. What does not change is that the underlying information has to be managed.

An information system that has been built with an eye to the future serves the organization well in the long term. An example of this long term planning would be to include social information. Given the focus on social collaboration and user-driven computing, it is essential to plan and develop controls to manage information in these channels. Many of these social collaboration utilities are maintained outside of the company's firewalls, and therefore information managers must think about how to secure this information through a combination of policies, procedures, and information controls. This may include restricting access to certain collaboration tools, hosting collaboration tools within the company, monitoring and tracking unauthorized usage, developing strict content guidelines, and establishing approval steps.

Another example is to start thinking about federation of information and how it will impact future information management. It is a given that there will be a variety of new and updated information channels in the future. It is also a given that one cannot possibly build controls or processes to manage each of these channels. Hence it is good to start thinking about federation of information to address this uncertainty. There are multiple ways to achieve federation: adopting a services model, implementing an enterprise information backbone, using third party tools to provide federated functionality, and so on. Even though there is not yet a simple and low cost way to do federation, the organization must initiate discussions around federation and common information frameworks.

Secure Executive Management Support

Obtaining executive management support is vital to the success of any information management program. Some program implementations rely too heavily on having just one or two executives support the program, but what happens if such an executive leaves the organization or gets reassigned? The entire program is at risk, as it is quite possible that new leadership will not support the existing program structure and plans. It is therefore crucial that information managers obtain some level of support from a wide variety of executive managers and C-level individuals. This support should come from the business, legal, compliance, operations, finance, and technology teams. This model acts as a safety net during organizational changes, re-organizations, mergers, and divestitures. Having multiple voices supporting the program and showing an interest in its completion goes a long way in ensuring the long term survival of the information management program.

In addition to support from senior management, make contacts, establish relationships, and demonstrate how you can add value to other levels of management, especially middle management. In a way, information management programs are all about getting people to do what you want them to do (at least in terms of managing information). Use broad based support to drive home the message of proper information governance. Leverage management to push the compliance message where appropriate. Reinforce the message by getting management, especially senior management, to send out periodic communiqués to employees. There is nothing like a C-level executive asking employees to follow information management policies; previously insurmountable obstacles disappear, roadblocks remove themselves on their own, the naysayers suddenly become supporters, and so on.


Focus on Organizational Transformation

Changing the culture of an organization is neither easy nor trivial; it takes relentless focus and perseverance to make even a small dent. Yet changing organizational culture is paramount to the success of any information management program. The important thing is to get senior management and large numbers of users to appreciate the benefits of good information practices. Once this is done the path is somewhat eased, as everyone realizes it is in their interest for the program to succeed. How do you achieve organizational transformation? It takes time, patience, and strong internal and external support. If the information managers have access to individuals within the organization who have expertise in organizational transformation, then those individuals should be leveraged. Quite often, though, it makes sense to engage outside experts to establish the framework for managing the transformation. While setting up policies, procedures, and processes is certainly a good step, do not rely upon these rules to transform people's behavior overnight. Using a structured approach to cultural transformation, with the right level of communication and training, goes a long way in smoothing the path.

“Over” Communicate

There is no such thing as over-communicating in an information management program. Here are some key communication-related points:

• Provide an adequate level of communication across all levels of the organization. Prepping the users far in advance allows adequate time for management and the user community to implement any internal changes required as part of the program rollout. Communication should include references to all of the individual pieces of the program: policies, procedures, systems, training, deployment, and production support.
• Use training and facilitated sessions as a way to have open and frank discussions with users on how best to manage information. Help users to understand the benefits to them and to the organization. Do not make training a one way monologue; use it as a way to showcase the program and emphasize that the program cannot be successful without their support and participation.
• Leverage users from the pilot and early implementations to convince other users. Establish user focus groups that act as your local sounding board within the business units. These focus groups provide a platform for users to voice their opinions, concerns, and issues to the information management team, while taking lessons, best practices, tips, and tricks back to their respective teams.
• Employ feedback tools such as formal and informal surveys, a user feedback mailbox, comments via the website, etc. to gauge the level of interest (or angst) and understanding of the program's goals and objectives. Move quickly to address the concerns that these surveys uncover; that way users feel their opinions are being heard and acted upon.
• Organize periodic events such as an “Information Day” to drive home the importance of sound information management practices. These events are a low cost, high value way to broaden the program's reach to all groups within the organization.
• During deployment planning, think about how to train the consultants and contractors that have been hired into the organization on various business and technology projects. Since most of these positions are temporary, plan the on-boarding process so that these contractors understand information management practices. When their contracts are completed, the de-boarding process ensures that all of their documents and records are handed off to their respective managers. This is also true for employees who may transition from one role to another or transfer to a different department within the organization.

Monitor Your Progress

In addition to communications, users must be aware of the consequences of not following the information policies and procedures. It is not uncommon for organizations to terminate employees who have seriously violated company information policies, especially those involving security (for example, sending confidential documents to personal e-mail addresses). While a “carrot and stick” approach is not always appreciated, it does help to make the point that the program is a must for the health of the organization, and that every employee is therefore expected to follow the relevant processes. The following are some ways to monitor your progress:

• Develop a business unit information management compliance plan. Develop a plan for each business unit that addresses how the business unit will use the system, and how it will adhere to and comply with the information policies. The plan should include audits, monitoring, and tracking of any issues found. A periodic compliance report should be sent to each of the business unit managers and executive committees to ensure that there is adequate senior management traction.
• Monitor locally and act locally. Implementing an audit for an information management program in a global setting can be tremendously challenging. It is best to plan, act, and implement the audit program locally. Each region has its own unique business, legal, regulatory, and cultural nuances, and it is very difficult to standardize and institute a common information audit structure in a global setting.
• Conduct spot checks. Conduct informal spot checks to see how well users are complying with information management procedures and whether they are using the systems as intended. Document any issues that are discovered, and work with users and their line managers to mitigate them.
• Monitor cost savings. Track the cost savings and other benefits that result from the information management program (a rough sizing sketch follows this list). These statistics provide a great way to bolster the standing of the program with senior management and highlight the payback.
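For the cost savings point, a back-of-the-envelope calculation is often enough to start the conversation with senior management. Every figure in the sketch below (Python) is a hypothetical placeholder to be replaced with measured numbers from your own program.

    # Hypothetical inputs; substitute measured values from your program.
    users = 2000                 # employees using the system
    minutes_saved_per_day = 10   # search time recovered per user per day
    loaded_hourly_rate = 60.0    # fully loaded cost per employee hour
    working_days_per_year = 220

    annual_saving = (users * (minutes_saved_per_day / 60)
                     * loaded_hourly_rate * working_days_per_year)
    print(f"Estimated annual saving: {annual_saving:,.0f}")
    # Estimated annual saving: 4,400,000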


MARTIN WHITE ([email protected]) is Managing Director of Intranet Focus Ltd. and consults on the design and management of intranets, the development of enterprise information management strategies, and the implementation of content management and enterprise search software. He has been a Visiting Professor at the Department of Information Studies, University of Sheffield since 2002 and is also a member of the Publications Board of the Royal Society of Chemistry. Martin is a Fellow of the Royal Society of Chemistry and an Honorary Fellow of the Chartered Institute of Library and Information Professionals. He is the author of the Eureka column on enterprise search issues for EContent magazine, for which he is a Contributing Editor. He was Chairman of the Online Information Conference from 1999 to 2006 and is the author of ‘The Content Management Handbook’ and ‘Making Search Work’ and (with Stephen Arnold) ‘Successful Enterprise Search Management’. He has worked for clients in the USA, UK, Europe and the Middle East.

Gathering Information Requirements

Using semi-structured ethnographic interviews to determine user information requirements

18 Martin White


Abstract


Before any solutions can be found for the problems caused by poor information management, the nature and scale of the problems must be identified. A great deal has been written about the importance of carrying out user research when developing any technology-based application, but there is a significant lack of recommendations about how this user research can be carried out when the need is to establish how people find, use, and share information.

The solution proposed here is based on classic ethnographic interview techniques developed by sociologists and anthropologists to look at the way that people behave. A set of generic information discovery queries will be used as a framework for the interviews, to avoid the problems associated with asking users either what resources they use (which provides no information on how these resources are used) or asking in a very general way how they go about finding information.


Background


Over the last ten years a wide range of organizations have requested help with writing intranet strategies. It is always a concern that organizations that have strategies and policies for everything from pensions to smoking in the office have failed to see the importance of developing a strategy for an application that is available on every desktop and contains information that people need probably every work day.

Without a strategy, the usual approach has been to add every possible piece of information that the organization possesses to the intranet and then assume that, since all the information is contained there, all user requirements have been met. However, organizations have often found that intranet usage, as evidenced by page hits or some such metric, does not increase. At this point, companies call in a consultant to diagnose and fix the problem.

Under these circumstances, there are actually many problems that need to be fixed. First of all, the number of hits on a page is no indication of the value of the page. Second, providing information generated internally is only part of the requirement, as staff also need access to a wide range of external information resources. The third issue, and the subject of this paper, is that staff do not go page-hunting. They are usually on a journey of information discovery, where one piece of information leads them to the next. In the course of an average day, employees spend considerable amounts of time just looking for information.

There have been a number of studies on the amount of time spent searching for information, notably by Susan Feldman (1) and her team at International Data Corporation (2). Also, in late 2006 a survey was carried out by Accenture (3) that showed that middle managers spend more than a quarter of their time searching for information necessary to their jobs, and, when they do find it, it is often wrong. Managers spend up to two hours a day searching for information, and more than 50 percent of the information they obtain has no value to them.

Probably the first study that put a price on the failure to find information was carried out in early 2008 by Capgemini (4). The main finding of this study was that a broken information culture (that is, the values and behaviors associated with how people collect, use, manage, and share information) is endemic in the UK and is believed to suppress performance by an average of 29%, with 63% of respondents making decisions at least daily without access to the best available information. If this loss of performance could be overcome with good information systems and cultures, the economic gain for the UK could be $140 billion, an amount that might have allowed escape from the recession in 2009.

Overall, broad-spectrum all-staff surveys provide little in the way of useful feedback, because the questions have to be very generic. When targeted at a small group of users they can be of value, but first the small groups have to be identified. Another common approach is to interview roughly 50 members of staff and from these interviews scale up the results to be representative of perhaps 50,000. The level of trust that can be placed in the outcomes is usually very low, especially where the interviewers have had little in the way of training and guidance.

The Skills Needed to Perform This Best Practice

To carry out an ethnographic interview, the main requirement is a strong understanding of the business of the organization, so that the outcomes of individual interviews can be placed in the context of the organization and particularly its objectives. The questions asked must be open and not biased by the background of the interviewer. Ideally, interview notes should be taken by an assistant, so the interviewer can concentrate on the questions and responses.

Step 1: Choose Users

I discovered ethnography accidentally, while developing an intranet strategy for the International Monetary Fund in 2001, when one of the senior economists presented me with a copy of Inside the IMF (5), in which the author used an ethnographic approach to describe how the IMF went about its daily business. Still, one of the best illustrations of the power of ethnography is given in The Challenger Launch Decision (6), which shows, beyond any doubt, how the wrong decision can be made if the information base on which it is made is incomplete or flawed.

Experience conducting many hundreds of interviews using some of the basic techniques of ethnography over the last decade has shown that it is difficult to stop interviewees from going off into anecdotal reflections on the quality of the management of the organization, rather than the quality of the information available. While working on an enterprise search project at the World Bank in Washington in 2009, there was a need to describe user requirements for the new search application. A set of generic information queries was created by the author to assist the interview team. While developing these generic queries the author realized that they had a broader application than end-user research for search applications, and so from this work the four step process described below was developed.

The initial challenge is deciding which staff members to interview. First, one should talk to whoever in the organization is responsible for risk management. Many organizations publish their main risks in their annual reports, and many strong examples can be found in the USA, where a company must list its main business risks in its annual 10-K report. The example below is taken from a recent Xerox 10-K, and it lists just a few of the risks that are set out in the filing.

• “We face significant competition and our failure to compete successfully could adversely affect our results of operations and financial condition.”
• “If we fail to successfully develop new products and technologies, we may be unable to retain current customers and gain new customers and our revenues would be reduced.”
• “We have outsourced a significant portion of our overall worldwide manufacturing operations and face the risks associated with relying on third party manufacturers and external suppliers.”

In the internal documentation that is used to create this risk register, each risk is allocated to a specific manager, and so an initial interview list can be generated quickly.

At this point, one can explore with senior managers who in the organization is going to be making the decisions that will hopefully reduce these risks. The answer may well be “the Business Compliance Manager”, a role rather than a person. At this point it is likely to be worth developing a set of personas that sum up the characteristics of a group of employees making similar decisions, so that one or two representative interviewees can be found and the outcomes then validated against a second group. The use of personas in the development of public websites is well documented (7), but there is little guidance on developing personas for intranets and other internal applications, other than the excellent work by McQueen (8).

Step 2: Perform User Interviews

Whenever possible the interviews should be carried out at the desk of the interviewee, because the interviewee may need some visual stimulus to remember how particular decisions have been made. For example, “I usually check up on suppliers in this little blue book” reveals a dependence on a book that is now perhaps five years out of date and has been replaced by a subscription-based website that the interviewee does not have access to. Ethnography is a fieldwork methodology, and undertaking interviews in a conference room or project office never has the same feel and intensity.

After some introductions, one should present the interviewee with the set of twenty queries, asking them to select no more than five, and ideally just two or three, that represent either high-volume or high-criticality information queries. The term ‘query’ is used here quite deliberately, as opposed to ‘question’, because the query itself may have to be solved by a series of questions or, in the case of using a search engine, a series of searches.

Experience has shown that interviewees can quickly scan down the list and spot the types of queries that they deal with on a regular basis.

In this list ABC is used to denote the name of the organization:

1. What work is ABC doing on [...]?
2. What has happened to this project/topic since [date]?
3. What other documents are there similar to this one?
4. Who can I ask for advice on [...]?
5. Who is this person?
6. I need to find the membership of this [group] to see if there is anyone on it I know
7. What have been the outcomes of this project/initiative?
8. What are the procedures for [...]?
9. What are the key documents on [...]?
10. Where can I find the document on [...]?
11. What has ABC done in similar situations to [...]?
12. Which documents are related to this one?
13. I need to see/I do not want to see duplicate documents in other languages/other versions
14. I need to find documents which may be in multiple languages
15. I need to find all the documents on this topic for compliance reasons
16. I need to see information from both external and internal sources on this topic
17. I need to share information on [...] with my team
18. I want to be alerted when new information on this topic is added to the system
19. I want to restrict my query to a specific group of documents
20. I need the definitive document on [...]

For each query, one should then explore the user's behavior with follow-up questions such as:

• What prompts are needed to undertake the query: were you asked by someone else, or is it part of your standard job profile to carry out this query every month (for example)?
• What resources do you use, and why?
• How confident are you that you have found all the relevant information?
• How confident are you that the information you have found can be trusted?
• What do you do with the information you have found?

Make it clear at the outset that the interview will last no longer than 45 minutes and that it is off the record. If one needs to cite a particular interview in a report, one should always gain the written permission of the interviewee. At the 40 minute mark, summarize the outcomes and show the list of generic queries to the interviewee to see if any further queries now come to mind. The final wrap-up takes around ten minutes, and one should make it a point of principle to be finished at the 50 minute mark. One should always ask the interviewee whether one can come back to talk to them at a later stage in the process.

When writing up the interview, aim to take no more than one page, using bullet points rather than large amounts of text. Once all the interviews have been completed, scan through them and highlight keywords or concepts that occur on a fairly frequent basis.

Step 3: Analyse the Outcome

A quantitative approach to defining user requirements is not recommended, because undertaking a quantitative survey in an organization is fraught with too many problems. Rather than producing a data analysis, one's objective in this stage should be to capture the issues that, if solved, will have a significant impact on the operational effectiveness of the organization and the individual. Some examples follow:

1. People often wish to find the latest information, or alternatively information published between two given dates. This is an understandable requirement, but it leads immediately to the question of how the date of the piece of information is being defined. I have seen several global SharePoint implementations where there has been no recognition that the US and European date formats are different. It took many people in Europe some time to work out why the tragic events in the USA of 11 September 2001 are known as 9/11 and not 11/9. (A sketch of one way to defuse this ambiguity follows this list.)

2. Another frequent request is for a real-time alert system for information that is added to the system. Further probing often reveals this to be not an information request but a defense mechanism from someone who feels they have been sidelined by the organization. Another frequent outcome is that employees have no idea of the rate at which new information is added to the system. This results in a lack of understanding of the extent to which they might be deluged with information that they cannot find time to process.

3. Understanding the process of information discovery can also reveal the way that someone looks first at a list of (for example) projects and a brief description of each project before carrying out a search for project papers. Users might take this route because they are not sure of the spelling of the project name, or even whether it is a relevant project.
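One practical defense against the date ambiguity described in the first example is to normalize every date to ISO 8601 (YYYY-MM-DD) at ingestion time. The sketch below (Python) assumes the source system is known to use the US convention; an ambiguous value such as “03/04/2009” can only be resolved by knowing which convention the source follows.

    from datetime import datetime

    # Accepted input formats; an assumption based on knowing that the
    # source system writes US-style dates.
    ACCEPTED_FORMATS = ["%m/%d/%Y", "%Y-%m-%d"]

    def to_iso(date_string: str) -> str:
        """Normalize a date string to unambiguous ISO 8601 form."""
        for fmt in ACCEPTED_FORMATS:
            try:
                return datetime.strptime(date_string, fmt).date().isoformat()
            except ValueError:
                continue
        raise ValueError(f"Unrecognized date: {date_string!r}")

    print(to_iso("09/11/2001"))  # '2001-09-11' under the US convention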

In effect, one should be trying to build information discovery workflows showing where the information inputs and outputs occur and, just as important, what caused each branch in the workflow. Another element of the analysis is to consider what might be referred to as the information literacy of the interviewee. I have found that users typically fall into one of four levels of competence, though people tend to exhibit characteristics of more than one. These four levels are:

Known Item

The user wants to find a single and specific item and knows where to look for it, for example the 2008 Strategic Plan. Finding the 2007 Strategic Plan, or the strategic plans for all the subsidiary operations, is not useful, so using a search engine could be a frustrating exercise.


Learning/Exploratory

The user wants to find information that will help him to learn something, and therefore may have a poorly defined query. To continue with the case of the Strategic Plan, he has been asked to prepare a framework for the 2010 Strategic Plan, having just joined the Business Development department of the organization. Finding previous plans could be very helpful, as could any guidance notes on how to prepare not only strategic plans but any corporate document that has to go to the main board. New employees often fall into this category.

Parametric

The user knows the basic hierarchical structure of a repository and wants help stepping through it to find information. She might be looking for the business plan of the Turkish subsidiary. She would like to be able to go through a folder structure that works down from corporate and ends up at Turkey. Just finding the business plan for Turkey through a search may seem the obvious route, but (and this is a real case) the interviewee was not sure which regional operations area Turkey reported to. The folder hierarchy was useful in confirming this reporting structure and also in suggesting that it might be useful to look at the regional strategy as well as the input from Turkey.

Expert

The user is familiar with the subject area and is able to frame the query based on expertise. This is where information discovery can take some twists. In the case of the business plan query that has been used as a case study, the interviewee knew who would have written the business plan if they had been looking for the German equivalent, and so this user would have used the search engine to look for Gerlach and Plan and 2008.

Step 4: Make Use of the Results

All too often the solution for complaints of not being able to find information is to provide a search engine. However, search is not the ubiquitous discovery solution. For example, a hyperlink represents a contribution of knowledge to a web page. Hyperlinks are generally added manually, and the people adding these links are using expertise to relate two documents together. That expertise can be of immense value to others, especially in the Learning/Exploratory use case described above.

Just off the campus of Stanford University in Palo Alto is a large shopping mall with stores designed to cater to the needs of Silicon Valley's affluent population. You can reach the mall via El Camino Real, the Way of the King or the Royal Road. There is a branch of Tiffany's but no bookshop. Upon entering the mall there is a list of stores categorized into around a dozen headings. The mall's management decided to provide shoppers with no indication of the contents of each store, only a name and a location.

Looking for jewelry using such a system is not easy, especially for a person unfamiliar with the American approach to shopping, who will find the names of the stores unfamiliar. Just a few, such as Edwards Luggage, are helpfully tagged, but then Edwards Luggage also sells pens and executive novelties. The challenge is an order of magnitude greater if you are looking for a birthday present; there is no ‘Birthday Present Store.’ The best option is to talk to someone at one of the information desks in Macy's, Bloomingdale's, or Neiman Marcus and have a dialogue with a person who will ask a lot of questions in an attempt, usually successful, to point you in the right direction.

Going back to the IMF project mentioned above, the project team on this occasion did carry out an organization-wide survey, but the focus of the survey was on the issue of search, as there was a general view that it did not work well. The survey asked respondents to give one example of where the intranet had supported their work, and one where they had failed to find the information they needed. When the senior managers looked at the information discovery failures, time after time they found that they knew where the information was to be found. But these managers accepted that they only knew where this information was after some thirty years in the organization.

In October 2007, two disks containing information on 25 million UK citizens went missing from Her Majesty's Revenue and Customs (HMRC). The subsequent investigation proved highly embarrassing for HMRC and for the Government. There was a Department Security Standards Manual (DSSM) that set out procedures that would have prevented the loss of the information. To quote from the report on the situation by Kieran Poynter:

“The primary dissemination method for information security policy in HMRC is via its intranet. However, almost all interviewees contacted in my team's investigations expressed a lack of knowledge as to exactly where on the intranet, security policy is to be found. In addition, staff have noted that the intranet search function is unhelpful in generating relevant results for search terms such as ‘DSSM.’” (9, page 35)

Another outcome of this process is that the queries can be used as the basis for usability tests, which all too often are based on the queries that the design team develop amongst themselves.

Summary and Conclusion

My purpose in writing this paper is to highlight the complexities of how people go about finding information in an organization. Surveys that do no more than seek to prioritize whether news is more important than policies totally miss the point and can be misleading. The ethnographic approach outlined here is an attempt to get close to this complexity in a more structured way than a general interview allows. There could well be more than 20 generic queries, but for now the set outlined here will be used by the author on a regular basis. Furthermore, the author would be pleased if someone could build on the methodology described here and create a more rigorous analytic approach to the problem of defining just how we need to go about managing information to create business and personal benefits.

Many years ago I came across the following aphorism, though I now cannot remember where: “The fire of progress is lit by innovation, fueled by information and sustained by hope and hard work.”

Understanding the role that information plays in progress is absolutely business-critical.


References

1. Feldman, Susan. The High Cost of Not Finding Information. 1 Mar. 2004. Retrieved on 28 Sept. 2009. http://www.kmworld.com/articles/readarticle.aspx?articleid=9534.

2. IDC. The Hidden Costs of Information Work. 1 Mar. 2005. Retrieved on 28 Sept. 2009. http://www.interwoven.com.tw/documents/whitepapers/wp_idc_cdm_05.pdf.

3. Accenture. Managers Say the Majority of Information Obtained for Their Work Is Useless, Accenture Survey Finds. 4 Jan. 2007. Retrieved on 28 Sept. 2009. http://newsroom.accenture.com/article_display.cfm?article_id=4484.

4. Capgemini. Failure to Exploit Information Loses UK Economy £67 Billion a Year. 3 Mar. 2008. Retrieved on 28 Sept. 2009. http://www.uk.capgemini.com/news/pr/pr1605.

5. Harper, Richard. Inside the IMF. New York: Academic Press, 1998.

6. Vaughan, Diane. The Challenger Launch Decision. Chicago: University of Chicago Press, 1996.

7. Mulder, Steve and Ziv Yaar. The User Is Always Right. Berkeley, CA: New Riders, 2007.

8. McQueen, Howard. Using Persona Advocates to Develop User-Centric Intranets and Portals. 29 Oct. 2008. Retrieved on 24 Sept. 2009. http://www.mcq.com/article-persona.

9. Poynter, Kieran. Review of Information Security at HM Revenue and Customs: Final Report. 25 Jun. 2008. Retrieved on 25 Sept. 2009. http://www.hm-treasury.gov.uk/d/poynter_review250608.pdf.



BILL YOCK ([email protected]) is an Associate Vice Provost in the Office of Information Management at the University of Washington. He leads the Enterprise Information Services division, which provides services for information intelligence, delivery, design, integration, and definitions. Bill is a member of the UW Data Management Committee and the Privacy Assurance and Systems Security Council. He is currently the chair of the Kuali Foundation's Technology Roadmap Committee and a board member of the Kuali Rice Project.

Creating a Constitutional Democracy for Data Management

Governing Data Management

19 Bill Yock


Abstract


The University of Washington (UW) has a large number of extremely complex information systems. These systems include not only large administrative ERP business systems (Finance, HR, Payroll, etc.), but also a wide spectrum of educational and business-specific systems (Research Administration, Student Information Systems, Housing and Food Services, etc.). The “Institutional Data” within these systems is a strategic asset of the University, and appropriate governance for the management and use of the data is critical to the University's operations. Inappropriate governance can result in serious inefficiencies and risk exposures. A consistent, repeatable, and sustainable approach to governance over Institutional Data is necessary to deal with the data management governance questions that inevitably arise: Who is responsible for determining what data is sensitive (confidential, restricted, etc.)? What rules determine who can have access to sensitive data? When there is disagreement over the meaning of data, how can it be reconciled? What guidelines should exist to control copying and retaining data? Where do users go when they have questions or problems?

To deal with these complex issues the UW has created a “constitutional democracy” approach to governance. A policy document serves as our base “constitution”, setting forth principles, definitions, and responsibilities. Guidance documents serve as “amendments” to the constitution, which help authorities interpret issues and make decisions. Data Trustees serve as an “executive branch”, with authority to administer and enforce policies and guidelines. Data Custodians are the “judicial branch”, organized according to the specific domains of data under their jurisdiction (Student Admissions and Enrollments, Research Proposals, etc.). A Data Management Committee (DMC) serves as the “legislative” branch, helping to write new laws and lobby for change in existing practices. The numerous information management personnel across the university are the “citizens” responsible for following the laws and administering data management best practices. By following a constitutional democracy approach, the UW “data management republic” is working in a more cohesive and streamlined fashion than ever before. The approach has proven to be repeatable and sustainable.


Background


The Data Management Committee highlighted in this best practice was initiated at the same time the University of Washington established the Office of Information Management under the direction of the Provost. The formation of these new governance bodies was proposed in a special report issued in December of 2006. The Future of Information Systems Task Force Report, and recent information about the Office of Information Management, are available at http://www.washington.edu/provost/oim.

The Skills Needed to Perform This Best Practice

To follow our method you will need these sorts of people:

• A Sponsor: Someone in an executive position to give you the mandate to carry out the process.
• A Champion: Someone well placed in your organization who promotes your initiative. This is a critical role.
• A Manager: Someone to drive the process forward and manage the details.
• A Governing Group: On a day-to-day basis, there are many decisions to be made that impact diverse groups within your organization. We found that a governing group with representatives from across the university worked well to provide the context, brain power, and authority to make and promote our decisions.

Step 1: Foment Revolution

Find Revolutionary Leaders

The major administrative information systems at UW are primarily 20- to 30-year-old mainframe COBOL-based systems, which are batch and transaction oriented. It is not easy to access the data for analytical and reporting purposes. This lack of easy access resulted in many units downloading data into their own "shadow systems". The situation led to anarchy among data management professionals, with everyone interpreting the data according to their own rules and definitions. Reports coming out of central offices seldom balanced with reports generated by academic units. By 2005, frustration and resentment had reached a fever pitch.

In response to this anarchy, Mike Eisenberg, Dean Emeritus of the Information School at UW, helped bring together the Deans, Administrators, and Computing Directors of the academic units into a coalition called the IT Resource Sharing Group. Under Mike's strong influence and persuasion, this group met on a regular basis to discuss common needs and solutions. The group chartered several important studies that provided a strategic direction for advancing information system improvements. The Future of Information Systems Task Force Report called for sweeping reforms and new governance. In late 2006, under the direction of Provost Phyllis Wise, a new Office of Information Management was established along with a new Data Management Committee (DMC). The DMC was launched under the responsibility of Sara Gomez, CIO of the Office of Information Management (OIM).

Destabilize the Old Regime

One of the primary symptoms of the need for a revolution in data governance was dysfunction within the Enterprise Data Warehouse (EDW) program. That program had been operating for several years with substantial barriers to widespread adoption and success. These barriers consisted of typical data management challenges, such as a lack of access control and a lack of data definitions. Many well-meaning individuals met on a regular basis in a Data Management User Group (DMUG) to try to resolve these issues, but it was clear that they lacked authority and accountability. The EDW often served as a bottleneck to progress rather than generating solutions. For example, when EDW managers denied certain users access to sensitive data in the warehouse (at the strong request of other DMUG members who felt responsibility over that data), they were caught in the middle of disputes over access. EDW managers would attend DMUG meetings and challenge the assumptions that members made based on personal opinions about data management best practices. With the formation of the DMC, the DMUG was disbanded and replaced with a forum to officially set data management rules, raise and resolve difficult issues, and enforce the decisions that were made.

In addition to the EDW program within OIM, many other central administrative offices and units (Human Resources, Payroll, Planning & Budgeting, Institutional Studies, Financial Management, Deans' Offices, etc.) work with Institutional Data. They too faced the same issues that confronted the EDW. Many of the leaders of these units were recruited to serve as committee members on the DMC. They have become an active "congress" in establishing rules and guidance on data governance. They have also become cornerstones of the task forces created by the DMC to tackle tough problems.

Declare a New Regime

The legitimacy of the Data Management Committee was established by a Charge Letter from the Provost to each committee member. Many important projects and programs at UW are initiated by charge letters from the Provost; this letter highlighted the urgency and importance of establishing data management and business process best practices. The appointment of Mike Eisenberg as chairman established the credibility that the DMC needed to succeed. His influence and political respect among the President, Provost, Vice Provosts, Vice Presidents, Deans, and Chancellors enabled the fledgling democracy to take root. The revolution had begun.

Step 2: Establish a Constitution

Create a Preamble and Bill of Rights

Early in the formation of the DMC, it became apparent that we needed to articulate a new set of laws that would provide a solid foundation of principles and best practices. I researched the websites of many other universities to gather input on how they organized and defined data management practices. While each had unique methods and processes, all had common "policy essentials", which have become our "Bill of Rights" of data management best practices. These policy essentials boil down to a few common themes:

• Value: Data and information are important assets.
• Risk: Data and information need to be protected.
• Quality: Data and information need to be improved continually.
• Availability: Data and information need to be accessible.
• Accountability: Someone is responsible for data and information.

All of these themes matched UW's needs. Building upon the themes found on other websites, I constructed a draft list of policy statement essentials. The DMC members then reviewed, deliberated, and refined these essentials to reflect the UW's unique challenges. The DMC met on a bi-weekly basis, and over the course of several meetings we worked on each of these together. This resulted in the current set of UW policy essentials, which are:

• Institutional Data is the property of the University of Washington and shall be managed as a key asset. – Institutional Data will be managed through defined governance guidelines, standards, policies and procedures.
• Unnecessary duplication of Institutional Data is discouraged. – Data Custodians (defined below) shall be responsible for sharing Institutional Data out of official Systems of Record, when reasonable and according to policies and procedures, so as to minimize redundant storage and processing of that data in multiple repositories. Exceptions are allowed for purposes of business continuity and fail-over.
• Unnecessary maintenance of Institutional Data is discouraged. – When the same Institutional Data elements exist in multiple Systems of Record, the official values must be kept synchronized. When feasible, manual updates of Institutional Data should be performed in one System of Record and then automatically copied to as few other additional data repositories as possible. This eliminates redundant processing, increases integrity, and provides better auditing capabilities.
• Quality standards for Institutional Data shall be defined and monitored. – Data quality standards shall be defined, published, communicated, managed, and applied according to the reliability and risk levels established by appropriate Data Custodians (defined below). Examples of data quality standards include: data validation rules, timeliness of updates, defined error rates, integrity monitoring processes, etc.
• Institutional Data shall be protected. – Institutional Data must be safeguarded and protected according to approved security, privacy and compliance guidelines, laws, and regulations established by the University of Washington, the State, or the Federal government.
• Institutional Data shall be accessible according to defined needs and roles. – Institutional Data and Metadata shall be accessible to all, in accordance with defined access and use policies and procedures determined by the Data Management Committee and Data Custodians (defined below). Users requesting access shall be assigned to appropriate roles that have clearly documented guidelines in accordance with all UW, State, and Federal laws and regulations.
• Institutional Metadata shall be recorded, managed, and utilized. – Metadata will be used to model, define, and organize data in order to maximize the value of Institutional Data. Institutional Metadata will be published and communicated clearly and consistently.
• Institutional representatives will be held accountable to their roles and responsibilities. – Roles and responsibilities for data management will be clearly defined, and individuals assigned to specific roles will be held accountable for performing data management responsibilities as a part of their regular job responsibilities.

These policy essentials were relatively easy for the DMC members to agree to; they speak directly to common pains around value, quality, and the other themes. The language is purposely authoritative, using terms like "shall be". These policy essentials form the first eight amendments to our constitution, providing clear, irrefutable principles.
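The "unnecessary maintenance" essential, in particular, implies a concrete integration pattern: perform manual updates in one System of Record, then propagate the official values automatically. Below is a minimal sketch of that pattern; the class, method, and variable names (SystemOfRecord, update, the toy in-memory warehouse) are hypothetical illustrations, not the UW's actual implementation.

```python
# A minimal sketch of "update once, propagate automatically".
# All names here are hypothetical, invented for illustration.

from dataclasses import dataclass, field


@dataclass
class SystemOfRecord:
    """Holds the official values for a set of Institutional Data elements."""
    name: str
    official_values: dict = field(default_factory=dict)
    subscribers: list = field(default_factory=list)  # downstream repositories

    def update(self, element: str, value) -> None:
        # Manual updates happen here, and only here...
        self.official_values[element] = value
        # ...and are then copied automatically to each downstream repository,
        # keeping values synchronized without redundant manual entry.
        for copy_to in self.subscribers:
            copy_to(element, value)


# Usage: a hypothetical warehouse subscribing to a payroll System of Record.
warehouse: dict = {}
payroll = SystemOfRecord("Payroll", subscribers=[warehouse.__setitem__])
payroll.update("employee_count", 25_000)
assert warehouse["employee_count"] == 25_000
```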

Articles of Definition

While the policy essentials were relatively easy for DMC members to agree on and establish, the context in which the policy essentials are applied became more challenging. One key question which emerged is: "What data needs to be managed?" In order to provide the needed context, the DMC started work on the "Definitions" section of the policy document. These too were developed in an iterative fashion. We devoted a portion of each meeting to reviewing and redrafting the language, but this section evolved over a much longer period of time than the essentials. The current definitions are:

• Institutional Data: "Institutional Data" is defined as all data elements which are created, received, maintained, or transmitted by the University of Washington. Institutional Data can be contained in any form, including but not limited to documents, databases, spreadsheets, email and websites; represented in any form, including but not limited to letters, numbers, words, pictures, sounds, symbols, or any combination thereof; communicated in any form, including but not limited to handwriting, printing, photocopying, photographing, and web publishing; and recorded upon any form, including but not limited to papers, maps, films, prints, discs, drives, memory sticks and other computing devices.
• Institutional Metadata: "Institutional Metadata" is additional data collected, maintained, and used which generally describes and defines the processes around the management of Institutional Data. Examples of Institutional Metadata include:
  - Definitions regarding the purpose, usage and context of Institutional Data.
  - Identification of which system is the official system of record of Institutional Data.
  - Who is responsible for management of Institutional Data.
  - How Institutional Data is transferred, derived, and stored.
  - What security and privacy practices are used to safeguard Institutional Data.
  - Risk and compliance classifications for Institutional Data.
  - Rules around retention of records and data.
• System of Record: A "System of Record" is an information system that is designated by Data Custodians (defined below) as holding official values of Institutional Data. Official values are the data designated as the most accurate representation of the meaning and context of Institutional Data elements, which are recorded as facts. Official values are not necessarily the originally entered values, and as such, a System of Record may not necessarily be the system where values are originally entered. When questions arise over the meaning or interpretation of data elements or their values, the System of Record is used to resolve discrepancies.
• Institutional Data Warehouse Repositories: Institutional Data is often copied from Systems of Record into other systems, for ease of access or to facilitate historical storage of data. UW's Enterprise Data Warehouse is an example of this type of additional data repository. Since Systems of Record may purge Institutional Data on defined schedules, these types of additional data repositories contain Institutional Data that survives beyond what the System of Record contains. They then become a System of Record holding official values of Institutional Data. These additional Institutional Data Warehouse Repositories are designated by Data Custodians (defined below) and are subject to the same policies and procedures that govern the use of Institutional Data.
• Departmental/Unit/Local Data Repositories: Various UW departments or units copy Institutional Data from Systems of Record into their own departmental, unit, or local data repositories. Any Departmental/Unit/Local Data Repository that contains a copy of Institutional Data is subject to the same policies and procedures which govern the use of Institutional Data.
• Subject Area Domains: Institutional Data is classified according to specific high-level "Subject Area Domains" for the purpose of assigning accountability and responsibilities over that data. The Subject Area Domains are defined and maintained by the Data Management Committee (defined below). The Data Management Committee refers to this classification as the UW Data Map and maintains a copy of it on its website (1). Examples of high-level Subject Area Domains are Human Resources, Academics, Financial Resources, Alumni and Development, etc. The UW Data Map also further defines specific Business Domains within each Subject Area Domain. Examples of Business Domains within the Academics Subject Area Domain are Curriculum and Courses, Financial Aid, Applications, Admissions and Enrollments, Transcripts, Degrees and Awards, etc.

These articles of definition provide important context for agreeing to principles around data management best practices and developing procedures and standards.
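To make the Institutional Metadata definition concrete, the following sketch models one metadata record whose fields mirror the example categories listed above (definition, system of record, responsibility, risk classification, retention rule). The field names and example values are hypothetical, not the UW's actual metadata schema.

```python
# A minimal sketch of an Institutional Metadata record. Field names and
# example values are hypothetical illustrations of the categories named
# in the definitions above, not the UW's actual schema.

from dataclasses import dataclass


@dataclass
class InstitutionalMetadata:
    element: str                 # the Institutional Data element described
    definition: str              # purpose, usage, and context
    system_of_record: str        # which system holds the official values
    responsible_custodian: str   # who is responsible for management
    risk_classification: str     # e.g., "confidential" or "restricted"
    retention_rule: str          # rules around retention of records and data


example = InstitutionalMetadata(
    element="student_headcount",
    definition="Count of enrolled students per program per quarter.",
    system_of_record="Student Information System",
    responsible_custodian="Registrar's Office",        # hypothetical assignment
    risk_classification="restricted",                  # hypothetical value
    retention_rule="Retain six years per records schedule.",  # hypothetical
)
```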

Map and Divide the Data Territory

Once the basic definitions were established, it became important to establish the scope of data to be managed. The Enterprise Data Warehouse program was a good place to look for a perspective on scope. The EDW had started creating a "taxonomy" of data subject area domains which we called the UW Data Map. This taxonomy has become a useful tool in organizing systems and data and in identifying accountabilities. It is purposely a high-level conceptual perspective of major business functionalities at a university, not just a reflection of current organizational departments or functional systems. Figure 1 is a high-level conceptual visualization of the UW Data Map.

Figure 1: High-Level Data Map

The UW Data Map is also being used to help organize the development of the Enterprise Data Warehouse. The Master Data section in the middle represents an interesting "integration" challenge to data management. The data elements in this section are data points that are common across many of the domains, and are used in a variety of systems. Coordinating and managing this data is challenging, in that there are multiple viewpoints and interests in the data usage.

The current UW Data Map represents "administrative" data domains only. The DMC intends to help create other slices in the data map pie for other domains as needed. Data maps for "clinical" data domains and "academic" data domains have been discussed. Starting with a good high-level conceptual domain model for these other disciplines will facilitate quick identification of appropriate Data Custodians and Data Trustees for the data.

The DMC collaborated on refining the UW Data Map by filling in the major business processes and descriptions for the subject areas. Finding the right hierarchy and level of detail was a challenge, so we set some ground rules to keep it manageable. Initially we settled on seven major subject areas (Master Data, Academics, Research, Financial Resources, Human Resources, Alumni and Development, and Services and Resources). Within each subject area, we attempted to divide into only 10 or 12 major business processes. The Services and Resources subject area was particularly challenging, given the wide variety of business services and systems it represents; we deliberated on whether or not it should have its own data map taxonomy. For now, it remains part of the overall "administrative" taxonomy, but in the future we may decide to create a separate data map for it.
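Because the Data Map is a simple two-level taxonomy, it can be pictured in a few lines of code. The sketch below is illustrative only: it uses the seven subject areas and the Academics and Research business domains named in this chapter, while the variable name uw_data_map is hypothetical and the empty lists are placeholders rather than a claim that those subject areas have no business domains.

```python
# A minimal sketch of the UW Data Map as a two-level taxonomy: subject
# area domains mapped to business domains. Only examples named in the
# text are filled in; the rest of the map would follow the same shape.

uw_data_map: dict[str, list[str]] = {
    "Master Data": [],  # shared elements integrated across the domains
    "Academics": [
        "Curriculum and Courses",
        "Financial Aid",
        "Applications, Admissions and Enrollments",
        "Transcripts, Degrees and Awards",
    ],
    "Research": [
        "Funded / Sponsored Research",
        "Departmental Research",
        "Compliance",
        "Impacts / Benefits / Partnerships",
    ],
    "Financial Resources": [],
    "Human Resources": [],
    "Alumni and Development": [],
    "Services and Resources": [],
}

# The ground rule of only 10-12 business processes per subject area
# keeps the taxonomy manageable.
assert all(len(domains) <= 12 for domains in uw_data_map.values())
```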

Separate Power

With a base UW Data Map taxonomy defined, the DMC turned its attention to identifying who the appropriate Data Custodians and Data Trustees might be. The DMC "unofficially" assigned persons according to their job functions. The process for debating and assigning the people was similar to the process for creating the policy essentials and definitions; it was a regular agenda item in recurring meetings. Table 1 is an example of the UW Data Map and Data Custodian and Data Trustee assignment list for the "Research" subject area domain. Note that anywhere from one to four Data Custodians were identified for each subject area and business process domain.


Table 1: Sample Data Custodian and Trustee List
Research – Administrative Data related to research activities

Business Domain: Funded / Sponsored Research
Description / Business Processes: Grants, contracts, awards, cooperative agreements, equipment & resource sharing agreements, etc.
Data Custodians: Associate Vice Provost for Research Compliance & Operations; Associate Vice President, Financial Management
Data Trustees: VP Research; Senior Vice President, Finance and Facilities

Business Domain: Departmental Research
Description / Business Processes: Material transfers, Dean / Unit support, internal funding
Data Custodians: Associate Vice Provost for Research Compliance & Operations
Data Trustees: VP Research

Business Domain: Compliance
Description / Business Processes: Research / faculty effort certification, regulatory requirements, ethical conduct, sponsor imposed requirements, etc.
Data Custodians: Associate Vice Provost for Research Compliance & Operations; Associate Vice President, Financial Management; Associate Vice President/Chief Compliance Officer, UW Medicine; Director, Academic Human Resources
Data Trustees: VP Research; Senior Vice President, Finance and Facilities; CEO UW Medicine; VP Academic Personnel

Business Domain: Impacts / Benefits / Partnerships
Description / Business Processes: Outcomes and value generated from research activities including intellectual property, patents, technology transfer / licensing, global partnerships, interdisciplinary partnerships, inter-institutional partnerships, community service, clinical studies, publications, citation frequency, recognitions and honors, consortiums, recognized centers and institutes, etc.
Data Custodians: Director of Finance and Business Operations, UW TechTransfer; Associate Vice Provost for Research Compliance & Operations; Director of Regulatory Guidance, SOM; Associate Vice Provost for Research
Data Trustees: VP UW Technology Transfer; Director of Finance and Business Operations, UW TechTransfer; VP Research; Vice Dean, Research and Graduate Education, School of Medicine

Business Domain: Research Expertise / Staffing
Description / Business Processes: Areas of established and developing knowledge, funding qualifications, donor marketing, program assignment and collaboration, retention and recruiting, External Affairs
Data Custodians: Associate VP, Advancement Services, Development & Alumni Relations
Data Trustees: Vice President for Development & Alumni Relations

Business Domain: Equipment & Instrumentation
Description / Business Processes: Inventory management, title / ownership, facility sharing, transfers, repurposed fabrication, useful life expectancy, etc.
Data Custodians: Associate Controller, Financial Management; Manager of Program Operations, Financial Accounting / Equipment Inventory
Data Trustees: AVP Controller

Business Domain: Space & Facilities
Description / Business Processes: Research and laboratory space availability, types (furnished, not furnished), occupancy rates, redundant (shared) facilities, etc.
Data Custodians: Director, HS Academic Services & Facilities; Director, Capital & Space Planning Office
Data Trustees: VP Planning & Budgeting

Step 3: Set Up the Government

The preliminary assignments and anticipated responsibilities were not announced to the majority of these individuals; however, some were enlisted to help "role play" their new responsibilities. This allowed us to fine-tune the policies and procedures that the DMC was creating, an important part of starting up a new government. It is important to develop a strong relationship with "acting" Data Trustees and Data Custodians who are comfortable with an evolving constitution and with changing processes and procedures. These actors help to refine the policies and procedures and create training materials in advance of scaling out to all Data Custodians and Trustees. The roles and responsibilities that Data Trustees and Data Custodians are expected to perform are highlighted in Figure 2. This figure represents a typical strategic, tactical, and operational division. Notice that the Data Custodians are the bridge between strategic Data Trustees and the various operational players managing and using data in systems.

Figure 2: Data Management Governance Structure


Most Data Custodians have a direct reporting relationship to the Data Trustees; however, this is not an absolute requirement, and some Data Custodians in one business process area may be Data Trustees in another. Overall, there are approximately 25 Data Trustees and 50 Data Custodians.

Create an Executive Branch

Data Trustees are executive officers of the University who are responsible for business rules around data management. They are also responsible for appointing the judiciary: the Data Custodians.

Data Trustees

Data Trustees are institutional officers (Vice Presidents, Vice Provosts, Deans, Chancellors, etc.), appointed by the President or Provost, who have authority over policies and procedures regarding business definitions of data, and the access and usage of that data, within their delegations of authority. Each Data Trustee appoints Data Custodians for their specific Subject Area Domains. The CIO is a part of the executive branch of government and has additional responsibilities related to the technologies that manage data.

Vice Provost for Information Management / CIO

The Vice Provost of Information Management / Chief Information Officer (CIO) is appointed by the Provost, and is a Data Trustee over certain centralized Institutional Data Warehouse Repositories, like the Enterprise Data Warehouse. The CIO is responsible for setting and enforcing standards and guidelines for data management technologies and systems, covering: computing infrastructures, data processing performance, data delivery and integration, data architectures and structures, data quality, data domain definitions, metadata repositories, naming conventions and syntax, and access control mechanisms. Together the Data Trustees and the CIO make up the executive branch of Data Management governance, providing strategic leadership and direction and setting overall policy and priorities.

Create a Judicial Branch

Data Custodians are the subject matter experts who are most knowledgeable about the Institutional Data and the business processes. They have several responsibilities outlined in the policy document.

Data Custodians

Data Custodians are individuals assigned by and accountable to the Data Trustees. They help define, implement, and enforce data management policies and procedures within their specific Subject Area and Business Domains, as defined in the UW Data Map. For a list of domains and currently assigned Data Custodians, I refer to the UW Data Management Committee website. In conjunction with the Data Management Committee guidelines and procedures and the prescribed technology and systems standards and guidelines established by the CIO, Data Custodians are essential in performing several key data management functions, including:

• Identifying Systems of Record containing Institutional Data.
• Categorizing Institutional Data within Systems of Record according to security and privacy guidelines.
• Defining access, quality, and usage guidelines for Institutional Data.
• Reviewing and approving requests for access to Institutional Data.
• Documenting and maintaining Institutional Metadata.
• Educating and sharing best practices with other data management personnel.


Data Custodians shall be knowledgeable about, and work in accordance with, numerous data management policies and procedures across the University, including:

• Standards, policies and procedures established by the Privacy Assurance and System Security (PASS) Council of the University (2).
• Record retention and public disclosure rules and regulations as outlined on the Records Management website (3).
• Public records and Open Meeting standards, policies and procedures defined by the Office of External Affairs (4).
• Electronic Discovery and Disclosure Guidelines as outlined on the UW Attorney General's website (5).

Data Custodians have a broad set of responsibilities, and are the judiciary that helps to interpret and enforce existing policies and procedures. Since many Data Custodians are also DMC members, they are also active in helping to formulate new policies.

The Policy Implementation Committee (PIC) is a special tactical group defined to assist the judicial and legislative branches. It serves as a steering committee of the DMC, helping to document and coordinate policies and procedures. It triages requests and issues that are sent to the DMC and gets the appropriate Data Custodians involved when necessary. The committee is made up of five DMC members and is supported by an administrative assistant who helps document the issues.

Checklists and guidelines are being created by the DMC to help explain the workflow of resolving different types of issues. For example, a request for access to sensitive data in the Enterprise Data Warehouse often involves a number of steps:

• A PIC member is assigned to the request. The member contacts the requestor and clarifies what the business purpose is for using the data.
• The PIC member sends the request to the appropriate Data Custodian(s) for concurrence.
• If concurrence is granted, the PIC member notifies the appropriate system administrators to set up the requestor in the authorization system.
• The PIC member then sends the requestor an "Access and Use Agreement" for official signature.
• Once the system administrator has set up access, the PIC member sends the requestor a message informing them that access has been granted.

In the preceding scenario, the PIC members help implement requests, but the Data Custodians are the decision makers.
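That separation of duties can be made concrete in code. Below is a minimal sketch of the workflow just described; every function and parameter name (handle_access_request, provision_access, and so on) is hypothetical and invented for illustration. Only the sequence of steps comes from the text above.

```python
# A minimal sketch of the PIC-coordinated access-request workflow.
# All names are hypothetical; only the step sequence follows the text.

def handle_access_request(requestor, data_domain, pic_member, custodians,
                          provision_access, send_agreement, notify):
    """Walk one request for sensitive data through the documented steps."""
    # 1. The assigned PIC member clarifies the business purpose.
    purpose = pic_member.clarify_business_purpose(requestor)

    # 2. The request goes to the appropriate Data Custodian(s) for
    #    concurrence; the custodians, not the PIC, make the decision.
    if not all(c.concurs(requestor, data_domain, purpose) for c in custodians):
        return "denied"

    # 3. System administrators set the requestor up in the authorization system.
    provision_access(requestor, data_domain)

    # 4. The requestor signs an official "Access and Use Agreement".
    send_agreement(requestor)

    # 5. Once access is set up, the requestor is told it has been granted.
    notify(requestor, "access granted")
    return "granted"
```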

The DMC assists the citizens of the democracy by providing them a forum to bring forth issues and lobby for reform. The DMC has commissioned, and works in conjunction with, various task forces ("Cabinet Posts") which work on difficult issues. These task forces make recommendations for resolving issues, which result in "Guidance Documents" that are officially endorsed by the DMC and Data Trustees. These Guidance Documents serve as "amendments" to the overall "constitution" by supplementing the Policy and Procedure Documents.

Guidance documents contain summaries of issues or concerns, who is affected by the issue, rules and decisions made on the issue, and official concurrence signatures of appropriate Data Trustees (executive officers) and/or Data Custodians (judicial officers). As the number of guidance documents grows, they are building a body of "case law" and "official registers" of knowledge, helping all citizens of the democracy understand the rule of law. Some interesting examples of these guidance documents are documented in the next section.

Step 4: Engage the Citizenry

One of the remarkable things about a democracy is the engagement of its citizens in the pursuit of worthy causes. The activism around data management at UW is no exception. The DMC provides a forum for any citizen of UW to bring forward data management challenges and recommended solutions.

Encourage Popular Movements

One example of a citizen-activated movement involves access to financial data. Past guidelines for financial management reporting limited access to budget and expense data to a person's specific department or unit. With the evolving interdisciplinary nature of programs being developed across units, this guideline was preventing administrators from accessing the data needed to plan and monitor appropriately.

However, opening up access to financial data for cross-unit analysis also meant opening up access to budgets associated with discretionary and strategic purposes. Many stakeholders worried that certain researchers, faculty, and staff would engage in political arguments over the source and amount of funding they received. A group of people with a vested interest in changing the guideline therefore started an "outreach" program. They met with many stakeholders to discuss their concerns and explain their needs. They created a presentation and met with many Deans and Administrators, gathering feedback on the pros and cons of the recommended change in the guideline. Based on the feedback received, they developed a new guideline explaining the risks and benefits, and submitted it to the appropriate Data Trustees for official signature. The new guideline was accepted and officially posted on the DMC website, recording the decision to provide guidance for all in the future.

This same process—that of citizens performing their own outreach program—has occurred several times now. Each time the process is similar. The citizens petition the DMC to sanction an outreach program; they develop content and presentations under DMC review; they conduct outreach meetings and gather feedback; they report back to the DMC on a regular basis as they conduct the research; and they then draft a new “guidance document”. Once the DMC accepts the new guideline, it is presented to the appropriate Data Trustees. If accepted, it is signed, scanned, and posted on the DMC website as an official guideline.

Form Interest Groups

Another important way to engage citizens in improving our democracy is through establishing "interest group" task forces. The DMC usually has several active task forces working at the same time. These task forces work to define tools (checklists, glossaries, instructions, etc.) that improve data management procedures. They also conduct research and analysis and make recommendations on improving data management practices. For example, a task force was created to improve procedures for defining access and roles to sensitive data. This special interest task force contains representatives from units with particular interests in access controls (Information Security, Human Resources, Identity Management, etc.). Depending on the agenda of the task force meetings, other interested parties are invited to attend and contribute.

One significant constitutional amendment that this task force has produced is the "Roles and Access Matrix". The amendment provides definitions and instructions for end users and Data Custodians in handling requests for access to sensitive data. The task force created the matrix to define roles and privilege levels according to the subject area domains of the UW Data Map. The matrix is posted on the DMC website, providing instruction to users that request access. The users can review the matrix, identify the "role" they fall within, and submit a request to the DMC support email address for access.

To date, this request process is for access to data in the Enterprise Data Warehouse. The access control scheme represented in this matrix is fine-grained: the roles that have been established define access privileges to both columns (data fields) and rows (data records) across all the major subject area data domains. This fine-grained control system is actively being explored for scaling out to data sources other than the EDW. A current version of the Roles and Access Matrix can be found at https://www.washington.edu/provost/oim/dmc/accesss.html. All active task forces report on their progress on a regular basis at the DMC meetings.

Allow for "State" Governments

Some task forces have resulted in convening special governance and decision-making bodies. The Enterprise Reporting task force is a good example of this. It was initially created to answer the "Top 5 Questions" to which Deans and Chancellors wanted answers. Questions like "What is the student headcount for each of my programs?" seem straightforward, but can be very complex to answer. The task force found that this one seemingly straightforward question quickly led to many other questions, such as:

• Should students on leave or involved in foreign study be included?
• For interdisciplinary programs, should the student be counted once in each program?
• Should students that have not declared a major be included?
• Is it important to include non-matriculated students?

To understand the root issues that make it difficult to answer such seemingly simple questions, the task force formed a special Deans and Chancellors Advisory Group, which included a handful of Deans and Chancellors to advise the task force. Their input and direction led to the identification of the major questions and issues. While they acknowledged that some schools, colleges, and campuses have specific requirements to answer the question differently, the task force agreed that, whenever possible, each should follow the same rules and definitions so that the question is answered consistently. A process was set up to reach concurrence on the institutional definitions of the data elements in question, and new enterprise reports that use those established definitions were built. The resulting "institutional definitions" are posted on the DMC website for all analysts and report writers to use when building similar reports. This has increased the consistency and credibility of reports that in the past often showed seemingly contradictory results.
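The value of an institutional definition is that it settles questions like these once, in one place, so every report applies the same rule. The sketch below illustrates the idea with a hypothetical headcount definition; the specific inclusion choices shown (exclude students on leave and non-matriculated students, count interdisciplinary students once per program, include undeclared majors) are invented for illustration and are not the UW's actual decisions.

```python
# A minimal sketch of an "institutional definition" captured as code so
# that every report applies the same rule. The inclusion choices below
# are hypothetical, not the UW's actual decisions.

def counts_toward_headcount(student: dict) -> bool:
    """One shared definition of 'student headcount' for all reports."""
    if student.get("non_matriculated"):   # hypothetical rule: exclude
        return False
    if student.get("on_leave"):           # hypothetical rule: exclude
        return False
    return True


def headcount_by_program(students: list[dict]) -> dict[str, int]:
    """Count each student once per program, including interdisciplinary ones."""
    counts: dict[str, int] = {}
    for s in filter(counts_toward_headcount, students):
        # Hypothetical rule: undeclared students are counted under "undeclared".
        for program in s.get("programs", ["undeclared"]):
            counts[program] = counts.get(program, 0) + 1
    return counts
```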

Step 5: Institutionalize the New Form of Government

Turn Policy into Law

The DMC has been operating for nearly two years with working drafts of its policy document and guidelines and with acting Data Custodians. The policy essentials, definitions, and procedures have matured and worked well. In order to sustain the momentum into the future, the DMC is seeking formal approval of the policy document which, when ratified, will be included in the University's official Administrative Policy Statements repository. The University Technology Advisory Council (UTAC) is the current governance body that approves technology policy documents. Before submitting to the UTAC for official approval, the DMC performed an education and outreach program with key stakeholders. The first step in that outreach was presentation of the policy document to the Privacy Assurance and Systems Security (PASS) Council. The PASS Council has representation from a broad constituency across the university, including the Chief Information Security Officer, the Attorney General's Office, the Risk Management Office, the Medical Center CIO and Privacy Officer, Network Systems, Identity and Access Management, and the Office of Information Management. The PASS Council provided strong endorsement in November 2008.

The DMC also created an executive summary presentation that explained the importance of the Data Management Policy. Various DMC members are doing outreach work with Data Custodians and Data Trustees, to educate them and to get their endorsement of the policy. This is an important process that helps refine the UW Data Map and the list of Data Custodians and Trustees. It builds awareness and helps create baseline knowledge for all regarding the constitution. Once the Data Trustee and Data Custodian outreach is finished, the DMC will then submit the policy document to the UTAC for final endorsement. After the UTAC endorses the Data Management Policy, the DMC will be legitimized as an important governance body of the university.

Turn Law into Practice

In addition to formal approval of the policy document, the DMC is working to create a formal online educational program for Data Custodians. This training will be used to explain the fundamentals of the policy constitution, as well as the parliamentary procedures outlined by the DMC. It will be used to orient all citizens involved with data management best practices, but will primarily be targeted at Data Custodians, who have a large responsibility.

Let Democracy Flourish

Regular communication about the functions of the DMC is extremely important. Word is starting to spread about the DMC's work, and individuals are actively requesting an audience. The DMC's website contains important information, from meeting minutes and status reports on special task force progress, to special updates and highlight reports that must be prepared and delivered on a regular basis to executive audiences like Deans and Administrators.

Just as in a national democracy, it is important to rotate the officials who serve. Membership in the DMC and task forces is evaluated and changed on a regular basis. Careful consideration is given to creating a membership that is representative of both administrative and academic units. There is no official election and voting process; rather, current members discuss, suggest changes, and recruit new members.

Summary and Conclusion

One of the greatest benefits of the DMC and the data governance policies and procedures being put in place is that general "citizens" now have an official place to go to present their unique data management challenges and to ask for help. By having a policy document that serves as a constitution, the rule of law for data management is clearly defined. In the past, people raised issues about data management, but since there was no common ground or baseline of agreed-upon rules, the issues were never resolved and practices were inconsistent. Disagreements tended to bounce around from group to group, with nobody feeling empowered to act or make a decision. Now that accountabilities are clear and the procedures well defined, there is increased efficiency and consistency in dealing with the many challenges.


As with any great democracy, the problems that arise in the future will require that previous policies be reformed. Having a checks and balances approach to separating judicial, legislative, and executive powers is a wise architecture that the founders of the United States put in place. The data management governance structures put into place at UW followed a similar model, and are proving prudent in maintaining the appropriate balance of power while allowing democracy to flourish.

References

1. Data Management Committee. UW Data Map. Retrieved on 17 Aug. 2009. https://www.washington.edu/provost/oim/dmc/trustees.html.
2. Privacy Assurance and System Security (PASS) Council. Homepage. Retrieved on 17 Aug. 2009. http://www.washington.edu/computing/security/pass/.
3. Records Management Services. Retrieved on 17 Aug. 2009. http://www.washington.edu/admin/recmgt/index.php.
4. UW Office of External Affairs. Public Records and Open Meetings. Retrieved on 17 Aug. 2009. http://depts.washington.edu/pubrec/index.shtml.
5. UW Attorney General's Office. Electronic Discovery. Retrieved on 17 Aug. 2009. http://www.washington.edu/admin/ago/ediscovery.html.