
The Imperative for Open Altmetrics

Stacy Konkiel, Heather Piwowar, Jason Priem
Impactstory

Journal of Electronic Publishing

Volume 17, Issue 3: Metrics for Measuring Publishing Value: Alternative and Otherwise, Summer 2014

DOI: http://dx.doi.org/10.3998/3336451.0017.301

Licensed under a Creative Commons Attribution 3.0 license (http://creativecommons.org/licenses/by/3.0/)

Abstract

If scholarly communication is broken, how will we fix it? At Impactstory—a non-profit devoted to helping scholars gather and share evidence of their research impact by tracking online usage of scholarship via blogs, Wikipedia, Mendeley, and more—we believe that incentivizing web-native research via altmetrics is the place to start. In this article, we describe the current state of the art in altmetrics and its effects on publishing, share Impactstory’s plan to build an open infrastructure for altmetrics, and describe our company’s ethos and actions.

“Scholarly communication is broken.” We’ve heard this refrain for close to twenty years now, but what does it mean?

Academic publishing is still mostly a slow, arduous, and closed process. Researchers have little incentive to experiment with new forms of scholarly communication or make their research freely available at the speed of science, since they’re mainly recognized for publishing journal articles and books: a narrow, very traditional form of scholarly impact.

Most arguments attribute academic publishing’s problems to a system that benefits corporate interests or to perverse incentives for tenure and promotion. The solution? Open up research and update our incentive systems accordingly.

For too long now, academic publishing has relied on a closed infrastructure that was architected to serve commercial interests. Researchers who attempt to practice open science can find it difficult to get recognition for the impact of open access (OA) publications and research products beyond the journal article, products that include scientific software, data, and so on.

Some have already imagined a better future for scholarly communication, one where OA is the norm and a new, open infrastructure serves the diverse needs of scholars throughout the research lifecycle. The decoupled journal is slowly becoming a reality, [1] OA publications continue to gain market share, [2] and measuring the impact of a diverse set of scholarly outputs through altmetrics is becoming an increasingly common practice for scholars. [3]

We founded Impactstory with this future in mind. Impactstory (http://impactstory.org) is a non-profit, open source web application that helps researchers gather, understand, and share with others the impact of all their scholarly outputs. We believe that Impactstory and other services that serve scholarly communication are essential to the future of academia.


In this article, we’ll describe the current state of the art in altmetrics and its effects on publishing, share our plan to build an open infrastructure for altmetrics, and describe our company’s ethos and actions.

The current publishing ecosystem—and why it needs to be changed

Altmetrics—sometimes called “alternative metrics” and defined by Priem, Piwowar, & Hemminger as social media-based metrics for scholarly works [4]—are having a major effect on traditional scholarly publishing, but not for all of the reasons you might expect.

Traditional academic publishers are masters of vertical integration. Once a manuscript is submitted to a traditional journal for publication, that journal coordinates peer review, copy-edits, publishes, markets, manages copyright for, and provides scores of other services [5] for the published article.

In general, this system has done its job of publishing pay-to-read journals relatively well to date. But it has also resulted in a publishing ecosystem that can be harmful to scholars and the public [6]: toll access journals with exorbitant subscription fees (as the for-profit publishers seek to expand their ever-widening profit margins [7]), and journal impact factors being used as a proxy for the quality of a published article when evaluating scholars’ work (not the fault of the publishers, to be sure, but they nonetheless contribute to the problem by promoting and sustaining JIF hype).

What if we imagined a web-native publishing ecosystem that functioned in an open, networked manner, similar to how much research itself is conducted nowadays? What if we decoupled the services that many journals provide from the journal itself, and had scores of businesses that could provide many of the essential services that authors need—peer review, copy editing, marketing—with less overhead and greater transparency?

Such a system has the opportunity to foster a scholarly communication environment that benefits scholars and the public, freeing the literature via Open Access publishing, improving the literature through open and post-publication peer review, and understanding the literature’s impact through article-level metrics and altmetrics.

Luckily, that new system is in the process of being built. Every day, game-changing publishing services debut: Publons (https://publons.com/) and Rubriq (http://www.rubriq.com/), stand-alone peer-review services; [8] Annotum (http://annotum.org/) and PressForward (http://pressforward.org/), publishing platforms; Dryad (http://datadryad.org/) and Figshare (http://figshare.com/), data-sharing platforms; and Kudos (https://www.growkudos.com/), an article marketing service. And altmetrics services like Impactstory (https://impactstory.org/), Altmetric (http://www.altmetric.com/), PlumX (https://plu.mx/), and PLOS ALMs (http://article-level-metrics.plos.org/) are also starting to be widely adopted, by publishers and scholars alike.

The rise of altmetrics

Altmetrics are a solution to a problem that increasingly plagues scholars: even in situations where scholarship may be best served by publishing a dataset, blog post, or other web-native scholarly product, one’s own career is often better served by instead putting that effort into traditional article-writing. If we want to move to a more efficient, web-native science, we must make that dilemma disappear: what is good for scholarship must become good for the scholar. Instead of assessing only paper-native articles, books, and proceedings, we must build a new system where all types of scholarly products are evaluated and rewarded.

The key to this new reward system is altmetrics: a broad suite of online impact indicators that goes beyond traditional citations to measure impacts of diverse products, in diverse platforms, on diverse groups of people. [9] Altmetrics leverage the increasing centrality of the Web in scholarly communication, mining evidence of impact across a range of online tools and environments:

[Figure: examples of the online tools and environments mined for evidence of impact]

These and other altmetrics promise to bridge the gap between the potential of web-native scholarship and the limitations of the paper-native scholarly reward system. A growing body of research supports the validity and potential usefulness of altmetrics. [10] [11] [12] [13] Eventually, these new metrics may power not only research evaluation, but also web-native filtering and recommendation tools. [14] [15] [16]

However, this vision of efficient, altmetrics-powered, and web-native scholarship will not occur accidentally. It requires advocacy to promote the value of altmetrics and web-native scholarship, online tools to demonstrate the immediate value of altmetrics as an assessment approach today, and an open data infrastructure to support developers as they create a new, web-native scholarly ecosystem. This is where Impactstory comes in.

Impactstory

A vibrant, altmetrics-powered world of web-native scholarship requires early guidance and public infrastructure. The market is not providing this. Impactstory aims to fill that gap.

Impactstory is a mission-driven non-profit. We incorporated as such in 2012 because we recognized the need to keep altmetrics open. Freed from the need to turn a profit for stockholders, we’re able to focus on building a better product that meets users’ needs rather than one that aims toward profitability. To date, we’ve been funded by the Open Knowledge Foundation, JISC, the Alfred P. Sloan Foundation, and the National Science Foundation.


As a non-profit, we’re governed by a Board of Directors. We’ve been fortunate to have some of the best and brightest minds working in the areas of Open Science on our Board: Cameron Neylon (PLOS; http://cameronneylon.net/), Heather Joseph (SPARC; http://www.sparc.arl.org/about/staff/heather-joseph), John Wilbanks (Sage Open; http://del-fi.org/jtw), and Ethan White (University of Utah; http://whitelab.weecology.org/).

We’re far from the only altmetrics provider. The Public Library of Science’s Article-Level Metrics (ALMs) [17] webapp was the first altmetrics service to gain traction in 2009, backed by the non-profit’s push to reduce academia’s dependence on the journal impact factor. [18] Two years later, the first commercial altmetrics providers, Altmetric.com [19] and Plum Analytics, [20] were founded, and Impactstory also began operating under the name “Total-Impact.” Altmetrics and bibliometrics researchers have also created a number of apps over the years, including ScienceCard, [21] ReaderMeter, [22] and PaperCritic, [23] many of which have since been deprecated.

All services to date have provided unique insights into research impact. Some highlights include:

Impactstory: citations, downloads and page views, and altmetrics for a broad array of web-native research products, in a profile format designed to meet the needs of individual researchers.

PLOS ALMs: citations, downloads and page views, and altmetrics for all PLOS articles, relative to the performance of other articles in their corpus.

Altmetric.com: mentions of articles in mainstream media and policy documents, and a number of other metrics for publications, displayed via a platform designed to provide business intelligence primarily for institutions and publishers.

PlumX (powered by Plum Analytics): WorldCat holdings for books and institutional repository downloads and page views for all scholarly outputs, alongside other metrics for a variety of research outputs. Designed to give insights primarily to funders and institutions.

What sets Impactstory and Plum Analytics apart from most other providers is our aim to provide altmetrics for web-native research products beyond journal articles and preprints.

Why we’re building the altmetrics commons

Existing for-profit providers have approached altmetrics data as a commodity to be sold. This stance supports a relatively straightforward business model, and so is understandably attractive to investor-backed startups. It leads, however, to a negative externality: a fragmented landscape of tightly guarded silos containing mutually incompatible data (an outcome we have already seen in the citation database market). It is an approach on the wrong side of history.

The real value of altmetrics data, like other Big Data streams, is not in the numbers themselves; the value is in what the community can build on top of the data: a new generation of web-native assessment, filtering, and evaluation tools. [24] In this way, open altmetrics data is quite like the network of open protocols and providers behind the World Wide Web: it is essential infrastructure for a revolutionized communication ecosystem. Impactstory is designed to build and sustain this infrastructure—to create an altmetrics data commons. Our work takes a two-pronged approach, focusing on advocacy and an open source altmetrics webapp, with an eye towards future improvements, including building an open altmetrics data platform.


Advocacy

Our advocacy efforts to date have had two aims: to help scholars and librarians understand web-native scholarship and its importance, and to help them also understand the benefits of altmetrics.

With these goals in mind, we have given talks at universities and conferences, led workshops, published articles in high-profile journals like Nature, and pursued research describing and validating altmetrics. More importantly, though, our advocacy helps shape the altmetrics movement, keeping openness and interoperability central to emerging conversations.

Webapp

The impactstory.org webapp provides users with an online, metrics-driven CV that has several key features.

Diverse metrics for diverse products, with context

[Figure: an Impactstory profile page, with a pop-up giving percentile context for a product’s metrics]

Few researchers create only papers in the course of research. They collect data, maybe write a script to parse and analyze the data, present their findings at a conference using a slide deck, put a preprint up on arXiv to get feedback on an initial draft of a paper describing their findings, and then (finally) publish their paper in a peer-reviewed journal. These outputs all have impacts that leave traces on the web—other researchers will reuse a script, favorite the slide deck for later review, comment on the preprint, and possibly cite the published paper. Yet impact is usually only measured in citations of the published paper.

We think that by measuring only citations, academics are missing the fuller, richer picture. That’s why Impactstory is designed to capture impacts for all research outputs, at all stages of research.


Our webapp finds and displays a variety of metrics across a number of services where web-native research outputs live: Dryad, Figshare, Mendeley, CrossRef, Scopus, GitHub, Slideshare, Vimeo, and more. We report on recommendations, citations, discussions, saves, and views for data, software, papers (both published and unpublished), slide decks, and videos.

The Impactstory webapp also provides important context to raw metrics. We do this by sorting metrics by engagement type (recommendations, citations, discussions, saves, or views) and audience (public or scholars). We also use percentiles: metrics on each product are compared to those of all products of the same age and type across the profiles of all Impactstory users, as in the pop-up box on the illustration above.
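The percentile logic described above can be sketched in a few lines. The following is an illustrative example only (the function and sample data are hypothetical, not Impactstory’s actual code): a product’s raw metric is ranked against the same metric for all products of the same age and type.

```python
# Illustrative percentile context for a raw metric: compare a product's
# metric against the same metric for all products of the same age and
# type across user profiles. A hypothetical sketch, not Impactstory's code.
from bisect import bisect_left

def percentile_rank(value, peer_values):
    """Return the percentile (0-100) of `value` among `peer_values`."""
    ranked = sorted(peer_values)
    return 100.0 * bisect_left(ranked, value) / len(ranked)

# Mendeley saves for a (made-up) cohort of datasets created in 2013:
peer_saves = [0, 1, 1, 2, 3, 5, 8, 12, 20, 41]
print(percentile_rank(12, peer_saves))  # -> 70.0: saved more than 70% of peers
```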

Automatic updates

Users can connect their Impactstory profile to third-party services like ORCID, Figshare, and Slideshare so that any time a new product is added to any of those services, it will be automatically imported to their Impactstory profile. This important feature takes the pain out of updating your CV—less time spent hunting down and formatting the citations of all the scholarly products you created over the past year.

Ability to download and reuse data

Users can download and reuse the data we provide in Impactstory profiles, to the extent allowed by the data providers’ terms of service. Users can download in CSV and JSON formats, and they can also embed their Impactstory profile into other websites just by copying and pasting a few lines of code.
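As a quick illustration of the kind of reuse this enables, the sketch below tallies one engagement type from a downloaded JSON export. The file name and field layout are assumptions made for the example; the real export schema may differ.

```python
# Illustrative reuse of a downloaded Impactstory profile export.
# The file name and field layout here are hypothetical assumptions.
import json

with open("my_impactstory_profile.json") as f:
    profile = json.load(f)

# Tally one engagement type (saves) across every product in the profile:
total_saves = sum(
    product.get("metrics", {}).get("saves", 0)
    for product in profile.get("products", [])
)
print(f"{total_saves} saves across {len(profile.get('products', []))} products")
```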

Notifications


[Figure: a notification email, with cards highlighting new metrics from the past week]

Impactstory users can also receive notification emails, which alert them when their research products have seen activity within the past week. These updates include a number of “cards” that highlight something unique about their new metrics for that week: if they are in a top percentile relative to other papers published that year, or if their PLOS paper has topped 1000 views or gotten new Mendeley readers. Users get a card for each type of new metric one of their products receives.
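A minimal sketch of that card-per-new-metric logic might look like the following (thresholds, field names, and wording are hypothetical, not Impactstory’s actual rules):

```python
# Hypothetical sketch of the weekly notification "cards": one card per
# type of new metric a product received. Thresholds and field names are
# assumptions, not Impactstory's actual rules.
def make_cards(product_name, last_week, this_week):
    cards = []
    for metric, new_total in this_week.items():
        old_total = last_week.get(metric, 0)
        if new_total <= old_total:
            continue  # no new activity of this type this week
        cards.append(f"{product_name}: +{new_total - old_total} {metric}")
        if old_total < 1000 <= new_total:  # e.g. a paper topping 1000 views
            cards.append(f"{product_name} has topped 1000 {metric}!")
    return cards

print(make_cards("My PLOS paper",
                 {"views": 980, "Mendeley readers": 14},
                 {"views": 1012, "Mendeley readers": 17}))
```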

Over time, we are building a webapp that will offer scholars a powerful replacement for their online CVs. In providing this “altmetrics CV,” we hope to support broad, grassroots adoption of altmetrics by working researchers. Hearing about open altmetrics is one thing—but seeing one’s own altmetrics, with the ability to freely download them, is far more powerful.

Future work

More profile-level summarization

How many citations did all of your papers receive last year? What’s the total number of GitHub forks you’ve gotten in your career? How often were your datasets recommended in the previous year?

We’re aiming to provide compelling author-level statistics for our users via profile-level summaries. And we intend to make these metrics useful to profile viewers as well, without venturing into the dangerous “one metric to rule them all” territory that has plagued academics for years in the form of journal impact factors on CVs.
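One way such profile-level summaries could be computed is to roll each product’s per-year metrics up into author-level totals. Everything below (data layout, field names) is an illustrative assumption, not the planned implementation:

```python
# Illustrative profile-level summarization: roll product-level metrics up
# to author-level totals ("citations last year", "GitHub forks to date").
# The data layout here is a hypothetical example.
from collections import defaultdict

products = [
    {"type": "article",  "metrics_2013": {"citations": 4}},
    {"type": "software", "metrics_2013": {"forks": 9, "citations": 1}},
    {"type": "dataset",  "metrics_2013": {"recommendations": 2}},
]

totals = defaultdict(int)
for product in products:
    for metric, count in product["metrics_2013"].items():
        totals[metric] += count

print(dict(totals))  # {'citations': 5, 'forks': 9, 'recommendations': 2}
```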

More complex modelling

Researchers are starting to understand the various “flavors” of impact (what citations mean for the impact of a paper versus what a Mendeley bookmark means; how “forks” on the collaborative software coding website GitHub influence software’s impact; and so on). As we explained on the Impactstory blog (http://blog.impactstory.org/top-5-altmetrics-trends-to-watch-2014/) earlier this year, [25] soon researchers and altmetrics providers will begin to provide:

more network-awareness (who tweeted or cited your paper? how authoritative are they?), more context mining (is your work cited from methods or discussion sections?), more visualization (show me a picture of all my impacts this month), more digestion (are there three or four dimensions that can represent my “scientific personality?”), more composite indices (maybe high Mendeley plus low Facebook is likely to be cited later, but high on both not so much).

Recently, altmetrics researchers have also recognized a need for qualitative research that gets at the motivations behind particular events associated with the use of scholarship (why did this researcher cite this paper? what prompts lay people to post about a study on Facebook?). A notable new company working in this space is SocialCite (http://www.social-cite.org/), which allows readers to indicate whether a citation is appropriate and high-quality, as well as why an article was cited (for evidence, assertions, methods, and so on).

Uncovering the impacts of software

The traditional “coin of the realm” that measures the impact of articles has been citations. What is the coin of the realm for software? Is there more than one coin?

We recently received an NSF EAGER grant to study how automatically-gathered impact metrics can improve the reuse of research software. Over the course of the two-year grant, we’ll improve Impactstory’s ability to track and display the impact of research software. Our webapp will soon include tools to uncover where and how software is downloaded, installed, extended, and used, and will present this information in an easy-to-understand dashboard that researchers can share.
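Some of these reuse signals are already exposed by public sources. As one hedged example (not the grant project’s actual pipeline), the public GitHub REST API reports fork, star, and watcher counts for a repository:

```python
# Illustrative only: pull basic reuse signals for a piece of research
# software from the public GitHub REST API. The endpoint and fields
# (forks_count, stargazers_count, subscribers_count) are real, but this
# sketch is not the NSF project's actual metrics pipeline.
import json
import urllib.request

def github_reuse_signals(owner, repo):
    url = f"https://api.github.com/repos/{owner}/{repo}"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return {
        "forks": data["forks_count"],           # code reuse and extension
        "stars": data["stargazers_count"],      # bookmarking and interest
        "watchers": data["subscribers_count"],  # ongoing attention
    }

# e.g. github_reuse_signals("some-lab", "analysis-pipeline")  # hypothetical repo
```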

We’ll also use quantitative and qualitative approaches to see if this impact data helps promote actual software reuse among researchers.

The long-term goal of the project is big: we want to transform the way the research community values software products. This, in turn, is just one part of the larger transformation of scholarly communication from a paper-native system to a web-native one.

An open altmetrics data platform

Finally, an improved Impactstory API will form the hub of an open data infrastructure connecting dozens of diverse data providers (like Mendeley, Twitter, or Dryad) with a constellation of application developers. Applications include impact-aware PDF readers, institutional repository usage widgets, literature search tools, enhanced citation indexes, faculty profile collections, funding databases, institutional and regional impact assessments, expert identification systems, post-publication peer-review platforms, and recommendation engines—in fact, we’ve had requests for data from projects in each of these categories already. As we improve its scalability, our open API will support an ecosystem in which impact data flows like water among these and other diverse applications, with Impactstory as the “village well” supplying a shared, open, always-on stream of impact data.
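To make the ecosystem concrete, here is how one of those applications—say, an impact-aware reader—might consume such an open API. The endpoint and response shape below are invented for illustration; they are not Impactstory’s documented API:

```python
# Hypothetical consumer of an open altmetrics API. The base URL and
# response fields are assumptions for illustration only.
import json
import urllib.parse
import urllib.request

def fetch_metrics(doi, base_url="https://example.org/api/v1/item"):
    """Fetch open impact metrics for a DOI from a hypothetical API."""
    url = f"{base_url}/doi/{urllib.parse.quote(doi, safe='')}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# An impact-aware PDF reader might badge each reference it displays:
# metrics = fetch_metrics("10.1371/journal.pone.0000308")
# badge = f"{metrics['saves']} saves, {metrics['discussions']} discussions"
```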

Our non-profit is dedicated to promoting open science by building the tools that will provide incentives for researchers who practice it. We are also committed to building an open infrastructure for altmetrics, to keep altmetrics data open and verifiable, allowing innovative services to be built that meet researchers’ needs.

Stacy Konkiel is the Director of Marketing & Research at Impactstory. A former academic librarian, Stacy has written and spoken most often about the potential for altmetrics in academic libraries.

Stacy has been an advocate for Open Scholarship since the beginning of her career, but credits her time at the Public Library of Science (PLOS) with sparking her interest in altmetrics and other revolutions in scientific communication. Before that, she earned dual master’s degrees in Information Science and Library Science at Indiana University (2008). You can connect with Stacy on Twitter at @skonkiel (http://twitter.com/skonkiel).

Heather Piwowar is a cofounder of Impactstory and a leading researcher in research data availability and data reuse. She wrote one of the first papers measuring the citation benefit of publicly available research data (http://www.plosone.org/article/info:doi/10.1371/journal.pone.0000308), has studied patterns in data archiving (http://www.plosone.org/article/info:doi/10.1371/journal.pone.0018657), patterns of data reuse (https://peerj.com/preprints/1/), and the impact of journal data sharing policies (http://researchremix.wordpress.com/2010/10/12/journalpolicyproposal).

Heather has a bachelor’s and master’s degree from MIT in electrical engineering, 10 years of experience as a software engineer, and a Ph.D. in Biomedical Informatics from the University of Pittsburgh. She is a frequent speaker (http://www.slideshare.net/hpiwowar) on research data archiving, writes a well-respected research blog (http://researchremix.wordpress.com/), and is active on Twitter (@researchremix).

Jason Priem is a cofounder of Impactstory and a doctoral student in information science (currently on leave of absence) at the University of North Carolina at Chapel Hill. Since coining the term “altmetrics” (https://twitter.com/jasonpriem/status/25844968813), he’s remained active in the field, organizing the annual altmetrics workshops (http://altmetrics.org/altmetrics12), giving invited talks (http://jasonpriem.org/cv/#invited), and publishing peer-reviewed altmetrics research (http://jasonpriem.org/cv/#refereed).

Jason has contributed to and created several open-source software projects, including Zotero (http://www.zotero.org/) and Feedvis (http://feedvis.com/), and has experience and training in art, design, and information visualization. Sometimes he writes on a blog (http://jasonpriem.org/blog) and tweets (https://twitter.com/jasonpriem).

References

Eysenbach, G. (2012). Can tweets predict citations? Metrics of social impact based on Twitter and correlation with traditional metrics of scientific impact. Journal of Medical Internet Research, 13(4). doi:10.2196/jmir.2012

Habib, M. (2013). Expectations by researchers [Lightning talk]. NISO Altmetrics Initiative meeting, San Francisco, CA. 9 Oct. 2013. http://www.slideshare.net/BaltimoreNISO/niso-lightning-mchabibv3

Haustein, S., & Siebenlist, T. (2011). Applying social bookmarking data to evaluate journal usage. Journal of Informetrics, 5(3), 446–457. http://dx.doi.org/10.1016/j.joi.2011.04.002

Laakso, M., & Björk, B.C. (2012). Anatomy of open access publishing: A study of longitudinal development and internal structure. BMC Medicine, 10(1), 124. doi:10.1186/1741-7015-10-124

Li, X., Thelwall, M., & Giustini, D. (2011). Validating online reference managers for scholarly impact measurement. Scientometrics, 91(2), 1–11. doi:10.1007/s11192-011-0580-x

Morrison, H. (2014). Elsevier STM publishing profits rise to 39%. Imaginary Journal of Poetic Economics. 14 March 2014. http://poeticeconomics.blogspot.com/2014/03/elsevier-stm-publishing-profits-rise-to.html

Neylon, C., & Wu, S. (2009). Article-level metrics and the evolution of scientific impact. PLoS Biology, 7(11).


Nielsen, F. (2007). Scientific citations in Wikipedia. First Monday, 12(8). http://arxiv.org/pdf/0705.2106

Priem, J., Taraborelli, D., Groth, P., & Neylon, C. (2010). Altmetrics: A manifesto. Retrieved October 26, 2010, from http://altmetrics.org/manifesto/

Priem, J., Piwowar, H., & Hemminger, B. (2011). Altmetrics in the wild: An exploratory study of impact metrics based on social media. Presented at Metrics 2011: Symposium on Informetric and Scientometric Research, New Orleans, LA, USA.

Priem, J., & Hemminger, B. M. (2012). Decoupling the scholarly journal. Frontiers in Computational Neuroscience, 6. http://www.frontiersin.org/Computational_Neuroscience/10.3389/fncom.2012.00019/full

Taraborelli, D. (2008). Soft peer review: Social software and distributed scientific evaluation. In Proceedings of the 8th International Conference on the Design of Cooperative Systems (COOP ’08). Carry-le-Rouet, France. http://eprints.ucl.ac.uk/8279/

Wilbanks, J. (2011). Openness as infrastructure. Journal of Cheminformatics, 3(1), 36. doi:10.1186/1758-2946-3-36

Notes

1. Priem, Jason, and Bradley M. Hemminger. “Decoupling the Scholarly Journal.” Frontiers in Computational Neuroscience 6 (2012). doi:10.3389/fncom.2012.00019

2. Laakso, Mikael, and Bo-Christer Björk. 2012. “Delayed Open Access – an Overlooked High-Impact Category of Openly Available Scientific Literature.” Journal of the American Society for Information Science and Technology (preprint). http://hanken.halvi.helsinki.fi/portal/files/1311951/laakso_bj_rk_delay_preprint.pdf

3. Habib, M. (2013). “Expectations by researchers.” Lightning talk at the NISO Altmetrics Initiative meeting, San Francisco, CA. 9 Oct. 2013. http://www.slideshare.net/BaltimoreNISO/niso-lightning-mchabibv3

4. Priem, Jason, Heather A. Piwowar, and Bradley H. Hemminger. “Altmetrics in the Wild: An Exploratory Study of Impact Metrics Based on Social Media.” Metrics 2011: Symposium on Informetric and Scientometric Research. New Orleans, LA, USA. http://jasonpriem.org/self-archived/PLoS-altmetrics-sigmetrics11-abstract.pdf

5. Anderson, Kent. 2013. “UPDATED — 73 Things Publishers Do (2013 Edition).” The Scholarly Kitchen. http://scholarlykitchen.sspnet.org/2013/10/22/updated-73-things-publishers-do-2013-edition/

6. Piwowar, Heather A., and Jason Priem. 2014. “Keeping Metrics Free.” Impactstory Blog. Accessed July 22. http://blog.impactstory.org/24638498595/

7. Morrison, H. (2014). “Elsevier STM publishing profits rise to 39%.” Imaginary Journal of Poetic Economics. 14 March 2014. http://poeticeconomics.blogspot.com/2014/03/elsevier-stm-publishing-profits-rise-to.html

8. Priem, Jason. “List: standalone peer review services.” https://docs.google.com/document/d/1HD-BEaVeDdFjjCNFkb0j3pvwe7MrP3PtE-bWHkkdq7Q/edit#heading=h.uhoilqhqulp8

9. Priem, Jason, Dario Taraborelli, Paul Groth, and Cameron Neylon. 2010. “Alt-Metrics: A Manifesto.” http://altmetrics.org/manifesto/

10. Eysenbach, Gunther. 2006. “Citation Advantage of Open Access Articles.” PLoS Biology 4(5): e157. doi:10.1371/journal.pbio.0040157. http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=1459247&tool=pmcentrez&rendertype=abstract

11. Haustein, Stefanie, and Tobias Siebenlist. 2011. “Applying Social Bookmarking Data to Evaluate Journal Usage.” Journal of Informetrics 5(3): 446–457. http://dx.doi.org/10.1016/j.joi.2011.04.002

12. Li, Xuemei, Mike Thelwall, and Dean Giustini. 2011. “Validating Online Reference Managers for Scholarly Impact Measurement.” Scientometrics 91(2): 1–11. doi:10.1007/s11192-011-0580-x. http://dl.acm.org/citation.cfm?id=2205928.2205953

13. Nielsen, F.Å. 2007. “Scientific Citations in Wikipedia.” First Monday 12(8). http://arxiv.org/pdf/0705.2106

14. Neylon, Cameron, and Shirley Wu. 2009. “Article-Level Metrics and the Evolution of Scientific Impact.” PLoS Biology 7(11).

15. Priem, Jason, and Bradley M. Hemminger. 2012. “Decoupling the Scholarly Journal.” Frontiers in Computational Neuroscience 6. http://www.frontiersin.org/Computational_Neuroscience/10.3389/fncom.2012.00019/full

16. Taraborelli, Dario. 2008. “Soft Peer Review: Social Software and Distributed Scientific Evaluation.” In Proceedings of the 8th International Conference on the Design of Cooperative Systems (COOP ’08). Carry-le-Rouet, France. http://eprints.ucl.ac.uk/8279/

17. http://article-level-metrics.plos.org/

18. Smith, Richard. 2009. “Richard Smith: The beginning of the end for impact factors and journals.” The BMJ Blog. http://blogs.bmj.com/bmj/2009/11/02/richard-smith-the-beginning-of-the-end-for-impact-factors-and-journals/

19. http://www.altmetric.com

20. http://www.plumanalytics.com/

21. More information available at http://50.17.213.175. [Formerly http://www.sciencecard.org/]

22. http://www.dcc.ac.uk/resources/external/readermeter

23. http://www.papercritic.com/

24. Wilbanks, John. 2011. “Openness as infrastructure.” Journal of Cheminformatics 3(1): 36. doi:10.1186/1758-2946-3-36

25. Priem, Jason, and Heather A. Piwowar. 2014. “Top 5 altmetrics trends to watch in 2014.” Impactstory Blog. http://blog.impactstory.org/top-5-altmetrics-trends-to-watch-2014/

Product of Michigan Publishing, University of Michigan Library • [email protected] • ISSN 1080-2711
