Altmetrics in the Wild: Towards Creating a Live CV

[Image from Research Remix]

The more scholars move their work online from where it was once ephemeral and hidden, and the more they integrate social media into their communication, the closer we come to telling what value they themselves add to their content – and to blending these once-isolated signals into a distinctive taste, a flavor.

Jason Priem, whose talk on finding an n-dimensional impact space I recently examined on our blog, and Heather Piwowar (Research Remix), who studies the behavior of shared article clusters and post-publication datasets, together with Bradley M. Hemminger, have just presented a preprint of their manuscript, Altmetrics in the Wild: Using Social Media to Explore Scholarly Impact.

“Articles cluster in ways that suggest different impact flavors,” they suggest in their work, which samples more than 20,000 articles in search of a tool complementary to traditional bibliometrics – one that measures process instead of simply counting product, and that adds a rich scale to the product instead of simply keeping count.

From the abstract: “In growing numbers, scholars are integrating social media tools like blogs, Twitter, and Mendeley into their professional communications. The online, public nature of these tools exposes and reifies scholarly processes once hidden and ephemeral. Metrics based on this activity could inform broader, faster measures of impact, complementing traditional citation metrics. Alternative metrics,” Priem et al. explain later on, “or “altmetrics” build on information from social media use, and could be employed side-by-side with citations — one tracking formal, acknowledged influence, and the other tracking the unintentional and informal “scientific street cred”. The future, then, could see altmetrics and traditional bibliometrics presented together as complementary tools presenting a nuanced, multidimensional view of multiple research impacts at multiple time scales.”

Answers Provided

Is there in fact enough data available to construct meaningful metrics?

Although only an estimated 2.5% of scholars actively use Twitter, this number is growing steadily, and around 95% of sampled articles from the arXiv preprint repository have been tweeted. Ten to twelve percent of articles had been bookmarked on Delicious, tweeted, shared on Facebook, or had received a comment on the PLoS website. About 7.5% of articles were the topic of a blog post or had received an F1000 rating. About 5% of articles had been cited on Wikipedia, liked on Facebook, or commented on via Facebook.*

Do social media really host scholarship, or just idle chatter?

Several studies have presented evidence of scholars using Twitter to enrich academic conferences as well as to cite scholarly literature; as many as one third of tweets from scholars contain scholarly content.*

How is data distributed across tools and users?

Different communities show different levels of social media adoption and readership.*

How are altmetrics distributed over time?

Citations, pageviews, and Wikipedia citations all show linear relationships between article age and number of events; the older an article is, the more events it has, suggesting that events are accumulating steadily (if more slowly) over time.*

Social reference managers CiteULike and Mendeley, as well as Delicious bookmarks and F1000 ratings, fall into a second distribution, in which the number of events is relatively unaffected by articles’ ages.*
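
One hedged way to probe this distinction (the column names and cutoff date below are assumptions, not the authors' code) is to correlate each metric's event count with article age; metrics that accumulate steadily should show a clearly positive relationship, while burst-like, age-independent metrics should not:

```python
# Hypothetical sketch: how strongly does each metric grow with article age?
# A near-zero correlation suggests an age-independent, burst-like metric.
import pandas as pd

df = pd.read_csv("article_altmetrics.csv")  # assumed per-article event counts
df["age_days"] = (pd.Timestamp("2012-01-01")
                  - pd.to_datetime(df["publication_date"])).dt.days

for metric in ["citations", "html_views", "wikipedia_citations",
               "mendeley_saves", "citeulike_saves", "delicious_bookmarks"]:
    r = df["age_days"].corr(df[metric])     # Pearson correlation with age
    print(f"{metric:22s} r = {r:+.2f}")
```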

How do altmetrics relate to one another and to traditional citations?

To examine the intercorrelation among indicators, we turned to factor analysis.
After examining the relative loadings of variables on the factors post-hoc, we interpreted and named the six impact factors as impact signal reflected through: 1) citations, 2) page views and shares, 3) Facebook-hosted discussion, 4) PLoS-hosted comments, 5) social reference manager saves, and 6) PDF downloads.*

The factors must be interpreted in the context of the other factors, in the order of factor extraction. Citation has become a gold standard for impact in both theory and practice; it is important to examine altmetrics against this existing standard.*
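
As a rough sketch only (the paper does not publish its code, and the metric columns and input file here are assumptions), a six-factor analysis of this kind could be run as follows:

```python
# Hypothetical sketch: factor analysis over per-article altmetric counts.
# Column names mirror the metrics discussed in the paper; the CSV is assumed.
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

metrics = ["citations", "html_views", "pdf_downloads", "facebook_shares",
           "facebook_comments", "plos_comments", "mendeley_saves",
           "citeulike_saves", "delicious_bookmarks", "tweets"]

df = pd.read_csv("article_altmetrics.csv")           # assumed input file
X = StandardScaler().fit_transform(df[metrics])      # z-score each metric

fa = FactorAnalysis(n_components=6, random_state=0)  # six factors, as in the paper
scores = fa.fit_transform(X)                         # per-article factor scores

# Inspect loadings to interpret and name each factor post hoc.
loadings = pd.DataFrame(fa.components_.T, index=metrics,
                        columns=[f"factor_{i + 1}" for i in range(6)])
print(loadings.round(2))
```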

Can we cluster articles of different impact types using altmetrics?

We chose five metrics to be representative of impact metric dimensions: 2011 Web of Science citation counts, Mendeley saves, HTML page views, F1000 score, and a composite we call shareCombo. We constructed the shareCombo variable to keep collinearity and number of clustering dimensions low; it consists of the aggregation of events from four sources: Facebook shares, Delicious bookmarks, Blog mentions, and mentions on Twitter.*
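
Purely as an illustrative sketch (the column names, the input file, and the cluster count are assumptions, not the authors' published code), the shareCombo composite and a simple k-means clustering over the five metrics might look like this:

```python
# Hypothetical sketch: build the shareCombo composite and cluster articles
# on the five representative metrics.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("article_altmetrics.csv")  # assumed per-article event counts

# shareCombo aggregates four sharing-type event sources into one variable,
# keeping collinearity and the number of clustering dimensions low.
df["shareCombo"] = (df["facebook_shares"] + df["delicious_bookmarks"]
                    + df["blog_mentions"] + df["tweets"])

features = ["wos_citations_2011", "mendeley_saves", "html_views",
            "f1000_score", "shareCombo"]
X = StandardScaler().fit_transform(df[features])

# The number of clusters is chosen here only for illustration.
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)
df["impact_flavor"] = km.labels_

print(df.groupby("impact_flavor")[features].mean().round(1))
```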

The clusters of impact patterns could be considered the “impact flavor” of the research article. The goal, however, is not to compare flavors: one flavor is not objectively better than another. Rather, recognizing different types of contributions might help us appreciate scholarly products for the particular needs they meet.*

*All answers are selected from the paper

Towards Creating a Live CV

So far, only one type of research product has been measured – the peer-reviewed article – leaving out datasets and failing to relate articles to other articles, to readers, and to other datasets, even as tools that gather altmetrics emerge rapidly, such as PLoS Article-Level Metrics, ReaderMeter, CitedIn, total-impact, altmetric.com, and ScienceCard. “Much work to expand this research,” Priem et al. say in the paper, “will center around reducing noise that obscures the impact signal – or, more accurately, around isolating and identifying different types of impacts on different audiences. In the future, tools like these may allow researchers to keep a “live CV” showing up-to-date indicators of their works’ impact next to each product they have produced. Funding organizations could keep timely portfolios of their grants’ impact across different populations and communities. Tenure, promotion, and hiring committees could assemble research teams whose members have different sets of “impact specialties”.”

Bearing in mind what Kathleen Fitzpatrick said when trying to redefine authority in authorship: beyond teaching machines to make the technological change for us, the harder challenge may be achieving the social, intellectual, and institutional change needed to take this step. Technology, however, is not separate from the flow of thought, and authors may soon develop a different relationship with their texts than simply producing them.
