This article in the Chronicle of Higher Ed discusses the evolution of the scholarly information cycle, and the fact that research that once took years to reach the public can now be accessed almost instantaneously via social media. The proponents of altmetrics have a manifesto that asks the very question I was pondering while reading the article: How much of the conversation on social media tends toward “empty buzz,” as opposed to significant evaluative discussion? Peer review, which the authors dismiss as too slow, too encouraging of conventional thinking, and insufficient in holding reviewers accountable, is by its nature considered. At least it’s not off the cuff. This makes it less susceptible to being characterized as “empty buzz.”
“Researchers must ask if altmetrics really reflect impact, or just empty buzz. Work should correlate between altmetrics and existing measures, predict citations from altmetrics, and compare altmetrics with expert evaluation. Application designers should continue to build systems to display altmetrics, develop methods to detect and repair gaming, and create metrics for use and reuse of data. Ultimately, our tools should use the rich semantic data from altmetrics to ask ‘how and why?’ as well as ‘how many?'”
Indeed. Altmetrics could be a valuable addition to the toolset for analyzing scholarly significance, but measures of significance derived from altmetrics would need to be viewed alongside traditional measures, such as citation analysis, to get an accurate and full picture.