The authors discuss the value of journal metrics for the assessment of scientific-scholarly journals from a general bibliometric perspective, and from the point of view of creators of new journal metrics, journal editors and publishers. They conclude that citation-based indicators of journal performance are appropriate tools in journal assessment provided that they are accurate, and used with care and competence.
Source: Scientometrics, Online First™, 24 March 2012
Citation-based metrics are appropriate tools in journal assessment provided that they are accurate and used in an informed way
Henk F. Moed, Lisa Colledge, Jan Reedijk, Felix Moya-Anegon, Vicente Guerrero-Bote, Andrew Plume and Mayur Amin
The interest in developing scholarly impact metrics is frequently justified by the need to objectively prioritize scarce resources and to better manage scholarly productivity. However, the study of scholarly communication in general, including scholarly impact metrics, has significant relevance to a number of other scientific domains such as computational social science, social network analysis, web science, and complex systems. In this presentation I will provide an overview of established scholarly impact metrics, grounding each in their respective scientific traditions and backgrounds. Changes in scholarly communication patterns, including the move to online environments and the increasing use of social media, have recently prompted a Cambrian explosion of new impact metrics derived from new data sources. These metrics may reflect previously unexplored facets of scholarly communication and impact, and may thus yield a more complete picture of scholarly communication. In my presentation I will provide an overview of these new metrics, and identify the opportunities as well as challenges that they present.
A: Citation-based social metrics:
Random walk; PageRank/Eigenvector
Shortest path; Closeness/Betweenness
B: Usage-based metrics:
The scholarly community and its communication are moving online.
Data pertaining to online activities (implicit, behavioral) vs. citation data (explicit declarations of influence)
C: AltMetrics: Behavioral AND “attention” data
Social media attention, bookmarking, mentions. Attempt to also capture “social” attention or public impact of scholarly work
Source: Metrics Session: An overview of scholarly impact metrics. Presented by Johan Bollen on 19 Jun 2013, from 15:30 to 16:00. Session: Plenary 2.
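The network measures listed under A can be made concrete with a minimal sketch: a pure-Python power-iteration PageRank run on a tiny citation graph. The graph, the damping factor d = 0.85, and the iteration count are illustrative assumptions for demonstration only, not values from the talk.

```python
# Minimal power-iteration PageRank on a hypothetical citation graph.
# Papers that are cited by highly-ranked papers accumulate rank themselves.

def pagerank(links, d=0.85, iters=100):
    """links: dict mapping each paper to the list of papers it cites."""
    nodes = set(links) | {t for ts in links.values() for t in ts}
    n = len(nodes)
    rank = {p: 1.0 / n for p in nodes}          # uniform starting rank
    for _ in range(iters):
        new = {p: (1 - d) / n for p in nodes}   # teleportation term
        for src in nodes:
            targets = links.get(src, [])
            if targets:                          # share rank among citations
                share = d * rank[src] / len(targets)
                for t in targets:
                    new[t] += share
            else:                                # dangling node: spread evenly
                for t in nodes:
                    new[t] += d * rank[src] / n
        rank = new
    return rank

# Hypothetical citation graph: paper -> papers it cites.
citations = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
ranks = pagerank(citations)
```

Here paper C, which receives citations from A, B, and D, ends up with the highest rank. The shortest-path measures in the same list (closeness, betweenness) are computed on the same graph structure but reward brokerage positions rather than accumulated endorsements.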
One of the interesting articles in the latest issue of College & Research Libraries (September 2012) is this article on the development and results of a recent survey of academic librarians about their attitudes, involvement, and perceived capabilities using and engaging in primary research. The purpose of the survey was to inform the development of a continuing education program in research design.
"This article contributes a new perspective on the topic of how librarians think of their own abilities to conduct research with the introduction of a confidence scale and opens a line of inquiry for possible future research activities related to self-efficacy and research productivity..."
"Two other potentially profitable research agendas are identified in this article: defining a research culture in a library setting and performing a systematic review of published academic librarian practitioner-researchers to learn how to replicate their success."
" . . . Mathews’ white paper “Think Like a Startup” makes a compelling case that within 20 years many of the modern academic libraries’ services will be housed and run by other units across campus. Therefore, Mathews argues academic libraries need to forge new partnerships across campus, discover new ways to create value for their users, and experiment with radical new approaches to solving their most pressing needs. . . ."
This case study will be of interest to libraries considering a shift from traditional reference service to one that provides in-depth, personalized service; it discusses the choices made and the issues weighed in such a paradigm shift.
Eliminating traditional reference desk services requires careful analysis and planning, as well as adequate, ongoing training of the desk staff on the front lines of service. The study indicates that the new service model, built around longer and more in-depth consultations, will require librarians' expertise. The elimination of walk-in specialized reference assistance has not negatively affected overall reference statistics; some reference and instructional statistics actually improved. The study concludes, however, that each institution must balance its organizational needs with those of its customers when designing future services.
Since the 1960s, citation counts have been the standard for judging scholarly contributions and status, but growing awareness of the strategy's limitations should lead to acceptance of alternative metrics. Drawbacks of citation analysis include lack of timeliness; self-citation; and citations that are superfluous, negative, or incomplete. Traditional counts also reflect only a small fraction of actual usage. A better categorization of scholarly impact would cover usage, captures, mentions, and social media in addition to citations. Metrics should include mentions in blogs and other nontraditional formats, open review forums, electronic book downloads, library circulation counts, bookmarks, tweets, and more. Such alternative metrics provide a more complete view of peer response to scholarly writings and better demonstrate the relative standing of a research grant applicant and the potential for influential work. Altmetrics are readily available, and their value for evaluating scholarly work should be recognized.
The authors write: "There is a temptation to see this new paradigm for measuring impact as a passing fad: interesting, but too early, or simply not serious with regard to scientific research. The question arises: Does the process for granting tenure need to be changed in order for these measures to be accepted? A better question is why a demonstrably sub-standard process whose faults and drawbacks are so well known has persisted for so long. The easy answer is that it is all we have had for five decades, but the truth is that decision-makers want quantifiable data for making decisions. Promotion, hiring and grant funding processes will continue to evolve, but those changes will not be prerequisites for including more holistic measurements."
Special Section: What, Why And Where?
Are alternative metrics still alternative? By Mike Buschman (formerly a librarian and program manager at Microsoft) and Andrea Michalek (a serial entrepreneur focused on search and information-retrieval products), co-founders of Plum Analytics.
Bulletin of the American Society for Information Science and Technology
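The categorization the article argues for (usage, captures, mentions, and social media, alongside citations) can be sketched as a simple event aggregation. The event names and the mapping below are illustrative assumptions for demonstration, not Plum Analytics' actual schema.

```python
# Sketch: rolling raw events for one article up into the five impact
# categories discussed above. Event names are hypothetical examples.

CATEGORY_OF = {
    "html_view": "usage", "pdf_download": "usage", "library_loan": "usage",
    "bookmark": "captures", "favorite": "captures",
    "blog_post": "mentions", "news_story": "mentions", "review": "mentions",
    "tweet": "social_media", "share": "social_media", "like": "social_media",
    "citation": "citations",
}

def summarize(events):
    """events: iterable of (event_type, count) pairs for one article.

    Returns a dict of per-category totals; unknown event types fall
    into an "other" bucket rather than being silently dropped.
    """
    totals = {}
    for kind, count in events:
        cat = CATEGORY_OF.get(kind, "other")
        totals[cat] = totals.get(cat, 0) + count
    return totals

profile = summarize([
    ("pdf_download", 40), ("tweet", 12), ("citation", 3), ("bookmark", 5),
])
```

Keeping citations as just one category among five is the design point: the resulting profile shows kinds of attention (downloads, bookmarks, tweets) that a bare citation count would miss entirely.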
To stay robust and relevant, academic libraries may need to abandon hands-on collection development and big deal subscription packages in favor of patron-driven acquisitions (PDA), open access, and curation of campus specialties.
David W. Lewis, dean of the Indiana University-Purdue University Indianapolis (IUPUI) University Library, in his article entitled "From Stacks to the Web: the Transformation of Academic Library Collecting", predicts that the academic library world will radically restructure itself in the next eight years. He forecasts that by 2020, effectively all content delivery will have become digital.
He suggests, "If academic libraries are to be successful, they will need to: deconstruct legacy print collections; move from item-by-item book selection to purchase-on-demand and subscriptions; manage the transition to open access journals; focus on curating unique items; and develop new mechanisms for funding national infrastructure."