
Science and Technology Section (STS): Sci Post Repository: Sci Post 2017

This is the repository for STS's Sci Posts

Bibliometrics

Bibliometrics are a way to analyze and measure published works. They are often used to help determine research productivity and research impact, which in turn influence decisions on the hiring and promotion of academics. Needless to say, all bibliometrics should only be used in the appropriate context, usually the academic discipline in which the work appears. Traditional metrics are generally based on citations and other scholarly references, while altmetrics generally involve mentions in social media and downloads. Although there is currently a distinction between traditional metrics and altmetrics, this distinction is likely to disappear over time.

 

Bibliometrics are generally applied to books, journals, articles, and authors. A summary of the different metrics, grouped by the type of work being measured, is given below.

Books
  • Traditional metrics: bibliographies; book reviews
  • Altmetrics: library ownership; Amazon rating

Journals
  • Traditional metrics: Impact Factor; Eigenfactor; H-Index
  • Altmetrics: sum of article altmetrics for the journal; downloads/views; tweets

Articles
  • Traditional metrics: journal of publication; citations
  • Altmetrics: Facebook mentions; Google posts; academic social media shares

Authors
  • Traditional metrics: total number of papers; total number of citations; citations per paper; H-Index
  • Altmetrics: news briefs; blog posts; sum of article altmetrics for the author

 

Increasingly, book chapters can be tracked the same way that articles are. Traditionally, however, the mark of success for a book was to be listed in a bibliography for the discipline and to have the book reviewed in as many journals as possible. Alternatively, we can count how many libraries (academic and public) own a particular book and even look at a book's Amazon rating.

Some journals “count” more or are “ranked” higher; scholars are often encouraged to publish in these "good" journals, and libraries tend to subscribe to these "influential" journals more. Higher-ranked journals are often assumed to be more difficult to publish in and therefore more prestigious, but they do not necessarily have lower acceptance rates than other journals. Impact Factor is one of the main metrics for journals: the average number of citations received in a given year by the papers a journal published during the preceding two years. A lesser-known metric is the Eigenfactor, which uses network analysis to determine a journal's total importance to the scientific community and is greatly influenced by the number of articles in the journal. One metric that is growing in popularity is the H-Index, which is also highly influenced by the number of papers published. Altmetrics for a journal are simply the sum of the altmetrics of each of its articles. You may have seen the donut-shaped Altmetric attention score badge (from altmetric.com) or the Plum Print (from plumanalytics.com), which give the total number of “mentions,” with colors indicating the kinds of “mentions.”
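As a concrete illustration of the two-year Impact Factor calculation described above, here is a minimal sketch in Python. The figures are invented for the example, not taken from any real journal.

    # Hypothetical figures for a fictional journal, used only to illustrate
    # the two-year Impact Factor formula: citations received this year to items
    # published in the two preceding years, divided by the citable items published
    # in those two years.
    citations_2017_to_2015_2016_items = 450   # citations received in 2017
    citable_items_2015_2016 = 180             # articles/reviews published in 2015-2016

    impact_factor_2017 = citations_2017_to_2015_2016_items / citable_items_2015_2016
    print(f"2017 Impact Factor: {impact_factor_2017:.2f}")   # -> 2.50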

Traditionally, journal articles are first evaluated based on the journal in which they are published, and then on the number of citations they have received. Keep in mind that different databases will give different citation counts. Although authors often try to publish in highly ranked journals, being published in a highly ranked journal is no guarantee that one’s article will be read and cited, and many highly cited papers are published in lower-ranked journals. As researchers spend less time browsing journals and are more likely to find articles by searching an indexing database, the potential influence of an individual article is no longer as tied to its journal as in the past. For altmetrics, the main measurement is at the article level, which is why people sometimes equate altmetrics with article metrics. Article altmetrics include downloads/views (both from the publisher’s website and from repositories); social media mentions; shares in Mendeley, academia.edu, or researchgate.net; and more formal news briefs or blog posts about the article.

Traditional author metrics summarize an author's work and influence and are often listed on CVs. They include the total number of papers, the total number of citations, and the average citations per paper. The total and average citations can, however, be greatly influenced by a single paper with an overwhelming number of citations. The H-Index (the number of papers h that have been cited at least h times) gets around this problem somewhat, but it is highly influenced by the number of papers an author has written, so when comparing people one can normalize it by academic age (the years since receiving a PhD). The altmetrics for authors are based on the sum of the altmetrics of their articles.
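For readers who want to see the H-Index definition in action, here is a minimal sketch that computes an author's h from a list of per-paper citation counts. The counts are invented for the example.

    # Minimal sketch: compute an h-index from per-paper citation counts.
    def h_index(citations):
        # Sort counts from highest to lowest; h is the largest rank r such that
        # the r-th paper has at least r citations.
        ranked = sorted(citations, reverse=True)
        h = 0
        for rank, count in enumerate(ranked, start=1):
            if count >= rank:
                h = rank
            else:
                break
        return h

    papers = [52, 18, 11, 9, 6, 6, 3, 1, 0]   # hypothetical citation counts
    print(h_index(papers))                     # -> 6 (six papers cited at least 6 times)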

       

For more information:

Journal Metrics:  https://www.journalmetrics.com/

Impact Factor:  http://admin-apps.webofknowledge.com/JCR/help/h_impfact.htm

Eigenfactor:  http://www.eigenfactor.org/about.php

H-Index:  Hirsch, J. E. (15 November 2005). "An index to quantify an individual's scientific research output". PNAS. 102 (46): 16569–16572.

Article Metrics: http://sparcopen.org/our-work/article-level-metrics/  

 

Author of this post:

Christina Chan-Park, Science Librarian, Baylor University

Subject/Disciplinary Repositories: An Overview of Resources

There is a growing body of freely-available scholarship, both in published and preprint form, being deposited in a diverse universe of repositories.  The resources listed below are provided as tools to locate research of interest to STS librarians.

 

OpenDOAR: Directory of Open Access Repositories

OpenDOAR is “a project to list and categorise academic open access research repositories. The aim is to provide a comprehensive and authoritative list of such repositories for end-users who wish to find particular archives or who wish to break down repositories by locale, content or other measures.” According to the OpenDOAR home page, as of 2014 (when the page was last updated) there were over 3,400 listings.  Users can search for repositories by subject area, content type (e.g., articles, books, datasets, multimedia, theses), country, language, and software, as well as limit the search to discipline-specific repositories.  The contents of repositories are searchable through Google Custom Search.  A list of repositories sorted geographically is also available. OpenDOAR is one of the SHERPA services run by the Centre for Research Communications (CRC).  It is being developed and maintained by the University of Nottingham.

 

Registry of Open Access Repositories (ROAR)

According to the ROAR home page, “The aim of ROAR is to promote the development of open access by providing timely information about the growth and status of repositories throughout the world.” As of this writing, there were 4,545 entries in ROAR.  Users can search for repositories by country, software, repository type, activity, number of records, repository name, LC classification, or age, while browsing can be done by country, year, repository type, or repository software. The contents of repositories are searchable through Google Custom Search. A unique feature of ROAR is the graphing of deposit activity from 2000.  Search results can be exported in a variety of formats, including activity table, Dublin Core, HTML Citation, and RDF.  ROAR is hosted at the University of Southampton.

 

arXiv.org

arXiv.org is a service providing open access to e-prints in the fields of physics, mathematics, computer science, quantitative biology, quantitative finance, statistics, and the recently added fields of electrical engineering and systems science, and economics. Users can browse by subject and search by year, title, author, and/or abstract. An experimental keyword search is available, with a caveat about the currency of results. As of this writing, there were more than 1.3 million e-prints in arXiv.org, which is owned and operated by Cornell University.
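In addition to the web interface, arXiv offers a public API at export.arxiv.org/api/query that returns Atom feeds. The sketch below is only an illustration; the query terms and result limit are example values, not recommendations.

    # Minimal sketch of querying the public arXiv API for e-prints whose title
    # mentions "dark matter"; the query and max_results are example values.
    import urllib.parse
    import urllib.request

    params = urllib.parse.urlencode({
        "search_query": 'ti:"dark matter"',   # ti: = title; au:, abs:, cat:, all: also work
        "start": 0,
        "max_results": 5,
    })
    url = "http://export.arxiv.org/api/query?" + params
    with urllib.request.urlopen(url) as response:
        print(response.read().decode("utf-8"))   # raw Atom XML listing the matches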

 

OSF Preprints

OSF Preprints aggregates search results from other preprint providers and repositories.  Subjects include architecture, arts and humanities, business, education, engineering, law, life sciences, medicine and health sciences, physical sciences and mathematics, and social and behavioral sciences. It uses the Open Science Framework maintained by the Center for Open Science.  As of this writing there were entries for more than 1.2 million preprints, approximately 1,450 of which are housed in OSF Preprints itself and the remainder in the 21 repositories it aggregates.

 

PubMed Central

PubMed Central (PMC) is a full-text archive of journal articles in the biomedical and life sciences deposited by participating journals and of author manuscripts submitted in compliance with public access policies of research funding agencies.  As of this writing, there were 4.5 million articles in PubMed Central, which is hosted at the National Library of Medicine.

 

PubAg

PubAg is a full-text archive of journal articles authored by USDA scientists.  As of this writing, there were 42,000 articles in PubAg, which is produced by the National Agricultural Library.  PubAg also contains more than 1.7 million citations to literature in the agricultural sciences.

 

Disciplinary Repositories (Simmons College)

Simmons College provides a list of disciplinary repositories as part of the Open Access Directory on its wiki.

 

Following is a select list of preprint servers for narrower subjects available as of this writing:

 

  • bioRxiv: The Preprint Server for Biology is a free archive for unpublished preprints in the life sciences. It is operated by Cold Spring Harbor Laboratory and is labeled as “beta”.

  • ChemRxiv: The Preprint Server for Chemistry is a free service for unpublished preprints in chemistry and related areas. It is operated by the American Chemical Society and is labeled as “beta”.

  • Subject-specific Open Science Framework preprint servers, such as PsyArXiv, SocArXiv, and engrXiv (see the Open Science Framework post below).

 

 Further Reading:

 

“STEM Preprint Repositories: Where Are They Now?” Inside Science Resources, August 15, 2017. https://insidescienceresources.wordpress.com/2017/08/15/stem-preprint-repositories-where-are-they-now/.

 

 

Author of this post:

Betty Landesman, Librarian Emerita

 

Social Networking Sites for Researchers


 

Today, there are a growing number of social networking sites available for researchers to share their research. From Mendeley to Academia.edu to ResearchGate, these sites allow users to:

 

1. Disseminate research on the web, which can build the branding and visibility of one's own research in a particular field and can also lead to an increase in readership and citation counts.

 

2. Discover new research topics or find specific research papers that are available through these sites. Users of these social media resources can share ideas, collaborate with new colleagues on similar topics of interest, and foster an online community in these research areas.

 

Similar to Facebook, LinkedIn, and Twitter, these sites support the process of sharing and promoting one's activities. However, Academia.edu, ResearchGate, and similar sites focus on supporting and disseminating research content, with specific features for academic purposes. They are free sites with no academic affiliation, and they encourage interaction between scholars from around the world.

 

Here are some highlights of two popular social networking sites for researchers, Academia.edu and ResearchGate:

 

Academia.edu is a free site that serves almost like a repository. Although its name ends in ".edu," it is a commercial enterprise, not an educational institution. Users can create a profile and account, list a university affiliation, and upload papers or download papers from other profile pages. If a paper has not been uploaded, one can request that the author/contributor send it. The site also keeps track of how many times a paper has been downloaded. One interesting aspect of this site is that it tracks visitors who have viewed your profile and indicates where they are coming from; this kind of tracking can show how your work is being disseminated across different parts of the world. The site is used more heavily by humanities scholars.

 

ResearchGate is a free site where users can create an account and upload their papers. Like Academia.edu, it allows users to find and download other papers that have been uploaded. The site also tracks how many times your profile has been viewed and how many times your papers have been read and cited. The citation tracking can be useful for finding out which papers have cited your work, and you can go directly to those scholars' accounts if they have one. Once an unpublished work has been uploaded, the site can generate a DOI (Digital Object Identifier), which allows users to find the source of the work online more easily and readily; so far only ResearchGate has this feature. The site is used more heavily by scientists and social scientists.

 

These interactive sites can generate visibility for any new scholar or graduate student and can be an innovative way to make new connections and draw on new research. However, they also pose some challenges. Many scholars do not realize that their published works are often subject to an embargo or a copyright agreement with the publisher. Thus, posting newly published articles to these sites is not recommended, as doing so can violate copyright agreements with the publisher or vendor. Universities have already been dealing with some of these challenges and recommend alternative ways to support this kind of sharing.

 

One way is to publish materials in an open access or open source publication, which allows authors to upload their files to these profile sites without legal constraints. Another way is to use institutional repositories (IRs): a growing number of university libraries have their own IRs, and affiliated scholars can deposit their materials into the IR and then link them back to these sites. More and more social networking sites are appearing to capture the attention of busy researchers.

 

It is highly recommended that research and academic librarians proactively engage with scholars by showing the possibilities and limits of these sites. They can be fun to use, but there are challenges to consider when using them.

 

For more information:

Academia.edu:  https://www.academia.edu/

ResearchGate: https://www.researchgate.net/

Google Scholar: https://scholar.google.com/

Mendeley: https://www.mendeley.com/

 

Jordan, K. 2014. "Academics and Their Online Networks: Exploring the Role of Academic Social Networking Sites." First Monday 19 (11). http://dx.doi.org/10.5210/fm.v19i11.4937

 

Author of this post:

Raymond Pun, First Year Student Success Librarian, California State University, Fresno

Open Science Framework

Introduction:

Founded in March 2013, the Center for Open Science (COS) aims to increase the openness, integrity, and reproducibility of research. Several well-known and growing COS products and projects include the Open Science Framework (OSF) platform, the research reproducibility projects in psychology and cancer biology, and SHARE, a one-stop-shopping database that aggregates search results from preprint repositories such as engrXiv, AgriXiv, and bioRxiv.

The following highlights several COS products and initiatives:

Open Science Framework (OSF)

The Open Science Framework (OSF) is a collaborative platform that promotes tools for reproducible research and supports the research and publication cycle from start to finish. OSF links well with other research and publication tools such as Zenodo, GitHub, figshare, Amazon Web Services, Mendeley, and DMPTool.

Within a project team, members can share, analyze, edit, and communicate their research results at various phases of the research cycle by setting limits on what to share publicly and privately. The Center for Open Science maintains a YouTube channel with many good videos to help you quickly grasp the functionality of the various tools; the webinars OSF 101 and Introduction to Preprints are good starting points.

Each project includes many sections, such as a wiki, files, components, and tags, plus an automatic citation generator (MLA, APA, Chicago, etc.). The folder structure allows components to be nested inside other components, and a DOI or ARK can be assigned as a unique identifier to part of a project or to the entire project.
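For programmatic access, OSF also exposes a public REST API (version 2, at api.osf.io). Below is a minimal sketch of fetching the metadata of a public project; the five-character GUID "abc12" is only a placeholder and should be replaced with the identifier of a real public project.

    # Minimal sketch: retrieve metadata for a public OSF project via the OSF v2 API.
    # "abc12" is a placeholder GUID, not a real project.
    import json
    import urllib.request

    guid = "abc12"
    url = f"https://api.osf.io/v2/nodes/{guid}/"
    with urllib.request.urlopen(url) as response:
        node = json.load(response)

    attributes = node["data"]["attributes"]
    print(attributes.get("title"), attributes.get("tags"))   # project title and tags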


 

SHARE

Funded by the Institute of Museum and Library Services (IMLS) and the Alfred P. Sloan Foundation, the SHARE initiative was founded in 2013 by the Association of Research Libraries (ARL), the Association of American Universities (AAU), and the Association of Public and Land-grant Universities (APLU). SHARE, a direct result of the partnership between the ARL and the COS, aims to build “its free, open, data set by gathering, cleaning, linking, and enhancing metadata that describes research activities and outputs—from data management plans and grant proposals to preprints, presentations, journal articles, and research data.”

As of August 22, 2017, SHARE collects metadata from 160 aggregated sources including CrossRef, arXiv, BioMedCentral, Cogprints, PeerJ, CERN Document Server, and many institutional repositories.  Your institution can also register your institutional repository as a SHARE Notify metadata provider.

For more background reading about SHARE, please consult Judy Ruttenberg's article in C&RL News from December 2016.

OSF Preprints:

The Center for Open Science (COS) launched OSF Preprints: The Open Preprint Repository Network in September 2016. Within the SHARE database, OSF Preprints includes search results from institutional repositories, arXiv, PeerJ, Research Papers in Economics, and others, in addition to the branded preprint services hosted by COS (PsyArXiv, SocArXiv, and engrXiv).

Note that the ACRL STS Discovery & Access group recently highlighted some of the STEM arXiv-like preprints in their blog post, STEM Preprint Repositories: Where Are They Now?

References:

Center for Open Science. Introduction to Preprints, 2017. https://www.youtube.com/watch?v=fRvxDVtmQjA.

———. OSF 101 - June 2017, 2017. https://www.youtube.com/watch?v=nnIpcMWC3rc.

STS Science Resources. “STEM Preprint Repositories: Where Are They Now?” Inside Science Resources, August 15, 2017. https://insidescienceresources.wordpress.com/2017/08/15/stem-preprint-repositories-where-are-they-now/.

Ruttenberg, Judy. “SHARE: Infrastructure for Open Scholarship.” College & Research Libraries News 77, no. 11 (December 2016): 547–50. doi:10.5860/crln.77.11.9586.

Author of this post: Khue Duong, Science Librarian, California State University, Long Beach

Open Educational Resources: A Brief Introduction

Open Educational Resources (OER) are becoming more popular in higher education, especially with the development of numerous open textbooks. According to SPARC, Open Educational Resources typically have two characteristics: they are free to use, and they carry an open license, such as a Creative Commons license. David Wiley of Lumen Learning recommends that OER allow the following 5 Rs: Retain, Reuse, Revise, Remix, and Redistribute. Materials that grant these 5 R permissions give educators and librarians the ability to mix chapters from various books and even build upon them.

The enthusiasm for OER has grown recently, due in part to the rising cost of textbooks and the growing interest in the possibility of customizing texts. According to the Bureau of Labor Statistics, the consumer price index for textbooks rose 88% from 2006 to 2016, while the consumer price index for all items rose 21%. Additionally, the College Board states that the average estimated budget for books and supplies is over $1,200 per student.

Librarians are getting more involved with open educational resources by encouraging campuses to use open textbooks, advising on Creative Commons licensing, and even creating open materials. There are a number of resources that librarians can use to gather and review open materials:

 

  • MERLOT: MERLOT links to a variety of open resources that educators and librarians can use for classes. Some of the items included in MERLOT are open textbooks, case studies, assignments, presentations, and quizzes.

  • OER Commons: OER Commons also contains a variety of open resources. It is easy to search and to refine by educational level (for example, higher education).

 

Open textbooks can be a great option for faculty looking to reduce textbook costs and/or try something new. A couple of examples of open textbook sites include:

 

  • OpenStax: OpenStax features over 30 open textbooks, more than 20 of which fall in the sciences and math. Students can access the books online, download a copy, or pay for a print version.

  • Open Textbook Library: Open Textbook Library is a collection of open textbooks from various publishers and universities.  For example, the OpenStax texts are included in the library.  Some of the texts have been reviewed by faculty, which can be helpful when locating and evaluating books.  Additionally, institutions can become members of the Open Textbook Network (OTN) which manages the Open Textbook Library.  The OTN can be an excellent resource for librarians and institutions interested in the open textbook movement.

 

Another helpful resource is the Achieve Rubrics for evaluating OER; the rubrics underlie the evaluation tool within OER Commons.

 

For more information about open educational resources, please visit the sites listed below. This is a growing area that libraries are already supporting and, in some cases, leading.

 

 

Author of this post: Jeanne Hoover, Scholarly Communication Librarian, East Carolina University Libraries