Significance: Citations are the most direct measure of the impact of an article.
Caveats: Citation frequency varies by discipline. Some citations can be trivial, for example, an article cited in a long list within a required literature review, or self-citations.
Tip: Finding citations is a brute-force process and is time-consuming. You can search Google Scholar and search within full-text Library databases. Note that by default most databases search only citations and abstracts, so look for the checkbox that says something like "Search within the text." You are looking for your work to be cited in the reference lists of other authors.
Journal platforms will often identify the cites to your work from other articles in their database. For example, if you published with Elsevier, look at your article page in ScienceDirect.
Don’t forget references to your work that could appear in books or dissertations. Search within Ebook Central, Google Books, and ProQuest Dissertations.
Also, set up an author page in Google Scholar. You tell Google what your publications are and it will notify you when something is cited. It is not 100% comprehensive but is still useful. Check it out here: http://scholar.google.com/intl/en-US/scholar/citations.html
Significance: Self-evident. The journal itself will identify if it is peer reviewed, so this is easy to demonstrate.
Caveat: Nearly every publication that claims to be a journal says it is peer reviewed, but this varies in rigor. In an era of proliferating journals, the fact that a journal is peer reviewed is not in and of itself a measure of reputation.
Significance: A possible measure of the exclusivity of the journal.
Caveat: Often more important than acceptance rates is the rigor of peer review and editing.
Tip: Published acceptance rates can be hard to find. For most disciplines, you often must contact the editor.
Significance: Indirectly a measure of reputation and a direct measure of the accessibility of the research.
Caveat: This number is a reliable indicator but doesn't work for recently created journals. For budget reasons, libraries are reluctant to add new journal titles these days.
Tip: Find this information at www.worldcat.org. Do an advanced search that lets you limit by journal as a format. Make sure you are counting all the editions; WorldCat usually provides a record that combines all the editions.
Note: Not all libraries are included in the free version of WorldCat. Sometimes it doesn't matter: for example, if WorldCat indicates that 800 libraries provide access to the journal, but the real number is 950, either way it is a lot of libraries. Sometimes, however, you might want the most accurate number. Please contact David Schoen at schoen@niagara.edu for that data.
Open Access Journals. Journals that are freely available on the web aren't subscribed to by libraries, of course. The journal platform may indicate the number of times that the article has been downloaded. If not, the editor should be able to tell you.
Significance: Inclusion in subject specific indexes is a measure of reputation. Inclusion in the large, multi-disciplinary databases can be an indicator that the journal is core to its discipline. In other words, it is important enough that the multi-subject indexes include it.
Caveat: This number is a reliable indicator but doesn’t work as well for recently created journals. In general, however, if a journal is not covered by the core indexes of its discipline, that tells you something.
Tip: Check the publisher’s web site, where you often find a list of where the journal is indexed. Otherwise, ask David Schoen at schoen@niagara.edu. New open access journals will often not yet be indexed in traditional subject indexes. However, open access journals should be indexed by Google Scholar and the Directory of Open Access Journals (DOAJ).
Significance: One way to measure the reputation of a journal is to measure how often its articles are cited.
Caveat: It can be difficult to understand and explain to others the various methods of estimating impact of a journal. To learn more about the limitations of impact factors take a look at these partisan, though interesting, critiques:
And note the movement called altmetrics, which is intended to broaden the definition of impact:
Tip: Below are some sites that provide impact factors and rank journals. (Increasingly, major journal platforms also provide quantitative measures of impact for each article.)
Impact Factor via ISI Journal Citation Reports (JCR)
A subscription database not owned by NU that ranks journals by subject. Provides a number of metrics, including impact factor, which is a measure of the frequency with which the "average article" in a journal has been cited in a time period. JCR is a useful place to find rankings within a discipline. A high ranking is good, of course, but note that a low ranking isn't bad, because the database is selective in what it ranks. Just being included in JCR is a measure of reputation.
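The arithmetic behind the classic two-year impact factor is simple: citations received in year Y by items published in the two preceding years, divided by the number of citable items published in those two years. The sketch below illustrates the calculation with made-up numbers; the function name and the figures are hypothetical, not data from any real journal.

```python
def impact_factor(citations_in_year, items_published, year):
    """Two-year impact factor for `year`.

    citations_in_year: {pub_year: citations received during `year`
                        by items published in pub_year}
    items_published:   {pub_year: number of citable items published}
    """
    cited = citations_in_year[year - 1] + citations_in_year[year - 2]
    items = items_published[year - 1] + items_published[year - 2]
    return cited / items

# Made-up example: a journal's 2023 impact factor.
citations = {2021: 150, 2022: 210}   # citations received in 2023
items = {2021: 80, 2022: 90}         # citable items published
print(round(impact_factor(citations, items, 2023), 2))  # 360/170 -> 2.12
```

Keep in mind that the real JCR figure depends on which items Clarivate counts as "citable," so hand calculations will rarely match the published number exactly.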
CiteScore (Scopus)
Provides the yearly average number of citations to recent articles published in that journal. It also ranks journals within their disciplines.
Google Scholar Metrics
Ranks the most cited journals in various disciplines.
Source Normalized Impact per Paper
Measures contextual citation impact by "normalizing" citation values, taking a research field's citation frequency into account. It also considers immediacy: how quickly a paper is likely to have an impact in a given field.
SCImago Journal & Country Rank
A publicly available portal that includes the journals and country scientific indicators developed from the information contained in the Scopus database. Journals can be analyzed separately and viewed within a subject ranking.
Scopus
Elsevier's large abstract and citation database of peer-reviewed literature. Inclusion in Scopus is itself selective, and it supplies the underlying data for several of the metrics above (CiteScore, SNIP, and SCImago).
Tip: Sometimes it is more useful to indicate the ranking within a discipline rather than providing a number for citation rates. For example, instead of saying that the American Philosophical Quarterly has an H-Index of 25, it might be more useful to say: The American Philosophical Quarterly is ranked in Quartile 1 by SCImago Journal & Country Rank.
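A quartile label of this kind just records where a journal falls in its subject category's ranked list: the top 25% is Q1, the next 25% is Q2, and so on. A minimal sketch of that placement, with made-up ranks:

```python
def quartile(rank, total):
    """Map a 1-based rank within a subject category to a quartile label.

    The top quarter of the ranked list is Q1, the next quarter Q2, etc.
    """
    q = (rank - 1) * 4 // total + 1
    return f"Q{q}"

# Hypothetical example: a category with 40 ranked journals.
print(quartile(3, 40))   # 3rd of 40 is in the top 25% -> Q1
print(quartile(25, 40))  # 25th of 40 falls in the third quarter -> Q3
```

This is only the general idea; SCImago assigns quartiles per category and per year, so a journal can sit in different quartiles in different categories.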