Use this information to find top journals in a field to keep on top of the research, or to research publishing opportunities for yourself.
Computer Science Analysis
This tool aims to predict the quality of future conferences (and schools) based on quantitative data from past conferences. The data is solely quantitative, and the authors acknowledge it should be considered alongside qualitative measures. Still, it is an interesting look at citation metrics and their use. Note: the data covers 2007-2016.
Devoted to Computer and Information Science literature, CiteSeerX crawls user-submitted Open Access research and analyzes citation and other metadata to extract information about a paper/journal/conference.
The CORE Conference Ranking is an ongoing activity that provides assessments of major conferences and journals in the computing disciplines. CORE is managed by an Australian professional association.
It's a jungle out there!
Common predatorial tactics
The "publish or perish" culture has turned scholars into prey, and the predators that have arisen from it are publishers looking to cash in on our desire to have our research known. They learn how we assess quality and adjust their tactics accordingly. Do not rely on years-old advice about how to spot predators; always be skeptical. Here are some of their recent tactics:
- Lies about where they're indexed.
  - Researchers use scholarly indexes (ACM, IEEE, Web of Science, and more) to find scholarly publications. Many publishers claim to be indexed there but actually aren't.
  - Check Ulrich's periodical directory to find out exactly where a journal or conference proceeding is indexed.
  - Google Scholar is not a legitimate index for judging whether a publisher is a quality publisher. Like Google, it indexes everything without any regard for quality.
- Over-publishing issues.
  - If they say they're quarterly, do they actually publish only four issues per year? Look closer: a volume may have only four issues, but all of them may have been published in the same month.
- Huge editorial boards, or none at all.
  - Check the CVs of the people listed on the board. Do they mention the role? People who legitimately serve on the board of a legitimate journal will list it on their CVs.
  - I've seen editorial boards with hundreds of members whose CVs mention only publishing in that journal, not serving on its board. I think these publications are getting smarter by at least listing people who have some relationship to the journal (even if it's not editorial).
  - I've also seen blank editorial-board pages, or boards that consist only of people who work for the publisher.
- Lies about membership in industry initiatives and associations.
  - They have learned that we look for this, so they have started claiming memberships they don't actually qualify for. Double-check.
Highly Curated Collections
Science Citation Index Expanded
Web of Science has chosen only the very top journals in science to be part of the Science Citation Index Expanded. When you perform a search, you can limit results to "Highly Cited" or "Hot Papers", by discipline, by funding agency, and more.
Citation metrics measure the importance of a journal by how often its articles are cited by other "important" journals. There are tactics to "game" this system, so use it as only one piece of information when judging journal quality.
InCites Journal Citation Reports (Thomson Reuters)
Objectively determines the relative importance of journals within their subject categories. Information for each title includes the "impact factor" (a measure of how frequently the average article has been cited in a particular year) and the "immediacy index" (how quickly the average article in a journal is cited).
Cabell's Scholarly Analytics
Use "Journalytics" to find pre-vetted quality journals and view their rankings in various other metrics databases (JCR, Altmetrics, etc.). Use the "Predatory Reports" to search for a questionable journal to determine if it has already been deemed "predatory" or "low-quality", and why. Cabell's takes into account the editorial process as well as citation metrics in determining rankings.
Web of Science
Web of Science is a database where you can search for articles and view citation metrics for both the article and the journal in the results. It draws on JCR, the Science Citation Index, and other sources.