Survey Completed: April 2006
This survey is statistical, and clearly not based on a personal reading of every journal. Moreover, I’m making no claim that I have discovered some infallible way to evaluate philosophy journals. Consequently, there may be journals that are not fairly placed. In the interest of providing the best information possible, I will be happy to supplement this page with additional comments regarding which journals are over- or under-rated by this survey.
Moreover, tastes in journals will obviously vary between philosophers. Personally, I rarely find anything of interest in Mind (Level 3), while I routinely find interesting articles in the Journal of the History of Ideas (Level 9). Similarly, a specialist journal such as New Vico Studies is unlikely to be of much interest to anyone but Vico scholars. I mention this to underline the purpose of this page, which is not to rank journals, but to highlight journals that might otherwise be overlooked. Readers of this page should compare the list with their own evaluations, ignore it when they think it’s wrong, but otherwise use it as a guide to new journals they might find worth reading (or submitting an article to). Certainly some journals are better than others, but ultimately the question is not which journal is the best, but which is best for your work, and that is something this list obviously cannot tell you.
It is in this spirit that I am not providing many details on how the list is compiled. Clearly I think the approach I use has some worth. However, I believe that providing details on the methodology of the list will have more negative than positive effects. I make no claim that the criteria I use are themselves directly indicative of the quality of a journal. Rather, the goal in developing the methodology was merely to “track” quality. Consequently, it is certainly possible for a journal to do well with respect to these criteria without being a good journal. Moreover, the criteria are such that the methodology is open to manipulation. Obviously, this web page doesn’t have such standing that any already-recognised journal will feel the need to adjust its practices so as to appear higher in the survey. However, guiding people to journals of that prestige-level isn’t really the purpose of this list. No-one needs to be told that Philosophy and Public Affairs is a top-quality journal. Rather, the goal of the list is to help people find quality journals with which they were previously unfamiliar. Such journals, however, are precisely those most likely to allow their procedures to be affected by the methodology adopted in a survey such as this. The survey has already gained fairly widespread attention, and a prominent place in the survey is an easy way for journals to secure attention they otherwise would not receive. This is particularly a concern given the interplay of academic and financial concerns in journal publication, as publication of the survey’s methodology could lead to pressure on journal editors from publishers that have less interest in the quality of their journal than in sales. Anyone familiar with the use of the Philosophical Gourmet report (even before its current prominence) by American university deans will understand this rationale.
Ultimately, however, while I agree that this list would be most informative if the methodology were public, I don’t feel any significant harm is caused by keeping it secret. Philosophers can evaluate for themselves whether the survey seems at all reliable, based on its placing of those journals with which they are already familiar. If they decide it is helpful, they can take it themselves from there. If the survey says Journal X is good, they can examine a copy for themselves, or at least get hold of selected articles. If they don’t agree it’s good, they can simply not look at it again. On the other hand, if they agree the journal is good, the survey will have achieved its purpose. It will have introduced someone to a new resource, one that they probably would not even have considered examining if they were simply told “here are all the journals in the world, some are good, go find one.”
There are three aspects of the survey’s methodology that I will highlight, as I don’t believe they are subject to the concerns expressed above. Firstly, the survey only takes into account recent issues of journals (approximately a ten-year period). Thus, a journal’s prestigious history will have no effect at all upon its placing in the survey. Secondly, the list is based upon the rationale that a bigger journal is not necessarily a better one. That is, the list reflects the view that if, for example, one doubles or triples the number of articles in the Journal of Philosophy, indisputably a top-level journal, but only adds articles of mediocre or low quality, then the quality of the journal as a whole will decline – even though it still carries just as many important articles as ever. As a result, while a large journal can still appear at the top of the survey if its articles are consistently notable (e.g. Philosophy and Phenomenological Research), there may be other journals placed lower in the list that also consistently publish top-quality articles, but publish a large number of less notable articles as well.
Finally, it should be remembered that this survey evaluates journals as philosophy journals. As a result, there are very good journals that appear in the lower levels of the survey (e.g., on my personal evaluation, Diogenes and Critical Inquiry), simply because most articles they publish are “philosophical” only in the broadest sense, and bear little relation to work in philosophy as an academic discipline. Nonetheless, they would not be listed here at all if they did not regularly carry articles of interest to professional philosophers.
At the end of the list is a selection of “Other Journals to Consider”. Journals are placed in this section solely because I had inadequate information to evaluate them. It does not indicate that these journals are inferior in any way. I will gradually research these journals and add them to the proper place in the survey.
The country listed after the journal’s name is generally the country in which it is published – hence the large number of Netherlands journals. Where the editorial presence was clearly somewhere other than the place of publication, I have used the nation in which the editors were located.
The primary change from the 2004 edition of the survey is the increased number of levels in which journals have been placed. Even in the space of only two years there has been a significant increase in the information available about journals. Journal websites are more informative, increasing numbers of journals have full-text access online through academic libraries, and the Philosopher’s Index has been actively attempting to fill the “holes” in its database. This improvement in information has let me evaluate a larger number of journals, as well as evaluate them based on better information. The increased number of levels in the survey reflects this increase in available information. In a small number of cases increased information has also led to significant changes in the level at which a journal is placed, as prior information was found to be misleading. In all other cases, however, changes in level reflect changes in the survey’s evaluation of the journal, as new editions are considered and older ones left aside.
I would reiterate, however, that this list is intended solely as a “first instance” guide. It should be used only as a first step in investigating journals, and should not be taken as providing any form of absolute judgment on the quality or value of the journals listed. Judgments of that kind can only be made by individual philosophers based on thorough familiarity with a journal, and never by composite lists such as this one.