Documentation of the File Drawer Problem at Finance Conferences: A Follow-Up Study

  • Manoela N. Morais and Matthew R. Morey
  • Journal of Investing
  • A version of this paper can be found here
  • Want to read our summaries of academic finance papers? Check out our Academic Research Insight category

What are the research questions?

This research is an update to “Documentation of the File Drawer Problem in Academic Finance Journals,” published by the same authors in the Journal of Investment Management in 2018. A summary of that article can be found here. The “file drawer problem” refers to the idea that journal editors are predisposed to accepting articles for publication only if they contain statistically significant results. Since editors are motivated by improving journal impact numbers and citation counts, this bias is not surprising: articles with significant results are more likely to be cited and thus improve journal impact. Articles with nonsignificant results end up hidden away in the researchers’ file drawer and are not submitted anywhere at all. Putting numbers to the problem, the authors reported that only 2.1% of the articles published across 29 finance journals contained nonsignificant results, and five of those 29 journals published no such studies at all. This update examines the degree to which finance conferences exhibit a similar pattern.

  1. Is there a significant file drawer problem with respect to academic financial conferences?

What are the Academic Insights?

  1. YES. The file drawer problem was observed to be at least as serious at finance conferences as it is in finance journals. The authors constructed a database of 3,425 empirical articles presented at the annual Financial Management Association (FMA) conference over five years; the FMA meeting is the largest academic finance conference by number of papers presented. Each paper examined was a stand-alone research article; roundtables, panel sessions, pedagogy series, and debates were not included. Of the 3,425 articles, only 14 (or 0.41%) had nonsignificant results over the five-year period, compared with the 2.1% of articles published in academic journals. It also appears that the problem within the FMA intensified between 2014 and 2018. Stunning. A quick back-of-the-envelope comparison of these shares is sketched below.
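
For readers who want the arithmetic spelled out, here is a minimal Python sketch that reproduces the headline shares quoted above. The counts (14 of 3,425 conference papers; the 2.1% journal rate) come from the papers; the comparison ratio at the end is simple arithmetic on those figures, not a statistic the authors report.

```python
# Back-of-the-envelope check of the shares quoted in the summary above.
nonsig_conference = 14      # FMA papers with nonsignificant results, 2014-2018
total_conference = 3425     # all empirical FMA papers examined

conference_rate = nonsig_conference / total_conference
journal_rate = 0.021        # 2.1% rate reported for the 29 finance journals

print(f"FMA conference rate: {conference_rate:.2%}")   # ~0.41%
print(f"Journal rate:        {journal_rate:.2%}")      # 2.10%
# Rough ratio implied by the two numbers (our arithmetic, not the paper's):
print(f"Journals publish nonsignificant results roughly "
      f"{journal_rate / conference_rate:.0f}x as often as the FMA conference")
```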

Why does it matter?

As with journal publications, this article provides evidence that the file drawer problem is alive and well at academic finance research conferences. It appears that potential presenters should avoid submitting analyses with nonsignificant results or risk rejection by the conference. As a result, conference attendees see a biased set of research presentations made up only of papers that exhibit statistical significance. The important question here is how much this bias contributes to the use of p-hacking or data-mining practices in order to achieve significant results. We have seen increasing attention paid to the practice of p-hacking, data mining, and other “bad habits” and the negative impact they have on the credibility of the discipline.

In his 2017 Presidential Address to the American Finance Association, Campbell Harvey took the issue one step further, into the intentional misuse of statistics in empirical research. He defines intentional p-hacking as the practice of reporting only significant results when the researcher has tried a myriad of statistical methods, empirical approaches, or data manipulations. The underlying motivation for such practices is the desire to be published in a world where finance journals are biased toward publishing significant results almost exclusively. The underlying risk of p-hacking and data mining, especially in the investments area, is the identification of significant results when they are likely just random events. Since random events, by definition, do not repeat themselves in a predictable manner, the investment results are likely to fail on a going-forward basis. Data mining and p-hacking go a long way toward explaining why investment strategies fail out-of-sample or, even worse, when they are implemented in the real world.
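
To make the mechanism concrete, here is a minimal illustrative simulation (ours, not the paper's or Harvey's, with arbitrary assumed parameters such as the number of strategies and a 4% monthly volatility): it generates many pure-noise "strategies" and shows that a conventional significance threshold still flags roughly 5% of them by chance. Reporting only those winners is exactly the selective reporting described above, and none of them should be expected to work out-of-sample.

```python
import numpy as np

# Illustrative sketch: why trying many specifications and reporting only the
# "significant" ones produces spurious findings.
rng = np.random.default_rng(42)

n_strategies = 1000   # hypothetical number of specifications a researcher tries
n_months = 120        # ten years of monthly returns per strategy
t_threshold = 1.96    # conventional 5% two-sided significance cutoff

# Every "strategy" is pure noise: zero true mean return, 4% monthly volatility.
returns = rng.normal(loc=0.0, scale=0.04, size=(n_strategies, n_months))

# t-statistic of the mean return for each strategy.
t_stats = returns.mean(axis=1) / (returns.std(axis=1, ddof=1) / np.sqrt(n_months))

false_positives = np.sum(np.abs(t_stats) > t_threshold)
print(f"{false_positives} of {n_strategies} pure-noise strategies look 'significant'")
# Expect roughly 5% (about 50) to clear the bar by luck alone.
```

The design point is simple: the more specifications a researcher is free to try, the more "significant" results chance alone will hand them, which is why Harvey and others argue for higher evidentiary hurdles when many tests have been run.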

This criticism can now be extended to finance conferences.

The most important chart from the paper

The results are hypothetical results and are NOT an indicator of future results and do NOT represent returns that any investor actually attained.  Indexes are unmanaged and do not reflect management or trading fees, and one cannot invest directly in an index.

Abstract

The file drawer problem is a publication bias where journal editors are much more likely to accept empirical papers with statistically significant results than those with statistically nonsignificant results. As a result, papers that have nonsignificant results are not published and relegated to the file drawer, never to be seen by others. In a previous paper, Morey and Yadav (2018) examined the file drawer problem in finance journals and found evidence that strongly suggests that such a publication bias exists in finance journals. In this follow-up study, we examine the prevalence of the file drawer problem at finance conferences. As such we are the first article in finance that we know of to attempt such an analysis. To do this, we examine every single empirical paper presented at the annual Financial Management Association (FMA) conference from 2014–2018. In an examination of 3,425 empirical papers, we found less than 0.5% of these papers had statistically nonsignificant results. These results suggest that there is also a significant file drawer problem at finance conferences.

About the Author: Tommi Johnsen, PhD

Tommi Johnsen is the former Director of the Reiman School of Finance and an Emeritus Professor at the Daniels College of Business at the University of Denver. She has worked extensively as a research consultant and investment advisor for institutional investors and wealth managers in quantitative methods and portfolio construction. She has taught at the graduate and undergraduate levels and published research in several areas, including capital markets, portfolio management and performance analysis, financial applications of econometrics, and the analysis of equity securities. In 2019, Dr. Johnsen published “Smarter Investing” with Palgrave/Macmillan, a top-10 book in business sales for the publisher. She received her Ph.D. from the University of Colorado at Boulder, with a major field of study in Investments and a minor in Econometrics. Currently, Dr. Johnsen is a consultant to wealthy families/individuals, asset managers, and wealth managers.

Important Disclosures

For informational and educational purposes only and should not be construed as specific investment, accounting, legal, or tax advice. Certain information is deemed to be reliable, but its accuracy and completeness cannot be guaranteed. Third party information may become outdated or otherwise superseded without notice.  Neither the Securities and Exchange Commission (SEC) nor any other federal or state agency has approved, determined the accuracy, or confirmed the adequacy of this article.

The views and opinions expressed herein are those of the author and do not necessarily reflect the views of Alpha Architect, its affiliates or its employees. Our full disclosures are available here. Definitions of common statistics used in our analysis are available here (towards the bottom).
