Ed Altman is simply a legend.

Everyone in finance is familiar with the Altman Z-Score, and many analysts use it religiously, despite 50 years' worth of research suggesting there are better methods. The Altman Z-Score kinda reminds me of the CAPM: the CAPM has been shown to be not very effective, and yet everyone still uses it.

Nonetheless, Prof. Altman single-handedly created an empire based on his original 1968 research on predicting corporate bankruptcy.

Prof. Altman went ahead and outdid himself with his new ZETA score:

The point of this post is to highlight some recent research that concocts a model that predicts better than the Altman-Z AND is relatively straightforward to implement (at least relative to the ZETA score, which is proprietary and involves a lot of legwork to reverse engineer).

Before we begin, here are some recent models showing that Altman-Z is dead:

- Shumway (2001)
    - Shows that half of the variables included in Altman-Z are no longer predictive of bankruptcy. Ouch.
    - He also identifies a model that trumps the Altman-Z in out-of-sample tests. Double ouch.
- Chava and Jarrow (2004)
    - Verify with expanded data that Altman's model sucks vs. Shumway's model.
    - Determine that controlling for industry effects can significantly improve bankruptcy prediction models.

## What does this paper add to the party?

The authors devise a model that improves forecast accuracy by 16% compared to the Shumway and Chava and Jarrow models (which in turn means it is *way* better than Altman-Z).

Here is how they set up their prediction variables:

- profitability (NIMTAAVG) = weighted average of (quarterly net income / MTA)^(1)
    - MTA = market value of total assets = book value of liabilities + market cap
- leverage (TLMTA) = total liabilities / MTA
- short-term liquidity (CASHMTA) = cash & equivalents / MTA
- recent relative performance (EXRETAVG) = weighted average of (log(1 + stock's return) - log(1 + S&P 500 return))
- recent volatility (SIGMA) = annualized standard deviation of the stock's returns over the previous 3 months
- relative size (RSIZE) = log(stock's market cap / S&P 500 total market value)
- overvaluation (MB) = MTA / adjusted book value, where adjusted book value = book value + 0.1 * (market cap - book value)
- stock price level (PRICE) = log(recent stock price), capped at $15, so a firm with a stock price of $20 is given a value of log(15) instead of log(20)
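
To make the list concrete, here is one way to sketch the variable construction in Python. This is a single-period simplification (the paper's profitability and relative-performance terms are trailing weighted averages), and the function and argument names are mine, not the paper's:

```python
import math

def distress_inputs(book_liabilities, market_cap, net_income_q, cash,
                    stock_ret, sp500_ret, ann_vol, sp500_mktval,
                    book_value, price):
    """Build the eight predictors from raw inputs (illustrative sketch)."""
    mta = book_liabilities + market_cap          # market value of total assets
    adj_book = book_value + 0.1 * (market_cap - book_value)
    return {
        "NIMTAAVG": net_income_q / mta,          # profitability
        "TLMTA": book_liabilities / mta,         # leverage
        "CASHMTA": cash / mta,                   # short-term liquidity
        "EXRETAVG": math.log(1 + stock_ret) - math.log(1 + sp500_ret),
        "SIGMA": ann_vol,                        # 3-month annualized volatility
        "RSIZE": math.log(market_cap / sp500_mktval),
        "MB": mta / adj_book,                    # overvaluation
        "PRICE": math.log(min(price, 15.0)),     # stock price, capped at $15
    }
```

Note that RSIZE comes out negative for any firm smaller than the whole S&P 500, and PRICE hits its cap at log(15) for any stock trading above $15.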

Here is a table comparing the stats on the various measures above for the universe of firms and for firms that have failed:

It certainly seems that the variables are capturing something regarding failed firms!

So how do we go about predicting bankruptcy/failure? Behold the magical distress formula:

After running some logistic regressions the authors determine that their model beats the best alternatives out there over a variety of time periods:

Here are the results from the logistic regression:

If you forgot to get a PhD in some quantitative field and you are not sure how to use these regression results, here is a quick background on logistic regressions:

- Calculate the logit:
    - logit = -8.87 - 20.12*NIMTAAVG + 1.60*TLMTA - 7.88*EXRETAVG + 1.55*SIGMA - 0.005*RSIZE - 2.27*CASHMTA + 0.07*MB - 0.09*PRICE
- Convert the logit to a probability:
    - P = probability of failure = 1/(1 + exp(-logit))
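
The two steps above translate directly into code. Here is a minimal Python sketch using the coefficients quoted in this post; the sample inputs in the test are illustrative:

```python
import math

# Coefficients as quoted in this post (from the authors' logistic regression).
COEFFS = {
    "const": -8.87,
    "NIMTAAVG": -20.12,
    "TLMTA": 1.60,
    "EXRETAVG": -7.88,
    "SIGMA": 1.55,
    "RSIZE": -0.005,
    "CASHMTA": -2.27,
    "MB": 0.07,
    "PRICE": -0.09,
}

def failure_probability(x: dict) -> float:
    """Map the eight predictor values to a probability of failure P."""
    logit = COEFFS["const"] + sum(COEFFS[k] * v for k, v in x.items())
    return 1.0 / (1.0 + math.exp(-logit))
```

One sanity check: for a typical firm the logit comes out strongly negative and P is a small fraction of a percent. That is expected behavior, not a bug; bankruptcies are rare events, so what matters is the cross-sectional ranking of P, not its absolute level.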

Once you have calculated your P's, you are ready to predict which firms are likely to go bankrupt. So how might one use this information? Well, if you could predict which firms are going bankrupt (assuming the market hasn't already priced in a bankruptcy), you might be able to make some money. Let's see.

Here are results of sorting portfolios based on “P” every January and holding stocks for one year.

Pretty wild, eh? Stocks with low probabilities of failure do well, but from a statistical standpoint, it is hard to say if there is any alpha. However, check out the high probability of default firms–wow–negative alphas galore!

We’re getting closer to affording our gold-plated Mercedes!

**Investment Strategy**

Strategy 1:

- Stop using Altman-Z to predict bankruptcy

Strategy 2:

- Calculate logits using the coefficient estimates from Table 2.
- Calculate the probability of bankruptcy (P from above) using the logits calculated in step 1.
- Go long low-P firms, and WAY short high-P firms.
- Make money.

To juice up strategy 2:

- Focus on distressed firms in the extreme growth and value categories (see below)
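
The portfolio-formation step of strategy 2 can be sketched as follows; the 10% bucket size is my assumption for illustration, not the paper's sort:

```python
def long_short_buckets(probs, frac=0.10):
    """Rank firms by failure probability P; long the lowest-P bucket,
    short the highest-P bucket (a sketch of strategy 2 above)."""
    ranked = sorted(probs, key=probs.get)        # safest firms first
    n = max(1, int(len(ranked) * frac))
    return ranked[:n], ranked[-n:]               # (longs, shorts)
```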

**Commentary:**

I've been playing with the 'financial quality' variables for one of our in-house 'value factor' models recently (every quant shop has to have one, right?). I'll admit it: we were using Altman-Z as a component of that variable, but after doing some research I'm sh&$canning Altman-Z and going to test this model out instead. No disrespect to Prof. Altman, but when the evidence shows something works better, why not use it?

## Predicting Financial Distress and the Performance of Distressed Stocks

- John Y. Campbell, Jens Hilscher, and Jan Szilagyi
- A version of the paper can be found here. (Here is the Journal of Finance version)

**Abstract**

In this paper we consider the measurement and pricing of distress risk. We present a model of corporate failure in which accounting and market-based measures forecast the likelihood of future financial distress. Our best model is more accurate than leading alternative measures of corporate failure risk. We then use our measure of financial distress to examine the performance of distressed stocks from 1981 to 2008. We find that distressed stocks have highly variable returns and high market betas and that they tend to underperform safe stocks by more at times of high market volatility and risk aversion. However, investors in distressed stocks have not been rewarded for bearing these risks. Instead, distressed stocks have had very low returns, both relative to the market and after adjusting for their high risk. The underperformance of distressed stocks is present in all size and value quintiles. It is lower for stocks with low analyst coverage and institutional holdings, which suggests that information or arbitrage-related frictions may be partly responsible for the underperformance of distressed stocks.

**Data Sources**

The study covers the 1963-2008 time period. Data come from Compustat and CRSP. Data on failures come from Kamakura Risk Information Services.

- The views and opinions expressed herein are those of the author and do not necessarily reflect the views of Alpha Architect, its affiliates or its employees. Our full disclosures are available here. Definitions of common statistics used in our analysis are available here (towards the bottom).

References

1. ↑ The weighted average is represented by some silly a&* equation meant to create confusion for the layman.

shawn (July 27, 2011 at 3:20 am): I loved your explanation that "weighted average is represented by some silly a&* equation meant to create confusion for the layman". Hear hear. May more researchers straight talk like you do. In particular, may more abstracts take the form of your succinct "Make Money." summaries.

G-Man (August 6, 2016 at 11:37 am): Great article, thank you!

To clarify, in relation to this comment – “when the evidence shows something works better, why not use it???”

Is that to say the Z-score still works to an extent? I ask because as a retail investor the Z-score is more readily available from various financial sites.

What are your thoughts on using the more humble debt to equity ratio combined with the current and quick ratio to determine if a company is at risk of bankruptcy? These tend to be easily available/calculated.

Thanks,

G

Wesley Gray, PhD (August 6, 2016 at 9:06 pm): Hey G,

Thanks for the question.

I imagine it works better than doing nothing at all…but there doesn’t seem to be a lot of recent evidence suggesting it is highly effective. Of course, I’m not sure any of these bankruptcy/distress prediction algos are ever that accurate.

Good luck!

Wes

Josh Wallner (August 1, 2017 at 3:23 am): Has this formula been incorporated into any stock scanners?

Moncho Martínez (August 23, 2017 at 8:07 pm): Hello,

First of all, congrats on your great blog; it's full of interesting stuff.

I'd like to double-check the logit formula, because I got very strange results, so either there is a typo somewhere or I am totally lost.

The problem is that the constant in the logit formula, -8.87, is too big, so it's almost impossible to get a positive value for the logit, and as a consequence the P value is always lower than 1%. I tried with companies that went broke and did not get a logit higher than -5.

Is it possible that -8.87 is a typo and the actual value is -0.87?

Kind regards.

Wesley R. Gray, PhD (August 24, 2017 at 11:21 am): Make sure all the variables are scaled correctly and in accordance with the paper. That has screwed us up in the past (i.e., a variable is in millions, but should be in thousands).

Moncho Martínez (August 25, 2017 at 3:13 pm): Thanks for your answer. Unfortunately I don't understand your point; the elements in the formula are ratios, so the units don't matter. I found that the most relevant components are TLMTA and EXRETAVG, but -8.87 is still very high, and seems impossible to reach.

Could you add an example for the sake of clarity, please?

By the way, the link leads to a site that is no longer available; I got the paper here:

https://dash.harvard.edu/bitstream/handle/1/9887619/joim_predicting_financial_11.pdf?sequence=2

Wesley R. Gray, PhD (August 28, 2017 at 7:30 am): Here is the JF version: https://scholar.harvard.edu/files/campbell/files/campbellhilscherszilagyi_jf2008.pdf

Similar constant estimates. When we replicated the results and compared them to the paper a long time ago, we had no issues.

Not sure about your issue. Wish I was more helpful.

Thanks for the updated link; updated in the post.

G-Man (September 11, 2017 at 1:41 pm): Thanks so much for this great post!

1) I note the date is July 23rd, 2011; was it updated more recently? I ask because I recall (maybe incorrectly) reading a post with this title but without e.g. a specific explanation as to how one could implement it as an individual. I’m sure I asked how well a simple Debt/Equity ratio works in the comments too… (I could be having weird dreams though…)

2) Why are there two versions of the one paper? Is that the norm? Are there any differences in the content? (I read the one published in Jan 2010 (yeah, all of it!)), which contains the tables corresponding to the extracts in your post.

3) To apply this model to a non-US stock, which index would it be advisable to use to track market volatility? An all-world market index or the local market index?

Thanks again for taking such esoteric content and making it more accessible to everyday people!

Kind regards,

Jack Vogel, PhD (September 25, 2017 at 7:42 am): 1) I am not aware of a data update. Since this model will yield a lot of "false positives," one can use this model as a tool to eliminate the bottom x% of a universe.

2) Yes, there is generally the working version of the paper and then the published version; this is the norm.

3) The stock volatility in this paper is at the individual stock level, not the index level.

Matt (February 15, 2018 at 5:32 am): Hello. I am trying to work out how I'm going wrong here. I am doing this primitively in Excel. I have used Apple (AAPL) numbers. Some of these might be a bit off at this point as they're out of date/I've been tweaking them a bit. But the result is fundamentally the same: near 100% probability of corporate failure (99.999%).

Here are my inputs into the equation:

| | NIMTA | TLMTA | EXRET | RSIZE | SIGMA | CASHMTA | MB | PRICE |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| AAPL | 0.014 | 0.236 | 0.393 | -1.444 | 0.366 | 0.252 | 5.317 | 1.176 |

The weird thing is, if I make things *worse*, I get a *better* (i.e. lower) probability of failure. For example, if I increase NIMTA from 0.014 to a large 0.2, I get 100% chance of failure (versus 99.999%).

It’s like the true result is the inverse of my result (1 – x, where x is the current output probability).

FYI: My answer using the above inputs is around -11 or -12 (the output logit), which I'm then plugging into 1/(1+EXP(LPFD)) to find the percentage probability.

The values shown in the paper are as follows (for reference). To be honest I can’t seem to spot any big differences from my AAPL example that can’t be “explained away” reasonably. Since they’re all ratios I don’t think there are any “scaling errors”.

| paper | NIMTA | TLMTA | EXRET | RSIZE | SIGMA | CASHMTA | MB | PRICE |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| mean | 0.000 | 0.445 | -0.011 | -10.456 | 0.562 | 0.084 | 2.041 | 2.019 |
| median | 0.006 | 0.427 | -0.009 | -10.570 | 0.471 | 0.045 | 1.557 | 2.474 |

(February 16, 2018 at 8:20 am): This is from a few weeks ago… 6% probability.

| NIMTAAVG | TLMTA | CASHMTA | EXRETAVG | SIGMA | RSIZE | MB | PRICE | PFA |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 0.011494161 | 0.229423947 | 0.168981824 | -0.190851433 | 0.192513981 | 0.626848604 | 5.214428525 | 1.176091259 | 0.06% |

Matt (February 16, 2018 at 1:12 pm): I appreciate the feedback. I recreated the same formula and used your inputs but got the same result. Here is an open spreadsheet with the 99%+ probability of failure once again:

https://docs.google.com/spreadsheets/d/1OYY24NvtjFNaZjDu_MC9PETgOcpOqNVWrj8oiOVWR_A/edit?usp=sharing

There must be something fundamentally wrong with how I’m implementing the formula… It’s even weirder that both “good” companies and “bad” companies get similar results.

(February 16, 2018 at 2:59 pm): Heading out, but if you are doing the calcs right you should get around 7%.

Matt (February 16, 2018 at 5:30 pm): Well, thanks, but I just tried again by literally copying your formula and pasting it into the spreadsheet, and then switching out the abbreviations for the relevant inputs (PRICE, MB, etc.), and I get the same result. Makes no sense to me… I'm no math guy either, so I really have no idea.

Same numbers as yours.

Same formula as yours.

Different result!

Matt (February 17, 2018 at 10:43 am): Your "PFA" says 0.06%, so was it 0.06% or 6%? I am able to get 0.06% in Excel now, while the logit is -7.50 using your data. Thanks…

Shaheen Iqubal (April 2, 2018 at 10:54 am): I was introduced to your blog post last week and I am really enjoying it. I am replicating this paper and found your blog very helpful. As a sanity test, I was looking at the factor values for AAPL you posted and have the following two questions:

1. My initial intuition was that the RSIZE value will always be negative, because the ratio of stock market value to S&P market value will always be between 0 and 1. How did you get 0.628 for RSIZE?

2. I noticed that you are using log instead of natural log (ln) for the price factor. Is there a reason?

Thanks for your help!

Shaheen Iqubal (April 2, 2018 at 4:58 pm): RSIZE = log(market value of firm / total market value of S&P 500). The ratio should always be less than 1, and therefore the log of the ratio should be negative. How do you get a positive value of 0.626849 for the RSIZE factor?

(April 2, 2018 at 5:42 pm): You are correct. Calc issue.

Shaheen Iqubal (April 3, 2018 at 12:01 am): Thanks for your prompt response! I also noticed from the values you posted for Apple that you are using log instead of ln in your calculation. I noticed that on the log(price). Can you please confirm whether that is correct?

Like Matt, I pasted your values into an Excel spreadsheet and could not replicate the calculation. I did not get a 6% probability. I would really appreciate it if you could show us what we are missing. I got the same number as Matt, where the value of the logit was -7.05.

Appreciate your help and really enjoying your blog!

Matt (February 16, 2018 at 6:13 pm): See my image here, which is a summary of the problem: https://ibb.co/mHhiqn

Sorry to be a bother, but the image shows, in a straightforward fashion, your formula and your numbers. The logit value, before converting to a probability, is -7.059439442 (as shown in the screenshot).