

Does a Reversion To The Mean Follow Big Up Years?

FYI: With just a few hours left in the trading day, the S&P 500 is on track to deliver a hefty gain of over 20% to investors for 2017 and the ninth straight year of gains on a total return basis. In the S&P 500’s history, there has only been one other period where the S&P 500 was in the black for nine straight years, and that was from 1991 to 1999. A big difference between that streak and now, though, is the magnitude of the gains. During the 1990s streak, the S&P 500’s total return was 450% compared to a relatively meager gain of 261% in the current period. If the S&P 500 does make further gains next year, it will be the first ten-year winning streak for the index ever. With such a big gain this year, though, can investors really expect to see gains in the year ahead?
Regards,
Ted
https://www.bespokepremium.com/think-big-blog/does-a-reversion-to-the-mean-follow-big-years/

Comments

  • MJG
    edited December 2017
    Hi Guys,

    I suppose a very generic answer to this question has to be a qualified "yes". A reversion to the mean has to happen after a particular outlier (an extremely out-of-the-ordinary result). But the real-world answer is a definite and firm "No".

    The proposed methods to forecast market returns are so numerous as to defy counting. They all work sometimes, but they also fail over other significant timeframes. If it were otherwise, we would all be wealthy. Here is a Link that demonstrates the pure randomness of near-term future returns following recently recorded returns:

    http://static3.businessinsider.com/image/55bf73092acae716008bc67a-1200/past-performance-is-no-predictor-of-future-returns.jpg

    If you see a correlation in these data "you're a better man than I am, Gunga Din". A correlation simply does not exist. No surprise here, no Happy New Year insight. Regardless,

    Happy New Year to all MFO participants.
  • edited December 2017
    I don’t think the person who wrote the article understands “reversion to the mean” very well.

    (I don’t think I’d have linked this one.)
  • Mean reversion / regression, as Wolfram explains, means 'extreme event is likely to be followed by less-extreme one.'
  • edited December 2017
    Remarks by John C. Bogle http://vanguard.com/bogle_site/lib/sp19980129.html

    The title of my remarks is "Reversion to the Mean." This theme may at first blush seem a bit dry and uninspiring. But I assure you that it is anything but that. For I suggest to you that RTM is a rule of life in the world of investing—in the relative returns of equity mutual funds, in the relative returns of a whole range of stock market sectors, and, over the long-term, in the absolute returns earned by common stocks as a group. RTM represents the operation of a kind of "law of gravity" in the stock market, through which returns mysteriously seem to be drawn to norms of one kind or another over time ...

    In periods as short as one year, many mutual funds—especially small, aggressive ones—can and do defy these odds. And in some decade-long periods, perhaps one out of five funds succeeds in doing so by a material amount. But in the very long run, there is a profound tendency for the returns of high-performing funds to come down to earth, and, just as inevitably, for the returns of low-performing funds to come "up to earth," as it were. Indeed, ... the distance traveled in the course of these descents and ascents is directly proportional to the earlier distance above or below the market's return. In short: reversion to the market mean is the dominant factor in long-term mutual fund returns.
  • "But in the very long run"...

    So there is hope for Hussman!! :)

    (Or maybe he's in the exclusive very, very long run category.)
  • MJG's graph (from here, among other places) supports the thesis that mean reversion (in the literal mathematical sense) does hold in the investing world.

    The first sentence is almost right: "A reversion to the mean has to happen after a particular outlier." Mean reversion says simply that for a sequence of random numbers, the further away from the mean the current value is, the more likely the next value is to be closer to the mean. More likely, but it does not "have to happen."

    This is one of those self evident statements when you think about it. Suppose we've got something with a normal probability distribution (bell curve) centered at zero (mean). The probability of the next value being above zero is 50%, and below zero is 50%.

    The probability of the next value being below 1 is higher than 50% (since 50% will fall below zero, let alone 1). The probability of the next value being below 10 is even higher; the probability of the next value being below 20 is higher still. The further away from the mean, the higher the probability that the next value will be closer. Duh!

    (Technically, I'm illustrating something slightly different from mean reversion, but it's close enough to convey the concept.)
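
    To make the bell-curve point concrete, here is a minimal sketch (mine, not from the original discussion; the mean, sigma, and sample values are arbitrary) that estimates how often an independent normal draw lands closer to the mean than the current value does:

    import random

    random.seed(0)
    mean, sigma, trials = 0.0, 1.0, 100_000

    for current in (0.5, 1.0, 2.0, 3.0):   # how far the current value sits from the mean
        closer = sum(
            abs(random.gauss(mean, sigma) - mean) < abs(current - mean)
            for _ in range(trials)
        )
        print(f"current = {current}: next value closer to mean {closer / trials:.1%} of the time")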

    In order for this "rule" to be valid, we have to be observing something random. That's what gives us the bell curve. MJG's graph purports to show that next year's returns are indeed random (zero correlation with the previous five years' performance). So rather than showing lack of mean reversion, the graph suggests the opposite - that the yearly returns are random - a necessary condition for mean reversion.

    The reason I said "purports" is that the graph is not showing the correlation between the current year's return and the next year's. (Rather it shows the lack of correlation between the past five years' cumulative performance and the next year's.) The original article looks at next year's performance in comparison with the current year's performance (not the prior five years' total).

    Looking at performance following 20%+ years, the data evinces mean reversion. The subsequent years averaged a return about the same as the market's long-term average of 11.4% between 1928 and 2015. That suggests the performance in the years following these big years was random. Further, in the years immediately following 20% gains, the market went up roughly the same percentage of the time (69%) as it did over all years (72%). So it looks like the randomness requirement for mean reversion is not violated.

    More important, for most of these outlier years (20%+ returns), the next year's returns were closer to the mean. The exceptions were:
    1936 (1937 returned less, but further away from the mean of 12%),
    1942 (higher in 1943),
    1961 (1962 returned less, but further away from the mean),
    1976 (1977 less but further from mean),
    1982 (1983 slightly higher),
    1996 (1997 higher),
    1999 (2000 less but further from mean),

    That means that 25 of 32 years following 20%+ returns were closer to the mean than the years they followed. That is, there was mean reversion. It happens most of the time, but not always. And 20% isn't even that much of an outlier - less than one standard deviation (19.7%) from the mean (11.4%).
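
    For anyone who wants to re-run that tally, here is a rough sketch of the counting exercise (the data themselves are not included here; annual_returns is assumed to be a dict of year -> S&P 500 total return in percent):

    MEAN = 11.4  # the long-term average used above, 1928-2015

    def closer_to_mean_after_big_years(annual_returns, threshold=20.0, mean=MEAN):
        """Count how often the year after a 20%+ year lands closer to the mean."""
        closer = total = 0
        for year, ret in sorted(annual_returns.items()):
            nxt = annual_returns.get(year + 1)
            if ret >= threshold and nxt is not None:
                total += 1
                closer += abs(nxt - mean) < abs(ret - mean)
        return closer, total

    # closer, total = closer_to_mean_after_big_years(annual_returns)
    # By the count above, this comes out to 25 of 32 such years.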
  • edited December 2017
    I guess we all agree on the meaning of “reversion”. Where I think there might be different assumptions is in the application / use of “mean” and also whether a reversion necessarily needs to arrive at some actual, numerically significant mean, or whether the reversion simply needs to move in the direction of the mean.

    The author’s analysis is accurate if he’s only deliberating as to whether there will be (next year) movement in the direction of some type of more normal valuation. To me, that’s the premise @msf is working from. And to that extent, it’s a logical deliberation. But I get the impression in reading Bogle that he views “mean” as a more concrete, permanent valuation metric, and that this mean is normally arrived at gradually over longer periods (years and decades).

    I’d be more inclined to agree with a lot of the analysis here if the term “reversion towards the mean” were being employed.
  • haha, so seemingly true

    See the bean machine and https://en.wikipedia.org/wiki/Regression_toward_the_mean

    Tougher than the Wolfram explanation :)

    and this:

    The notion of random motion in a potential well is elemental in the physical sciences and beyond. Quantitatively, this notion is described by reverting diffusions—asymptotically stationary diffusion processes which are simultaneously (i) driven toward a reversion level by a deterministic force, and (ii) perturbed off the reversion level by a random white noise. The archetypal example of reverting diffusions is the Ornstein–Uhlenbeck process, which is mean-reverting. In this paper we analyze reverting diffusions and establish that: (i) if the magnitude of the perturbing noise is constant, then the diffusion's stationary density is unimodal and the diffusion is mode-reverting; (ii) if the magnitude of the perturbing noise is nonconstant, then, in general, neither is the diffusion's stationary density unimodal, nor is the diffusion mode-reverting. In the latter case we further establish a result asserting when unimodality and mode-reversion do hold. In particular, we demonstrate that the notion of mean reversion, which is fundamental in economics and finance, is a misconception—as mean reversion is an exception rather than the norm.

    underlining mine
  • msf
    edited December 2017
    Yes, but what does it mean that the magnitude of the perturbing noise is non-constant?

    Looking at section 3 of the paper (where the constant magnitude σ is replaced by a diffusion function of the current value, D(X(t))), it seems that the magnitude of the "randomness" of next year's return depends on this year's return, i.e. the magnitude of the noise depends on X(t), where t is the current year.

    Still not sure what that means or why it would not be a constant σ, i.e. why would the size of the random portion of next year's return depend on the current return? At least it seems to preserve a sense of randomness.

    FWIW, here's the full paper:
    https://www.researchgate.net/publication/254496792_The_misconception_of_mean-reversion

    @hank - I think you're referring to the Law of Large Numbers. One expects future returns to come out average. It doesn't matter if recent returns were above average. That won't make future returns come out below average. They're independent, so the future returns are expected to come out average.

    When you add decades of average returns to recent above average returns, the result is still above average. But it gets very close to average because the future decades of average returns numerically swamp a few years of recent outperformance.

    http://whatis.techtarget.com/definition/law-of-large-numbers
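
    A quick bit of arithmetic (illustrative numbers, mine) shows the swamping effect:

    recent = [0.25] * 3      # three recent 25% years
    future = [0.10] * 30     # thirty future years assumed to come out average (10%)
    combined = recent + future
    print(sum(combined) / len(combined))   # ~0.114: still above average, but pulled close to 10%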

    Edit: From your Wiki cite, a short paragraph stating what I've been trying to explain about mean regression (next random result is closer to mean, nothing more), and law of large numbers (lots of average results swamp a few good returns):
    https://en.wikipedia.org/wiki/Regression_toward_the_mean#Other_statistical_phenomena
  • @msf

    >> it seems that the magnitude of the "randomness" of next year's return depends on this year's return,

    is this not typically the case when speaking of a recent regression?

    I guess it is a point worth exploring, the point being how the constant is created. (My understanding, not imputing it to you.) What the base sample is. If we say a bball player shooting really well the last week must surely mean-revert, do we mean to his yearly (this year) average percentage, something longer, or what?
  • Here's the "simple" equation, with a constant magnitude σ for the white noise (W) and a constant tendency to regress (r):

    X'(t) = −r(X(t) − l) + σ W'(t)

    (I've substituted prime notation for derivative, because the conventional dot notation for time derivative doesn't seem to transcribe here.)

    The tendency for the hot hand to revert to the mean varies by how hot the hand is. That's represented by the first part of the right hand side expression. "l" is the mean to which it is regressing.

    As you can see, the further away from the mean (the better the shooter is doing), the greater is (X(t) - l), and thus the stronger the tendency to revert to l.

    This is independent of the noise factor - the second expression on the right hand side. It's the constant coefficient there (σ) that is being swapped out for a function.

    The equation is arguably continuous, but one tends to think in terms of time series. That is, each "tick" of time t represents whatever the base unit is - each game's performance for example, or each month's returns, or each year's. A unit-less equation.
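
    If it helps, here is a minimal simulation sketch of that equation (an Ornstein-Uhlenbeck-type process) using an Euler-Maruyama step; r, l, sigma, the time step, and the starting value are made-up illustrative numbers, not anything from the paper:

    import math
    import random

    random.seed(1)
    r, l, sigma, dt, steps = 0.5, 0.10, 0.05, 1.0, 20

    x = 0.40                 # start well above the reversion level l
    path = [x]
    for _ in range(steps):
        drift = -r * (x - l) * dt                           # pull toward l, stronger the further away x is
        noise = sigma * math.sqrt(dt) * random.gauss(0, 1)  # the white-noise term, constant magnitude sigma
        x = x + drift + noise
        path.append(x)

    print(" ".join(f"{v:.2f}" for v in path))  # the values drift back toward roughly 0.10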

  • k

    >> varies by how hot the hand is.

    Right. So the wolfram 'extreme event is likely to be followed by less-extreme one' is both accurate and not very specifically useful. While 'unitlessness' makes for learning problems (me).

    You read 538, I bet.

    >> varies by how hot the hand is.

    Is this a way of advising (sometimes) to bail from extreme runups? I luckily did this, sort of, w/ CGMFX. Should have seen it coming w FLVCX back when? I suppose so.
    But I bet this is not like the novice saying 'must sell, this can't continue.'
  • edited December 2017

    >> varies by how hot the hand is. Is this a way of advising (sometimes) to bail from extreme runups?

    Point well taken. In the 90s - a few years from retirement - I found myself taking a crash course on investing. Bogle either drove me to drink or put me to sleep. Andrew Tobias, while less esteemed in the investment community, was much easier to digest (The Only Investment Guide You’ll Ever Need).

    One thing that has always influenced me is Tobias’s suggestion that small investors pick up all their marbles and walk away from the stock market if they ever find themselves in the midst of a gigantic bubble. Of course, the problem with this is in actually knowing whether a bubble exists. And he wasn’t clear on how to determine that. An additional problem now is that interest rates are much lower and the bond market likely more precarious than at the time Tobias wrote. So there isn’t a good alternative place to hide.

    I did listen to Tobias, and also Bill Fleckenstein, in the late 90s and moved mostly to bonds prior to the 2000 equity rout. Fair to say the two of them saved me some money back then. Today I’m light on equity exposure - but not completely out as Tobias suggests. Subscribed recently to Fleck’s Daily Rap. Can’t say whether it’s a good investment or not, but it does offer a starkly different perspective from most of the Bull crap coming from mainstream financial sources.
  • Hi Guys,

    Thanks one and all for a very stimulating and detailed discussion exchange.

    Forecasting equity returns is an extremely challenging assignment. Separating "The Signal and the Noise" is not easy work. The Signal and the Noise is an excellent book authored by Nate Silver. Here is a Link that accesses the full text:

    http://www.stavochka.com/files/Nate_Silver_The_Signal_and_the_Noise.pdf

    Developing some method to reasonably predict upcoming market returns from numerous historical data correlation candidates has escaped just about everyone, including industry giants like Vanguard.

    Here is a Link to a rather exhaustive and comprehensive attempt reported by Vanguard that explored a wide array of potential correlating parameters:

    https://personal.vanguard.com/pdf/s338.pdf

    Vanguard does honest work. All correlation attempts failed, including the use of last year's returns to project next year's equity rewards. See Figure 2 in the referenced document. I suppose one response to these forecasting failures is to accept prediction defeat and simply stay invested for the long haul.

    Best Wishes for a happy and prosperous New Year.
  • Let me suggest one other way of thinking about this ...

    Suppose the market always returns -10%, 0%, 10%, 20%, or 30%, each 1/5 of the time. So on average, the market returns 10%.

    If the market is truly random like the toss of a die, then each year the market is just as likely to return -10% as 10% or 30%, regardless of this year's performance. If this year returned 30%, then there's a 3/5 chance (0%, 10%, 20%) that next year's returns will be closer to the mean (10%). That's regression toward the mean from an extremely good (or bad) year.
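
    A quick simulation of that five-outcome "die" (mine, just to check the arithmetic):

    import random

    random.seed(2)
    outcomes = [-10, 0, 10, 20, 30]          # percent returns, each equally likely
    mean = sum(outcomes) / len(outcomes)     # 10

    trials = 100_000
    closer = sum(abs(random.choice(outcomes) - mean) < abs(30 - mean) for _ in range(trials))
    print(closer / trials)                   # ~0.6, the 3/5 chance described above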

    Still, no reason to sell just because you had a good year.

    It's like saying that a hot hand will cool off. Of course it will, because it's got no place to go but down. But you don't know when that will happen, and when it does, it's just as likely to be a very good (but not great) year as it is likely to be a bad one.

    That's the simple math of random selection. In reality, the market has causes and effects, even if in the aggregate the numbers come out looking random. One may be thinking of those other factors when investing. The economy tends to move in business cycles (though of seemingly random duration and magnitude). Other investors may be acting irrationally (e.g. bubbles, gambler's fallacy, etc.).

    Investors may also be acting rationally in choosing suboptimal strategies. Taking money off the table if you are satisfied with your winnings is not optimal (from a purely monetary perspective), because over time market returns are positive. But you may be happier with what you have and with avoiding ulcers than with making more money in the long term.

    This brings us full circle to the hot hand. It's more likely for one to be content with one's winnings after a big win. So the inclination to reduce exposure after a good year is understandable and reasonable. Just not on the basis of law of averages, mean regression, etc.
  • Right, the equities market is not gambler's ruin

    >> it's just as likely to be a very good (but not great) year as it is likely to be a bad one.

    Are there nonrandom situations where one should not conclude this? Not a well-phrased thought, I believe.

    >> It's more likely for one to be content with one's winnings after a big win.

    Many in behavioral economics would argue quite the opposite (greed overcoming other feelings).

    >> So the inclination to reduce exposure after a good year is understandable and reasonable. Just not on the basis of law of averages, mean regression, etc.

    ? And so the moral is, since the market generally rises over time, that mean regression is not a helpful basis for any investment decision-making? Is that a fair statement? Except for sleep-at-night.
  • A necessary condition for gambler's ruin to be valid is that there be a zero sum game. One doesn't even have to know what gambler's ruin is to know that it doesn't apply to the equities market (long term upward trend - not zero sum).

    Similarly, randomness is a necessary condition for mean reversion. Even if one doesn't know what mean reversion is.

    The moral is that if one assumes that the market is random, then by that very hypothesis last year's returns mean nothing for next year, the last nine year's bull market means nothing. And if one doesn't assume the market is random, then by definition mean regression is invalid (doesn't apply).

    So one either buys into the randomness assumption and ignores recent data, or doesn't buy into it, in which case "mean regression" can't be used to justify decisions. Any attempt to the contrary is effectively an appeal to "fake math".

    Find the economic/financial rationale, as you asked about, for your short-term decisions. Otherwise, yes, you're making investment decisions to "sleep-at-night".

    P.S. Gambler's ruin is not gambler's fallacy. The former is a long term concept, the latter a short term one. The latter says that just because you've tossed five heads in a row, don't rely on that to bet on tails (as being "overdue"). But you might just check that coin to see if it is two headed before placing your next bet. Just as you might want to look for reasons why the market did well recently.
    Interesting that, if (let us say) the market is adaptive and not random in the long run (it rises), there is always so much writing about mean regression.

    OT,
    https://www.nytimes.com/2017/12/30/opinion/the-gamblers-ruin-of-small-cities-wonkish.html
  • Random numbers don't have to average zero. You can have a random distribution around any value you want, say 11%/year returns. So a rising market can still be entirely random.

    Why there's so much writing about mean regression ... they'd lose readers if they kept calling it gambler's fallacy:-)

    I'll keep trying to convey a sense of what mean regression is. Here's a pretty good example - a short video from the old game show Card Sharks. The object was to guess whether the next card would be higher or lower than the one showing.

    Obviously the higher the current card, the greater the odds of the next one being lower. Not because the cards must average out (though that happens to be true with a fixed size deck), but because a card selected entirely at random must come out lower than an ace (or at least not higher). Likewise, you bet the farm with a deuce showing, because you can't get anything lower.
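
    A back-of-the-envelope sketch of those odds (my simplification: ranks 2 through 14 with ace high, ignoring suits and the finite deck, so the next card is an independent uniform draw):

    ranks = range(2, 15)   # 2 .. 14 (ace high)

    for showing in (2, 8, 14):
        lower = sum(1 for r in ranks if r < showing) / len(ranks)
        higher = sum(1 for r in ranks if r > showing) / len(ranks)
        print(f"showing {showing:>2}: P(lower) = {lower:.2f}, P(higher) = {higher:.2f}")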


  • >> Random numbers don't have to average zero. You can have a random distribution around any value you want, say 11%/year returns. So a rising market can still be entirely random.

    Sure; there's such a thing as a good shooter. But 'random distribution around any value you want' is not what most people (even statisticians) mean when they say 'random', don't you think?