Terrible Twos? The two-year-old funds which are most out-of-step with their peers

We thought we’d start catching up with the 130 U.S. equity funds which have passed their second anniversary but have not yet reached their third, which is when conventional trackers such as Morningstar and Lipper pick them up. As Charles has repeatedly demonstrated, the screener at MFO Premium allows you to answer odd and interesting questions. I’ll try to look at several questions over the next week, starting with “which of these new funds might be badly miscategorized?”

That’s an important question, since investors tend to buy the (Morning)stars. In general, that’s an okay decision: five-star funds rarely become stinkers, and one-star funds rarely become gems. Except when a fund has been dropped into an inappropriate peer group, so that Morningstar is looking at a banana and trying to judge it as an apple. Our two favorite examples are RiverPark Short Term High Yield (RPHYX) and Zeo Strategic Income (ZEOIX). Both are outstanding at what they do: generate low single-digit returns (say, 2-4%) with negligible volatility. And both get one star from Morningstar because they’re being benchmarked against funds with very different characteristics.

How did we check for miscategorized funds? Simple: we got our screener to identify all U.S. equity funds that had been around for under three years, downloaded the results to Excel, eliminated funds with under two years of history, then sorted them by their correlation to their peers. We found that over half of the funds were indexes or closet indexes (correlations over 0.95, with some “active” funds at 0.98). Just six funds, three active and three index, had correlations under 0.75.
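
For anyone who would rather reproduce the screen in code than in Excel, here is a minimal sketch in Python/pandas. The file name and column names ('fund', 'age_years', 'corr_to_peers') are hypothetical stand-ins for whatever the MultiSearch export actually uses; the steps are just the ones described above.

    # Minimal sketch of the screen described above. Assumes the MultiSearch
    # results were exported to a CSV with hypothetical columns 'fund',
    # 'age_years', and 'corr_to_peers' (correlation to peer group, 0-1 scale).
    import pandas as pd

    funds = pd.read_csv("multisearch_us_equity.csv")  # hypothetical export file

    # Keep funds with at least two but fewer than three years of history.
    twos = funds[(funds["age_years"] >= 2) & (funds["age_years"] < 3)]

    # Sort by correlation to peers, lowest (most out-of-step) first.
    twos = twos.sort_values("corr_to_peers")

    # Flag closet indexes (correlation over 0.95) and possible
    # miscategorizations (correlation under 0.75).
    closet_indexes = twos[twos["corr_to_peers"] > 0.95]
    out_of_step = twos[twos["corr_to_peers"] < 0.75]

    print(f"{len(closet_indexes)} of {len(twos)} funds track their peers like indexes")
    print(out_of_step[["fund", "corr_to_peers"]])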

Cambria Value and Momentum ETF (VAMO, as in Vamoose?) has the lowest correlation (0.43) with peers of any of the two-year-olds; Lipper thinks it’s a large-cap value fund. Why should you care? Because a low correlation with the peer group raises the prospect that a fund has been miscategorized, and it makes it very likely that any rating it receives – positive or negative – will be unreliable. One illustration of that possibility: five of the six low-correlation funds trail their peer groups, with VAMO lagging by 14% annually. Does that mean they’re bad funds? No, it means that their strengths and weaknesses can’t be predicted from their peer groups.
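
For readers who want a concrete sense of what that 0.43 measures, here is a rough sketch of one way a correlation-to-peers figure might be computed from monthly returns. The equal-weighted peer average, the file, and the column names are my assumptions for illustration, not necessarily MFO Premium’s exact methodology.

    # Rough sketch of a correlation-to-peers calculation from monthly returns.
    # The equal-weighted peer average and the column names are assumptions,
    # not necessarily MFO Premium's exact methodology.
    import pandas as pd

    returns = pd.read_csv("monthly_returns.csv", index_col="month")  # hypothetical file

    fund = returns["VAMO"]                  # the fund's monthly returns
    peers = returns.drop(columns=["VAMO"])  # the rest of its Lipper category
    peer_avg = peers.mean(axis=1)           # equal-weighted peer-group average

    # Pearson correlation of the fund's returns with the peer average.
    print(f"Correlation to peers: {fund.corr(peer_avg):.2f}")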

The other two-year-olds with peer group correlations under 0.75 so far:

HTDIX Hanlon Tactical Dividend and Momentum Fund (Lipper: Equity Income)
PTMC Pacer Trendpilot 450 ETF (Mid-Cap Core)
BMVIX* Baird Small/Mid Cap Value Fund (Small-Cap Core)
PTLC Pacer Trendpilot 750 ETF (Large-Cap Core)
FSUVX Fidelity SAI US Minimum Volatility Index Fund (Multi-Cap Core)

*Note: BMVIX is actually just shy of two years through October, but I want to touch on it in the December commentary.

Next up: two-year-olds leading their packs.

David

Comments

  • There's a subtle implication made when stating that a fund is miscategorized. It suggests (however slightly) that the fund was put into the wrong category, as opposed to there not being a category for the fund. This is reinforced by the use of "peer group", implying that every fund does in fact have a group of "peers".

    For example, ISTM that BCHYX is miscategorized. It is grouped with California long term munis. While BCHYX certainly holds Calif. long term munis, I respectfully suggest that its salient feature is that it holds long term muni junk. That would make its peer group (long term) HY munis.

    Contrast that with the situation that RPHYX is in. Given the existing category systems (Lipper, M*), into which category does it best fit? If the answer is "none of the above", then would you create a singleton category for it? What purpose would that serve? Note that M* already has a category for nontraditional bonds. That's a clear marker for funds that don't fit anywhere else and are not comparable among themselves (though M* still insists on giving these funds star ratings).

    If BMVIX is miscategorized, then it would seem so is its sibling BSVIX, with the same manager and about 2/3 overlap between their portfolios.

    Any classification system has limitations. And no two funds are strictly comparable (well, except for funds like BMVIX and BSVIX:-) ). Ultimately, one should look at each fund based on its distinctive attributes.

    Identifying funds that are "miscategorized" does aid in informing people that some funds are "more unique" (ack) than others. In short, thanks for the informative list.
  • I think the "Correlation with Peers" metric, which David advocated for on the site, becomes invaluable when assessing a fund, much like Max Drawdown, Rolling Averages, and beta ... once you get used to having it, you always want it. It's also a proxy for Active Share.
  • I was looking at the MultiSearch results for ZEOIX. After consulting their website (ZEO now has a real website), ZEOIX looks like a superior mattress to me (that's a compliment). What puzzled me on the MultiSearch tool was the characterization of David's take on this fund as "mixed". It was actually Chip's take, but that's fine. I just can't find anything mixed in the 2014 description of this fund; it looks entirely positive. Have I missed a subtly stated reservation, other than the note that this fund is not for everyone? Neither is RPHYX, and it gets a "positive" take.
  • Just added metrics to screen for "Out of Step Twos" with the MFO Premium MultiSearch tool...

    [screenshot: MultiSearch screening criteria]
  • Ben, just for the record, we addressed your ZEOIX comment above here:

    https://www.mutualfundobserver.com/discuss/discussion/comment/95497/#Comment_95497