Here's a statement of the obvious: The opinions expressed here are those of the participants, not those of the Mutual Fund Observer. We cannot vouch for the accuracy or appropriateness of any of it, though we do encourage civility and good humor.
The big irony is that this is from the VP of research at M*, one of the biggest purveyors of bad math and statistics. He should go take a good look at his own investor returns math and the conclusions drawn from it.
This article will be used to bash the peddlers of the latest snake-oil investing strategies, as they rightly should be. But the use of bad math and science is endemic to the entire domain of finance, including indexology, whose proponents all push their own agendas.
The problems are manifold, and only one aspect is covered here. The use of science/math involves the following dimensions, each of which can independently be right or wrong.
Model - To analyze a subject, one needs to model reality. The problem is people assuming that a bad model is better than no model. Examples include the risk analysis of CDOs based on efficient-market models that blew up; the use of simplistic distribution assumptions for parameters in various simulations, including shortfall simulations; and extreme sensitivity to the assumptions behind a model, which leaves very little confidence in its output. Garbage models, garbage conclusions.
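To make the distribution-assumption point concrete, here is a toy sketch (my own, with made-up parameters, not any firm's actual model): a retirement shortfall simulation run twice with the same mean and volatility, once assuming normal returns and once assuming fat-tailed (Student-t) returns. The headline "probability of running out of money" can shift noticeably even though the quoted inputs are identical.

```python
import random, math

def shortfall_prob(draw_return, years=30, trials=20000, seed=1):
    """Fraction of trials in which a portfolio spending 4%/yr is exhausted."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        balance = 1.0
        for _ in range(years):
            balance = balance * (1 + draw_return(rng)) - 0.04
            if balance <= 0:
                failures += 1
                break
    return failures / trials

# Assumption 1: returns are normal with mean 7%, stdev 15% (hypothetical).
normal = lambda rng: rng.gauss(0.07, 0.15)

# Assumption 2: same mean and stdev, but fat-tailed (Student-t with 3
# degrees of freedom, rescaled so the stdev is still 15%).
def fat_tailed(rng, dof=3):
    # Student-t draw via a normal over the root of a scaled chi-square
    z = rng.gauss(0, 1)
    chi2 = sum(rng.gauss(0, 1) ** 2 for _ in range(dof))
    t = z / math.sqrt(chi2 / dof)
    return 0.07 + t * 0.15 / math.sqrt(dof / (dof - 2))  # rescale stdev to 0.15

p_normal = shortfall_prob(normal)
p_fat = shortfall_prob(fat_tailed)
print(f"shortfall prob, normal returns:     {p_normal:.1%}")
print(f"shortfall prob, fat-tailed returns: {p_fat:.1%}")
```

Nothing in the quoted inputs (mean, stdev, withdrawal rate) tells you which distribution was assumed, which is exactly why confidence in such outputs should be low.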
Methodology - this includes the back-testing problems pointed out in the article. It is the dimension most often used to discount bad math/science, as it should be. Unfortunately, the purveyors of bad science who claim to get the methodology right fail in the other dimensions. Garbage measurements, garbage results.
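The back-testing trap is easy to demonstrate. A toy illustration (my own sketch, not taken from the article): generate a return series with no edge at all, data-mine a couple of thousand random long/flat rules, and the best in-sample performer looks skillful while the same rule does nothing out of sample.

```python
import random

random.seed(7)
days = 500
market = [random.gauss(0.0, 0.01) for _ in range(2 * days)]  # zero-edge returns
in_sample, out_sample = market[:days], market[days:]

def mean_return(signal, returns):
    """Average daily return of a long/flat strategy defined by `signal`."""
    picks = [r for s, r in zip(signal, returns) if s]
    return sum(picks) / len(picks) if picks else 0.0

best_signal, best_perf = None, float("-inf")
for _ in range(2000):  # data-mine 2000 random long/flat rules
    signal = [random.random() < 0.5 for _ in range(2 * days)]
    perf = mean_return(signal[:days], in_sample)
    if perf > best_perf:
        best_signal, best_perf = signal, perf

oos = mean_return(best_signal[days:], out_sample)
print(f"best in-sample mean daily return:  {best_perf:+.4%}")
print(f"same rule out of sample:           {oos:+.4%}")
```

The "winner" was selected precisely because it fit the in-sample noise; its out-of-sample performance reverts toward zero.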
Interpretation of results - this is the most problematic area, where results or calculations are used to draw conclusions out of context or to overgeneralize. Interpreting M*'s or Dalbar's money-flow-based return calculations as the returns of the average investor is a perfect example of this bad math. Using an unrealistic performance-persistence criterion to draw conclusions about the performance of active managers is another example, one that has been debunked here. Garbage interpretations, garbage results.
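To see what a money-flow-based return actually measures, consider a minimal sketch with hypothetical numbers (not M*'s or Dalbar's actual methodology): a fund gains 50% in year one, loses 20% in year two, and money pours in after the good year. The fund's time-weighted return is positive, while the dollar-weighted (money-flow) return is sharply negative, even though every dollar in the fund earned the fund's own returns.

```python
def irr(cash_flows, lo=-0.99, hi=10.0):
    """Money-weighted annual return: the rate at which the NPV of the flows is zero."""
    def npv(rate):
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))
    for _ in range(100):  # bisection; npv(lo) > 0 > npv(hi) for these flows
        mid = (lo + hi) / 2
        if npv(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

r1, r2 = 0.50, -0.20  # hypothetical fund returns, years 1 and 2

# Time-weighted (the fund's reported) annualized return:
twr = ((1 + r1) * (1 + r2)) ** 0.5 - 1

# Investors put in $100 at the start and $900 more after the good year:
end_value = (100 * (1 + r1) + 900) * (1 + r2)
mwr = irr([-100, -900, end_value])

print(f"fund time-weighted return:   {twr:+.1%}")
print(f"money-weighted (flow-based): {mwr:+.1%}")
```

The gap reflects the timing of aggregate flows into the fund; reading it as the return of the "average investor" is the interpretive leap being criticized.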
Intellectual honesty - a much less understood part of science: avoiding logical fallacies and subjecting one's own studies to the same level of critical analysis demanded of others. This is the biggest problem in finance, especially among evangelists promoting an agenda, much like creationists or global-warming skeptics who claim to use science to make their point.
@davidrmoran Forgot to mention how much I enjoyed your posting last week of the party pic from the D&C annual mtg. That gal looked like she underestimated just how taxing her first wk as head of the D&C trading desk would be (or perhaps she was just decompressing and I inferred too much from a single camera moment).
Dismal science for a reason.