Problems With Morningstar’s New Rating System

The new ratings aren’t much better at predicting performance than the old system.


Many investors rely on Morningstar’s views when selecting actively managed mutual funds. Morningstar’s web page correctly notes that it is a “leading provider” of independent investment research. It has 195 financial analysts and “proprietary tools” to aid its research efforts.

For many years, Morningstar ranked mutual funds using its “star” system. Five-star rated funds were considered the cream of the crop. Funds rated one star were the designated losers. Brokers and mutual funds touted high star rankings as a reason to purchase those funds.

The star system didn’t work well for those relying on it for predictive value. In a well-researched study by Vanguard, the authors found that, on average, only 39 percent of funds with a five-star rating outperformed their style benchmarks for the three-year period following the rating, while 46 percent of funds with a one-star rating outperformed their benchmarks for the same period. The authors concluded that “the top-rated funds are shown to have actually generated the lowest excess returns across time, while the lowest-rated funds generated the highest excess returns.”

Undaunted, Morningstar unveiled a new rating system in late 2011. This system sought to avoid “recency bias” by keeping a long-term perspective and by not being influenced by “what investors are buying and selling.” The new system promised to consider risk profile changes and changes in fund management. Morningstar summarized this system in a way that certainly sounded impressive: It would consider a fund’s strengths and weaknesses across five pillars: people, process, parent, performance, and price.

Fast forward one year. How did the new ratings system do in 2012? Not well.

A blog post by Wall Street Rant looked at the performance of funds rated gold (the highest rating), silver, bronze, neutral, and negative (the lowest rating) under the new Morningstar system. The results showed that the new system wasn’t much better than the old star system. Only 35.3 percent of funds rated gold finished in the top quartile. That’s not much better than the 31.3 percent of funds rated negative that finished in the top quartile. The highest percentage of funds in the top quartile belonged to those rated neutral, with 45.1 percent of those funds making the cut.
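The gap between those percentages is easier to judge against a chance baseline: with four quartiles, roughly 25 percent of funds in any rating bucket would land in the top quartile if the ratings carried no predictive information. A minimal Python sketch of that comparison, using the figures quoted above (the variable names are my own, not Morningstar’s):

```python
# Top-quartile rates in 2012 by Morningstar analyst rating, as reported
# in the Wall Street Rant blog post cited above.
top_quartile_pct = {
    "gold": 35.3,      # highest rating
    "neutral": 45.1,   # middle rating
    "negative": 31.3,  # lowest rating
}

# With four quartiles, a rating with no predictive power would put
# about 25% of its funds in the top quartile by chance alone.
chance_rate = 25.0

for rating, pct in top_quartile_pct.items():
    edge = pct - chance_rate
    print(f"{rating}: {pct:.1f}% in top quartile ({edge:+.1f} pts vs. chance)")
```

The point of the comparison: gold-rated funds beat the chance baseline by only about 10 points, negative-rated funds by about 6, and neutral-rated funds by about 20 — the opposite ordering of what a predictive rating system should produce.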

As the blog post correctly notes, data for one year should not be given too much weight. Nevertheless, presumably some brokers touted the gold-rated mutual funds to their clients, so this early analysis should be considered in assessing the predictive value of the new ratings system going forward.

The quest to find outperforming actively managed funds, whether using Morningstar’s ratings or other criteria, is an activity you should reassess. As the Vanguard study correctly noted, a low-management-fee, tax-efficient indexing strategy has “been so difficult to consistently beat over time.”

Instead of relying on Morningstar or anyone else to help you select outperforming actively managed funds, you should shift your focus to capturing global market returns using low-cost index funds in an asset allocation suitable for you.

Dan Solin is the director of investor advocacy for the BAM Alliance and a wealth adviser with Buckingham Asset Management. He is a New York Times best-selling author of the Smartest series of books. His latest book, 7 Steps to Save Your Financial Life Now, was published on Dec. 31, 2012.

The views of the author are his alone and may not represent the views of his affiliated firms. Any data, information, and content on this blog is for information purposes only and should not be construed as an offer of advisory services.