Measuring the Unmeasurable
Ask ProManage, LLC, Chairman and CEO Carl Londe how to benchmark managed accounts’ performance, and he will give you an honest—and frustrating—answer.
“We have not figured out how to do it. We have talked to academics, and we have talked to consultants,” says Londe, referring to managed accounts that have the potential to develop a different portfolio for each participant and that use the funds already offered in a plan. “We have not been able to crack a way to measure performance that is meaningful, and I do not know of anyone who has.” Chicago-based ProManage reports aggregate performance numbers of a plan’s managed accounts, he says. “However, the value of what we are doing is at an individual level” in areas such as asset allocation and diversification, he adds.
Many advisers and plan sponsors may take a second look at managed accounts, since the Department of Labor (DoL) blessed them in October as a qualified default investment alternative (QDIA). Although providers such as ProManage take into account other factors beyond retirement date, a managed account can be a QDIA even if the provider has only that data, Londe says: “Target-date funds are permitted as one of the three permissible types of QDIAs under the regs, and they only take into account retirement date plus or minus five to 10 years.” The same applies to the use of managed accounts: The DoL says asset allocation is to be based on a participant’s age, target retirement date, or life expectancy, and that such allocation decisions are not required to take into account risk tolerances, other investments, or other preferences of an individual participant. Although it is not required to look beyond a participant’s age, some managed account providers do so. Londe says, “We take into account other factors and we feel it is important to do so, if possible, for a variety of reasons.”
Those who reconsider managed accounts will find that no industry standard has emerged for benchmarking them; until now, most benchmarking attention has focused on lifecycle funds. Still, sponsors and advisers need to figure out how to monitor these accounts. The QDIA rules add a safe harbor for sponsors that choose to take advantage of them, Londe says. “However, the rules make it clear that the choice of a managed-account provider and the monitoring of the managed-account provider are fiduciary responsibilities,” he adds. “Therefore, plan sponsors cannot avoid all fiduciary responsibility associated with QDIAs, but they can significantly manage their exposure by using a QDIA.” A condition of the QDIA is that, if it is a managed account, the investment-management service needs to operate as a fiduciary, Londe adds.
“We have seen a tremendous inflow into professionally managed solutions, such as managed accounts and lifecycle funds,” says Christopher Jones, Chief Investment Officer of Palo Alto, California-based Financial Engines, Inc. “Because of that, there is going to be a significant demand for benchmarking services that answer the question, ‘OK, if we put all this money into a managed-account program, how do we make sure the vendor is doing a good job?’”
Whether a sponsor goes with managed accounts put together at a plan level or an individual-participant level, their performance should be monitored. That monitoring happens at a plan and plan-subset level rather than an individual level, says adviser Vincent Morris, Kansas City-based Vice President at Bukaty Companies, a National Retirement Partners member firm.
In the future, expect a lot of wary sponsors to ask questions about gauging vendor performance, says Brian Ward, Nashville, Tennessee-based Managing Director at Ward Financial Advisory of Wachovia Securities. “Is this an appropriate solution, given all the uncertainties about benchmarking?” he asks. “Are some plan sponsors just going to say, ‘Why pay someone X amount of money to manage these solutions and bring in more potential liability, when we can bring in target-date-based solutions that we already have research on?’ Right now, their plates are full with all the fiduciary decisions they need to make.”
Going the Extra Step
Benchmarking managed accounts is definitely harder than gauging target-date funds, Morris says. “By design, they are supposed to be customized for each participant. There are a lot more moving parts.” The more inputs managed accounts have, he adds, the more difficult they are to benchmark.
Managed-account providers tailor asset allocations to factors such as a participant’s age, risk tolerance, outside assets, and other retirement plans. “Managed accounts provide personalized portfolio allocations, whereas target-date funds are a more generic, one-size-fits-all product,” says Ray Martin, CEO and President of CitiStreet Advisors in Quincy, Massachusetts, which partners with Financial Engines to deliver its managed-account services.
Managed-account providers generally utilize the underlying funds in a sponsor’s defined contribution plan, Londe says, which differ from plan to plan. “So, to compare one plan sponsor to another does not work,” he says, “and each plan sponsor’s demographics are different.”
Given the complexities, many small-plan advisers in particular probably rely on any information they can get from the third-party money managers involved, says adviser Rich Behr, Managing Partner at 401(k)², Inc. in Littleton, Colorado. “[Managed-account benchmarking] is more of an afterthought than something that is done proactively,” he says. “In the competitive environment we run into in Denver, we see little or none of the [customized] indexing by our competitors.” Managed-account providers routinely share with clients an index that lines up most appropriately against the style of a particular mix, he says. However, the quality of that information varies among providers, so relying solely on it increases the risk that participants get subpar investment management.
Yet, smart advisers will go the extra step and offer sponsors more benchmarking data. Here are a half-dozen ways to do it:
Get more details from providers. Ask providers for specific information about results, Morris recommends, such as how certain subsets of participants did compared to the aggregate participant-result numbers. “We are trying to define what we are looking for, and have the investment manager or recordkeeper provide that to us,” he says. “I know what the average participant did, and we may ask the provider to break that down by age range.” That way, he can pinpoint things like how managed-account participants ages 20 to 30 fared. The information is available, he says, if the recordkeeper will cooperate in getting the data together—but that willingness differs significantly among recordkeepers, he adds.
Gauge asset allocations. Those who opt for a managed account make an asset-allocation decision more than anything, Ward says. “So, you have to say, is the asset-allocation decision appropriate? That is something you can benchmark, to see if they are outside the norm.” Compare the allocations to third-party sources such as the Morningstar Moderate Allocation category.
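The allocation check Ward describes can be reduced to a simple deviation screen. The sketch below is illustrative only: the reference weights and the 15-point tolerance are assumptions for the example, not actual Morningstar Moderate Allocation category figures.

```python
# Hypothetical sketch: flag managed-account allocations that drift far from
# a reference allocation norm. All weights here are illustrative, not real
# category data.

REFERENCE = {"equity": 0.60, "bonds": 0.35, "cash": 0.05}  # assumed norm
TOLERANCE = 0.15  # flag any asset class more than 15 points off the norm

def allocation_flags(account, reference=REFERENCE, tolerance=TOLERANCE):
    """Return the asset classes where the account deviates beyond tolerance."""
    return {
        asset: round(account.get(asset, 0.0) - norm, 2)
        for asset, norm in reference.items()
        if abs(account.get(asset, 0.0) - norm) > tolerance
    }

participant = {"equity": 0.85, "bonds": 0.10, "cash": 0.05}
print(allocation_flags(participant))  # equity and bonds both exceed tolerance
```

An adviser could run a screen like this across a plan's managed accounts and investigate only the flagged outliers, rather than reviewing every portfolio.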
Compare with target-date funds. Jones says Financial Engines uses a benchmarking framework that tries to control for the biggest source of variance across individuals: the length of the time horizon. “Take a population of people who are going to retire in 2020, for example, and compare their results to a composite benchmark of the top five 2020 target-date funds,” he suggests. “It provides an interesting benchmark in that it is investable. Second, it is easily understood. Third, it is a reasonable alternative for what people might do if they were not in a managed-account program.” Of course, no industry standard has emerged yet for benchmarking lifecycle funds, either, so picking a comparison opens up another issue.
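The composite-benchmark idea Jones describes can be sketched in a few lines. The fund labels and return figures below are made up for illustration; in practice, the composite would use the actual returns of the top five 2020 target-date funds.

```python
# Equal-weighted composite of several target-date funds, per the approach
# described above. All fund labels and returns are hypothetical.

def composite_return(fund_returns):
    """Equal-weighted average return of the component funds."""
    return sum(fund_returns.values()) / len(fund_returns)

top_2020_funds = {  # hypothetical one-year returns for five 2020 funds
    "Fund A 2020": 0.071,
    "Fund B 2020": 0.064,
    "Fund C 2020": 0.068,
    "Fund D 2020": 0.059,
    "Fund E 2020": 0.073,
}

cohort_return = 0.069  # hypothetical average return of 2020-horizon participants
benchmark = composite_return(top_2020_funds)
print(f"Cohort vs. composite benchmark: {cohort_return - benchmark:+.2%}")
```

Because each component fund is investable, the composite itself represents something a defaulted participant could plausibly have held instead, which is what makes it a meaningful comparison.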
Put together partially customized indexes. Ideally, benchmarking managed accounts involves creating a customized index for each participant, Martin says. “You can have an infinite number of investment recommendations, and take into account outside assets and risk preferences,” he explains. “The correct way to benchmark a managed account would be to do that on an individual-account basis, but that is just not a practical solution,” given the many customer accounts. He suggests, as a practical alternative, looking at an array of possible managed-account allocations (assuming no influence of outside accounts), and comparing that with a weighted average of benchmarks representative of those portfolios.
401(k)² benchmarks all of its client accounts against a blended asset-allocation index, Behr says. “We then go back and parse the percentage for each manager to the index that they most appropriately line up against,” he says. He and his colleagues use a mix of the S&P 500 and the Lehman Brothers Aggregate Bond Index. They usually adjust the equity/bond weighting at the plan level but, for individual participants with a balance of $100,000 or more, they may change the mix of the two indexes to reflect an individual’s portfolio more closely.
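Behr’s blended index amounts to a weighted average of an equity benchmark and a bond benchmark. A minimal sketch, with made-up period returns standing in for actual S&P 500 and Lehman Aggregate figures:

```python
# Two-index blend along the lines Behr describes: weight an equity index
# and a bond index to mirror a plan's (or, for larger balances, an
# individual's) stock/bond mix. Return figures are illustrative only.

def blended_benchmark(equity_return, bond_return, equity_weight):
    """Return of a two-index blend with the given equity weight."""
    return equity_weight * equity_return + (1 - equity_weight) * bond_return

sp500_return = 0.055     # hypothetical S&P 500 period return
agg_bond_return = 0.042  # hypothetical aggregate-bond-index period return

# Plan-level blend at an assumed 60/40 equity/bond weighting:
plan_level = blended_benchmark(sp500_return, agg_bond_return, 0.60)

# For a large-balance participant, shift the weighting toward the
# individual's actual portfolio, e.g. 75% equity:
individual = blended_benchmark(sp500_return, agg_bond_return, 0.75)
```

The design choice here is pragmatic: a single plan-level weighting keeps the benchmark cheap to maintain, while the per-participant adjustment is reserved for the accounts large enough to justify the extra work.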
Include other measures of success. With target-date funds, Londe says, advisers can track performance using the traditional investment concept of “I put in a dollar; how did that perform?” Managed accounts may need a broader definition of success, he suggests. Factors like whether participants get more diversified also play an important role, Jones says. “Investment performance is not the only value-add here.”
Benchmark the provider’s service. The industry has focused its efforts thus far on figuring out how to benchmark performance, Morris says. However, benchmarking managed-account provider service “will probably be the next level,” he says, especially since the choice and monitoring of a provider remain fiduciary responsibilities. That means looking at things like whether a managed-account provider offers participants Web-based service only, versus the ability to speak with someone, and how easy it is for participants to drop the service if they wish.
Eluding Everyone
Advisers and plan sponsors need a way to compare the results of different providers’ managed-account offerings, Morris says—especially if a sponsor chooses managed accounts as the default investment. “That seems to be the tough part,” he says. Advisers can break down a single provider’s allocation and benchmark it, he says, but currently they cannot do an apples-to-apples comparison of providers.
Right now, a lot of the industry’s benchmarking effort revolves around target-date solutions, Morris says, and a whole sector has grown up quickly around that need, although there are still questions about how to benchmark those funds as well. However, that has not happened with managed accounts, he adds. “There has to be some sort of standard or benchmarking research. To my knowledge, those studies have not occurred yet.”
No workable methodology has surfaced, Londe agrees. “I do not know how you would develop a tool unless you have a theory to work from. We are not there yet,” he says. “So far, the basic concept has eluded everyone.”
Some of the larger consulting firms are likely working on it, Jones says. “This is a really new product, so there is not a lot of history. We are probably at least two or three years away from seeing a consistent third-party benchmarking tool. My sense is that people are still learning how a program like this works, and how to interpret the results.”
Expect a lot of contenders to try to establish a standard, Ward says. “We will see a bunch of people trying to come in and be the benchmark.” However, he does not see one approach becoming a standard anytime soon. “Managed accounts are too early in the product-adoption stage.”
While that research and development happens, Behr says, advisers have a chance to fill a need by offering sponsors customized benchmarking as part of their broader efforts to help participants build adequate retirement savings. “In the meantime,” he says, “we see an opportunity.”