Dispatch from the Mission Investing Institute: Measurement
This newsletter comes to you from Troy, Michigan, where we are on day three of our oversubscribed Mission Investing Institute. Scores of foundation leaders and mission investing experts have been convening at The Kresge Foundation headquarters since Monday, focused on how to put impact investing into action.
Many themes have emerged throughout the Institute. One of the most popular, among newcomers and seasoned investors alike, is measurement. How do you measure impact, both social and financial? Must we accept tradeoffs between mission and money? What standards exist? How can we respond to skeptics within our foundations? And so on.
“People are getting stuck when it comes to measurement,” Jane Reisman, Ph.D., founder and senior advisor of ORS Impact, told attendees. “The good news is that slowly but surely, forward-thinking foundations are beginning to report out on their experiences and returns, with help from intermediaries. Their trials are becoming models for others seeking justification for their impact investments as they build a case for full board support.”
Case Example: Northwest Area Foundation
Northwest Area Foundation Investment Director Amy Jensen spoke about the foundation's decade-plus of work measuring the impact of its mission-related investments, and attested to the challenges, confusion and exhilaration that come with the measurement process.
Northwest Area Foundation invests in good jobs and in the financial capability to build assets for under-resourced communities across its region, which includes eight states and more than 75 Native nations. But, as Amy shared with the audience, the strategy wasn’t always clear, starting with the foundation’s first mission-related investment (MRI) in 2004: a $10 million private equity fund designed to support companies in the target region.
“In the early years, we would report on how many jobs were created and where. The investment committee said, ‘Okay, it did what we asked by making investments in the right states.’ Yet as we looked deeper into the results we were disappointed that the investments weren’t being made in our higher priority demographic communities.”
Amy continued, “With help from Pacific Community Ventures we built our own measurement criteria. We looked at what makes a high-quality job: Is there room for career advancement? An opportunity to save? Does it offer health benefits and sick days? Over time, and with experience, we used what we learned to inform what’s become our current measurement strategy.”
Some Advice: Four Lessons Learned
While measurement may look different depending on any given foundation’s sector, geography or investment asset class, Amy shared some top learnings that can apply across the board:
- Be flexible with measurement targets. We put $10 million in our original investment and thought we would raise another $40 million. But we quickly learned that there was little appetite in the market to invest in just our eight-state region of focus. We had to become more flexible in defining what success looks like. We are willing to compromise on geography if it means partnering with investors who share our impact goals.
- It’s okay to outsource to experienced managers. Initially, we thought that we had to control our fund to better influence the social impact outcomes we wanted to achieve. But we learned that doesn’t have to be the case. As the field has developed, there are more experienced managers who have proven they can execute investments according to our impact strategies, and thus measure performance according to our mission metrics.
- Prefer private investments. Similarly, we have found that we can architect more specific impact benchmarks when we make a private investment, and better measure progress over time. By clarifying our impact objectives at the outset, such as jobs goals, we can better ensure our capital is being invested to match our mission.
- Get real with examples. It has been a huge breakthrough for us to be able to review examples from others. Examples are very helpful in getting a committee or board to have thoughtful conversations around both mission-related investments and impact assessment. This is still a new field, and committee members may not have a sense of what impact assessment looks like. The ability to show real examples allows them to discuss what specifics they want to see and what has value in their decision-making process.
State of Impact Measurement
Jane is optimistic: she told attendees there is progress in establishing measurement conventions, but not in the form the field had expected.
Instead of the industry adopting a single set of standards that would enable benchmarking across all portfolios and funds, the impact investing field is experiencing an evolution that is giving rise to what she calls “fit for purpose” approaches, where you design your measurement methodology to align with your impact goals.
“For instance, the Impact Management Project, facilitated by Bridges Fund Management, convened huddles to talk about measurement and management, where measurement isn’t the end goal. Rather, the end goal is establishing conventions and norms to manage your investment better,” Jane said.
“The investment side of foundations typically considered social impact measurement an expensive cost center, that is, until institutions began to weave it into an overall investment management framework. What are the standards of evidence? How do you use measurement throughout the lifecycle of your investment? What about exits? Every aspect needs to be considered.”
Importantly, she adds, there is a strong commitment to coordinating approaches, testing and refining models, and developing case illustrations that can inform the field overall. She shared with attendees the results of research she conducted with The Rockefeller Foundation*, which found that common clusters and themes for measuring impact are beginning to emerge. You can read about them here.
“It seems so obvious,” Jane said, “yet few impact investments clearly define the intended impact up front.” As the saying goes: what gets measured gets done.