Thought Leadership

Philanthropy’s Role in Responsible Adoption of AI

by Anju Suresh, Officer, Sustainable & Impact Investing, Glenmede, and Rei Tran, Technology Fellow, Mission Investments, Ford Foundation
 
At Mission Investors Exchange’s (MIE) 2024 National Conference, Glenmede Endowment & Foundation Management hosted a dine-around with the Ford Foundation to discuss philanthropy’s role in promoting the practical and ethical deployment of AI through capital markets, grantmaking, and shareholder engagement. During this dine-around, we talked about the opportunities and risks AI presents to mission-driven investors and how philanthropic organizations can participate in this growth while maintaining standards around governance, environmental sustainability, and social justice.

The companies behind “Generative AI”[1] have already upended traditional business models and seek to radically transform every sector of the economy. The adoption of AI can have meaningful applications for mission-driven organizations, including innovations that support social and environmental impact work. Among recent examples:
  • Advancements in patient care: Johns Hopkins University has developed an AI system that scours electronic medical records and clinical notes to catch sepsis symptoms hours earlier than human diagnosis, reducing fatality rates by 20%.
  • Efficient climate impact modeling: One of AI’s most promising applications is in modeling climate change, such as mapping Antarctic icebergs from satellite imagery 10,000 times faster than humans or modeling the behavior of animal species to support conservation.
  • Accelerating the study of diseases: The two most-cited AI papers of 2022, on ColabFold and AlphaFold, focused on making protein-folding research more comprehensive and accessible. Protein misfolding is implicated in diseases such as Alzheimer’s and Parkinson’s.
These achievements, however, are tempered by the rapid pace of development and adoption, which has become a growing concern for mission-driven investors. As with any tool, there are scenarios of intentional misuse as well as unintended externalities.
  • High energy consumption: Training AI models is extremely energy intensive, and so is running them; on average, a ChatGPT query requires roughly 10 times more energy than a Google search. The question of how to meet AI’s extraordinary demand for data center power, and the strain it places on the electrical grid, has not been answered thoughtfully beyond the hope that today’s strain is a worthwhile tradeoff until AI can deliver breakthroughs in more efficient energy solutions.
  • Women’s rights and privacy concerns: Generative imagery and video have been used to victimize women through the creation of non-consensual intimate imagery (NCII). Studies have shown that more than 90% of explicit deepfake imagery is non-consensual, and roughly 90% of it depicts women. Most victims of generative image-based abuse have limited recourse because there is currently no federal law governing the creation and distribution of deepfake content.
  • Voter disinformation: Generative voice assets have already been used for voter suppression in an election year. In a deliberate disinformation attempt, robocalls impersonated an authority figure and targeted voters in New Hampshire, telling them not to vote in the state’s primary election.
During the dine-around, we discussed how philanthropies can invest in AI technologies to further their missions while also strengthening safeguards for responsible adoption across the industry.

Key takeaways from the dine-around and client conversations included:
  • When reviewing direct investment opportunities, philanthropies should discern whether founders understand the full range of potential use cases for their generative technology and whether they have sufficient competency to develop technical safeguards against loopholes or backdoors in their software.
  • If mission-driven investors shy away from participating in the capital markets, the only participants left at the investment table may be those without the willingness or aptitude to advocate for underrepresented people and vulnerable causes.
  • Philanthropies can play a critical role in holding companies and fund managers accountable for responsible AI practices by integrating relevant questions into their standard due diligence and monitoring processes. These questions could include requests for data and disclosures regarding a company’s governance and oversight of AI implementation, its impact assessments on stakeholders, and its energy consumption practices.

For organizations interested in addressing the concerns and opportunities around AI in their own investment portfolios, raising the topic with your investment team, Outsourced Chief Investment Officer, or investment consultant is a prudent first step. Additionally, seeking input from civil society organizations and academic researchers, and leveraging investor networks like MIE, gives organizations important platforms to share strategies and outcomes.


[1] The terms “AI” and “Generative AI” are used interchangeably here to refer to the capability of machines to generate high-quality text, images, videos, or other data in response to human-provided text prompts. To the untrained eye, this content is sometimes indistinguishable from human-generated content.
