Date: Tuesday, July 1, 2025
Hi! I'm Paul Lagunes, a social scientist and Co-Chair of the American Evaluation Association's Government Evaluation Topical Interest Group. In this post, I reflect on how embracing experimentation can lead to more innovative, effective governance.
Many of us who serve the U.S. federal government are committed to advancing policies that work. We aim to use data, evaluation, and scientific integrity to serve the public good. But what happens when the evidence is limited, the stakes are high, and the path forward is unclear?
That's where experimentation comes in.
A fascinating example comes from New York's Hudson River, where excess pollution has harmed marine life. In response, a coalition of state agencies launched an innovative solution: using oysters to clean the water.
Oysters are filter feeders: each one can clean several gallons of water per day by removing impurities. They've been native to the Hudson for thousands of years, but their populations had dwindled. Now, through strategic planning, regulatory oversight, and public funding, state agencies are supporting reef restoration efforts to bring them back.
This nature-based approach is grounded in both ecological history and scientific evidence. It's also a bold experiment. No one can guarantee success. But by testing this solution, monitoring outcomes, and adapting as needed, public institutions are demonstrating a commitment to learning and innovation.
Another example, less visible in nature but arguably more transformative, comes from the early development of the internet. In the late 1960s, the U.S. Department of Defense funded ARPANET, an experimental project to connect computers across vast distances. At the time, the idea of a decentralized, packet-switched network was unproven and risky. Yet through iterative testing and collaboration with academic institutions, ARPANET laid the foundation for the modern internet. This bold investment in uncertain technology has since revolutionized communication, commerce, and governance itself.
In evaluation, we often hear the question: "But what if it doesn't work?" That fear can lead to paralysis. Yet real learning requires both courage and humility. It means being willing to be proven wrong and to move forward with lessons learned.
The Hudson River oyster project and, decades earlier, the creation of the internet through ARPANET show how public agencies can take calculated risks in pursuit of better outcomes. Whether restoring ecosystems or building digital infrastructure, these efforts reflect a willingness to experiment in the face of uncertainty. They also highlight the role of evaluators in supporting this mindset. We can help design evaluations that are rigorous yet flexible, and that prioritize learning over perfection, ensuring that bold ideas are tested, refined, and ultimately made more effective.
From the perspective of learning, gains are possible even in the face of failure. To echo a recent AEA365 post, it's important to pause and reflect on both what does and doesn't work. When we embrace a posture of experimentation, failures become learning experiences.
The Evidence Act offers a durable framework for governance that invites experimentation. Passed in 2018 with bipartisan support, the Act breathes new life into centuries-old insights. As Thomas Hobbes, the 17th-century philosopher, famously observed, "we make the commonwealth ourselves." And if the state is a human creation, then it can be studied, tested, and improved. "[W]here the causes are known," Hobbes added, "there is place for demonstration."
When we treat government as something we can understand, rework, and improve (through data, evaluation, and experimentation), we create space for innovation that is bold and grounded in evidence.
The American Evaluation Association is hosting Gov't Eval TIG Week with our colleagues in the Government Evaluation Topical Interest Group. The contributions all this week to AEA365 come from our Gov't Eval TIG members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.