With the American Evaluation Association annual conference coming up Nov. 6-11 in Washington, D.C., our team has been thinking about the conference theme, “From Learning to Action.” We constantly challenge ourselves to ensure that our values are present in our work, that our practices produce actionable learning, and that we share our experiences to build the evaluation field and, more importantly, to test our perspectives.

We’re excited to present with our evaluation colleagues and share innovative practices that lead to meaningful change in ever-changing contexts. See below for an overview of our sessions, or download a printable schedule.

Wednesday, November 8 | 6:15-7:15 p.m.

Rowing against the current: Advocacy evaluation in difficult political contexts
Advocacy and Policy Change (Panel)

Spark Staff: Jewlya Lynn, Founder & Chief Learning Officer; Laura Trent, Senior Consultant

Session Colleagues: Jared Raynor, TCC Group; David Devlin-Foltz, The Aspen Institute

Many advocates in 2017 experienced a transition from a supportive political environment to an oppositional one. At the same time, public willingness to engage in advocacy also shifted, with increased advocacy action and willingness to participate in larger movements. Advocates cannot ignore these changes. Neither can evaluators.

We propose five dynamics evaluators should consider in order to be good learning partners in difficult political contexts: changes in transparency; expectations for success; field capacity (current and needed); advocacy strategies; and evaluation methods. The panel will be structured as a facilitated discussion among three experts in advocacy evaluation, each speaking to experiences with one or more of the five dynamics.

Thursday, November 9 | 3:15-4:15 p.m.

Advanced session on designing and implementing an evaluation when the foundation is part of the intervention
Nonprofit and Foundations (Experiential Learning)

Spark Staff: Rebecca Ochtera, Associate Director; Laura Pinsoneault, Director of Evaluation

Session Colleagues: Kelci Price, The Colorado Health Foundation; Meg Long, Equal Measure

Increasingly, foundations are engaging in their work beyond the investment, acting more as participatory partners. They may be implementing the program officer role differently, participating at planning tables, or taking actions to support implementation of strategies. This creates both a challenge and an opportunity for evaluators, who need to think differently about the evaluand, the purpose of the evaluation, the types of questions asked, who receives results and when, and even the evaluator’s role.

This advanced skill building session will draw on three examples: a foundation with a long-established role as part of the change; a foundation transitioning to this role; and a multi-funder collaborative taking action at the national level while funding locally. Participants will learn about evaluation design and implementation in these settings from the presenters and from each other through an interactive dialogue.

Thursday, November 9 | 3:15-4:15 p.m.

The intersection of evaluation & strategy: Exploring the role and tools needed for evaluators to be at the strategy table
Nonprofit and Foundations (Think Tank)

Spark Staff: Jewlya Lynn, Founder & Chief Learning Officer

Session Colleagues: Hallie Preskill and Marcie Parkhurst, FSG; Tanya Beer, Center for Evaluation Innovation

When evaluation and foundation strategy are intertwined, they support real-time learning practices that range from making major decisions about overall grantmaking strategies to experimenting with tactical shifts in existing strategy. Evaluators and foundations working in complex settings are increasingly embracing and experimenting with this concept of evaluation and strategy as partners.

This session is designed to codify some of the insights evaluators are gaining as they join the strategy table, including exploring how to reframe evaluation’s role in strategy, practical ways that evaluation can become embedded in strategy, the roles that evaluation can take on, and the implications for the skills of evaluators. It will begin with a presentation based on the facilitators’ experiences and lessons learned. From there, a robust discussion of audience experiences will be used to generate insights that can guide the field, to be included in a pending white paper.

Friday, November 10 | 8:00-9:30 a.m.

Evaluating and advancing equity within the context of a systems change strategy
Systems in Evaluation (Panel)

Spark Staff: Rebecca Ochtera, Associate Director; Laura Trent, Senior Consultant

Session Colleagues: Tracy Marie Hilliard, Jennifer Beyers, and Paula Rowland, ORS Impact

Systemic changes to public and private institutions and systems are critical if we are going to advance equity comprehensively (e.g., across employment, health, housing, and safety). Many initiatives focused on equity recognize this and are working on systemic changes in settings that range from municipal, state, and federal government to academic institutions to private sector companies. Some initiatives are also focused on changing the broader ecosystems that institutions operate within, such as fields of advocates, community organizations, and faith-based groups. Evaluation for these initiatives must bring in both a systems lens and a clear, operationalized understanding of equity, recognizing the overlap but not losing what is distinct in each. This session will share three examples of evaluations that kept a dual focus on equity and systems and used participatory methods that allowed them to meaningfully inform strategies through actionable learning.

Friday, November 10 | 6:30-7:15 p.m.

Strategies and Tools for Multi-Stakeholder Systems Change
Use and Influence of Evaluation (TIG Multipaper)

Spark Staff: Lauren Nichol Gase, Senior Researcher

Session Colleagues: Jessica Shaw, Boston College; Jacqueline Pei, University of Alberta; Susan H Chibnall, Caliber Associates; Barbara Szijarto, University of Ottawa; Mari Kemis, Iowa State University

Designing Evaluation to Improve Complex Systems: A Case Study from the Los Angeles Juvenile Court
As many as 70% of youth who come into contact with the juvenile justice system have a diagnosable mental health concern; however, many youth enter a system that is ill-equipped to assist them. In 2016, the Los Angeles County (LAC) Department of Public Health partnered with the LAC Juvenile Court to assess the processes, programs, and other services being implemented to address the mental health needs of justice-involved youth, with the aim of identifying strategies to improve system functioning. Methods included (1) semi-structured interviews with organizational leaders to identify system processes, strengths, and challenges; (2) surveys with ground-level staff (probation officers, attorneys) to understand variation in practice and gaps in knowledge; and (3) abstraction of administrative data to understand service delivery and youth outcomes. This presentation will describe the participatory evaluation process, with a focus on strategies and tools to improve multi-stakeholder initiatives and systems.