Source: UNIV OF WISCONSIN submitted to NRP
A PRINCIPLES-BASED EVALUATION FRAMEWORK FOR TRANSFORMATIVE AGRICULTURAL PROJECTS
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
ACTIVE
Funding Source
Reporting Frequency
Annual
Accession No.
1027734
Grant No.
2022-68012-36145
Cumulative Award Amt.
$961,000.00
Proposal No.
2021-10219
Multistate No.
(N/A)
Project Start Date
Nov 1, 2021
Project End Date
Oct 31, 2026
Grant Year
2022
Program Code
[A9211]- Sustainable Agricultural Systems Program Evaluation
Recipient Organization
UNIV OF WISCONSIN
21 N PARK ST STE 6401
MADISON, WI 53715-1218
Performing Department
Natural Res Inst
Non Technical Summary
Evaluation is a critical part of effective agricultural research and land management, but the new Sustainable Agricultural Systems (SAS) Program's large, transdisciplinary, innovative projects leave traditional evaluation metrics inadequate to measure meaningful change. This project will develop a new evaluation framework focused on core principles of agricultural transformation. In developing this framework, we will compile tools to support collaboration and communication, assess the emerging impacts of the SAS program, and build evaluative capacity in agricultural research, extension, and education.
Our evaluation team at the Natural Resources Institute will synthesize the key outcomes and activities of the SAS program thus far and identify areas for growth and development. We will work closely with SAS project leadership, evaluators, and National Institute of Food and Agriculture (NIFA) staff to identify key questions about program processes and indicators of transformation. We will develop and implement an evaluation framework focused on shared principles that can align and integrate project activities with program priorities. We will conduct a multi-state survey to assess engagement and inclusion of stakeholders across SAS projects. Our comprehensive synthesis will identify strategies to expand program participation and direct the creation of evaluation tools, trainings, and methods across projects. The resulting principles-focused evaluation framework and its implementation will provide a detailed picture of SAS program successes, products, and priorities. Ultimately, this project aims to develop a robust set of evaluation approaches to guide innovation, increase accountability and transparency, and support a nationwide transition to more regenerative, sustainable agroecosystems.
Animal Health Component
40%
Research Effort Categories
Basic
0%
Applied
40%
Developmental
60%
Classification

Knowledge Area (KA) | Subject of Investigation (SOI) | Field of Science (FOS) | Percent
901 | 0001 | 3100 | 60%
902 | 0001 | 3100 | 40%
Goals / Objectives
The primary goal of this project is to create a principles-focused evaluation framework for the Sustainable Agricultural Systems (SAS) Program, and use it to track project development, prioritize resources, provide feedback, and identify strategies that increase inclusion and accessibility for the wide range of stakeholders served by the program, both within and across projects. The secondary goal is to increase evaluation capacity among leaders of sustainable agriculture projects and programs using participatory approaches to develop, implement, and communicate evaluation processes, tools, and findings. The third, long-term goal is to strengthen a culture of evaluative thinking in agricultural research and education programs and model a set of evaluation strategies to support transdisciplinary collaboration and transformation in our agroecosystems. We will address these goals through four objectives.
Objective 1: Synthesize program activities. We will develop a comprehensive synthesis of SAS project activities, processes, personnel, and impacts to date. This synthesis will form the core of the project, where we will identify the strengths and weaknesses of the SAS program and compile a database of topic areas, initial findings, and metrics used to evaluate SAS projects thus far, as well as their intended audiences and partners.
Objective 2: Develop a principles-focused evaluation framework. Using this initial synthesis, we will develop and implement an evaluation framework focused on shared principles of the SAS program, using core values to align and integrate project activities with program priorities. This framework will enable us to align program priorities with project strategies, and to work with project directors, managers, evaluators, and other key stakeholders to integrate the framework with existing evaluation plans, producing more consistent metrics across projects.
Objective 3: Assess stakeholder engagement.
We will conduct a multi-state assessment of stakeholder engagement and inclusion to validate the intended participants, users, and audiences associated with all past and present projects. We will incorporate this assessment of stakeholder engagement into our final summary of impacts, best practices, and recommendations for the SAS program.
Objective 4: Build evaluation capacity in the SAS program. We will use the final program evaluation summary to direct the creation of tools, trainings, and activities to build evaluation capacity among project leaders and NIFA staff, ultimately cultivating a community of practice that continues to contribute to the success and documentation of NIFA projects and the SAS award program.
Project Methods
We will structure this mixed-methods project around four key objectives, guided by frameworks from both utilization-focused evaluation and culturally-responsive evaluation to ensure our work is ultimately useful, usable, and appropriate for our key audiences. We will develop a comprehensive synthesis of SAS project activities, processes, personnel, and impacts (Obj. 1). This synthesis will form the core of the project, where we continually adapt and iterate on what we learn to inform the other three objectives. Using the initial synthesis and assessment of program strengths, weaknesses, and gaps, we will create a scalable evaluation framework to facilitate adaptive management and consistent metrics across projects (Obj. 2). In developing this principles-focused evaluation framework, we will align program priorities with project strategies, and work with project directors, managers, evaluators, and other key stakeholders to integrate the framework with existing evaluation plans, producing more consistent metrics across projects. As we work to implement this framework, we will conduct a multi-state assessment of stakeholder engagement and inclusion to validate the intended participants, users, and audiences associated with all past and present projects (Obj. 3). We will incorporate this assessment of stakeholder engagement into our final summary of impacts, best practices, and recommendations for the SAS program. We will then use that summary to direct the creation of a set of tools, trainings, and activities to build evaluation capacity among project leaders and NIFA staff (Obj. 4), ultimately cultivating a community of practice that continues to contribute to the success and documentation of NIFA projects and the SAS award program.
Principles-focused evaluation establishes a set of core ideas and values against which project activities are measured.
Even as planned outcomes, personnel, or programs change to meet evolving needs or issues, the core set of values defined by principles helps align diverse actors and activities toward their shared mission. Principles-focused evaluation is an excellent fit to assess the SAS program and its initiatives because of the program's ambitious goals and the transdisciplinary, transformative, stakeholder-focused approaches employed across projects.
We will work to build relationships with NIFA staff as well as funded SAS project directors, project managers, and evaluators. We will reach out individually to each SAS project director and conduct semi-structured interviews to discuss their processes, accomplishments, challenges, plans, and questions for us. We will use appreciative inquiry to facilitate conversations that are inclusive and trust-building, and will include questions about evaluation concerns and issues to make sure we are addressing hesitancy, insecurity, and fears about project evaluation from the start. Our team will conduct a systematic analysis of project and program documents as well as a thematic analysis of interview transcripts using open coding. We will draw on methodologies from social impact assessment to describe and quantify the reach of the SAS program in relation to its guiding frameworks.
Using key informant interviews and collaborator contact lists, we will draw on techniques from social network analysis to create a series of conceptual diagrams and stakeholder maps to illustrate what kinds of organizations are actively involved in the projects, and where they are distributed. Our team will assess the extent of engagement using a participation continuum rubric, and utilize software like ArcGIS, Tableau, Miro, Plectica, or SocioViz to visualize the data geographically (across states and watersheds), categorically (by organization, sector, or level of engagement with project teams), or by relationships and networks.
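The kind of stakeholder mapping described above can be illustrated with a minimal sketch. All project names, organizations, sectors, and ties below are hypothetical placeholders (not SAS data), and the simple degree tally stands in for the richer network metrics a tool like SocioViz or ArcGIS would provide:

```python
from collections import Counter, defaultdict

# Hypothetical project-partner ties of the sort a stakeholder map would capture.
# Every name and sector here is an illustrative placeholder.
ties = [
    ("Project A", "State Extension", "education"),
    ("Project A", "Growers Cooperative", "producer"),
    ("Project B", "Growers Cooperative", "producer"),
    ("Project B", "Watershed Alliance", "nonprofit"),
    ("Project C", "State Extension", "education"),
]

# Degree tally: how many project ties each partner organization has.
degree = Counter(org for _, org, _ in ties)

# Tally ties by sector to see where engagement concentrates.
by_sector = defaultdict(int)
for _, _, sector in ties:
    by_sector[sector] += 1

print(degree.most_common())  # organizations ranked by number of project ties
print(dict(by_sector))
```

In a real analysis, the organization list would come from key informant interviews and collaborator contact lists, and each tie would also carry a level on the participation continuum rubric.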
Our evaluation team will then conduct a needs assessment to further our understanding of how projects have changed or adapted since their initial proposals, and what has been added or cut. This needs assessment will include developing and sending a brief online survey to project staff, students, and paid collaborators on projects to collect team input on areas of strength and areas for improvement in their work and the SAS program more broadly, and to collect demographic information such as age, education level, gender identity, race, and ethnicity to see how project resources are being distributed.
In addition to these document analyses and the development of an evaluation framework, we will pilot and administer a multi-state survey of key SAS stakeholders and partners across projects, including farmers and landowners, farm laborers and trainees, students, extension educators and technical advisors, policy makers and elected officials, nonprofit leaders, and other actors throughout the supply chain. We will draw on the principles-focused framework to inform the key questions on the survey. We will work closely with the Office of Cultural and Linguistic Services at UW-Madison to develop a Spanish-language version and conduct phone surveys to pilot the survey with Hmong-, Mandarin-, and Spanish-speaking stakeholders. The survey will be administered in accordance with Dillman's tailored design method, with multiple waves and reminders to increase participation, and data will be analyzed using the Statistical Package for the Social Sciences (SPSS) or similar statistical analysis software.
Our evaluation team will draft a list of guiding principles that summarize NIFA values and goals into clear, meaningful, actionable statements that can be operationalized and evaluated.
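As a minimal illustration of the multi-wave administration described above, cumulative response rates can be tracked after each Dillman-style contact wave. The sample size and completion counts below are invented for the example:

```python
# Illustrative tracking of cumulative survey response rate across contact waves,
# in the spirit of Dillman's tailored design method. All counts are invented.
sample_size = 400
completions_by_wave = {
    "initial mailing": 62,
    "reminder 1": 41,
    "reminder 2": 23,
    "final contact": 14,
}

cumulative = 0
rates = {}
for wave, completions in completions_by_wave.items():
    cumulative += completions
    rates[wave] = round(cumulative / sample_size, 3)  # cumulative response rate

print(rates)
```

A table like this after each wave helps decide whether another reminder is warranted before closing the survey and moving to analysis in SPSS or similar software.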
We will finalize our methods with SAS leaders and NIFA program staff, drawing on literature related to collective impact, systems change, and the extensive resources we can access through the American Evaluation Association to identify processes that fit the needs and constraints of large, transdisciplinary, innovative projects. We'll use a range of quantitative and qualitative methods, such as self-assessment and journaling exercises, retrospective question design in project reporting, and virtually facilitated group exercises with project leadership such as Ripple Effects Mapping and Most Significant Change to understand how the values captured in principles have shaped their work and impacted project stakeholders.
We will compile the results of this evaluation of program impact, development, and stakeholder engagement into a summary report for NIFA on the findings of the principles-focused evaluation framework and recommendations for further improvement of, and implementation strategies for, the SAS program. Our evaluation team will also share our processes and best practices with other evaluators and Extension colleagues through conference presentations, workshops, and peer-reviewed publications. Throughout the timeline of this project, we will conduct a meta-evaluation of our evaluation strategies, facilitated activities, and presentations from project leaders and NIFA staff to continually improve our processes and approaches.

Progress 11/01/22 to 10/31/23

Outputs
Target Audience: Throughout the second year of the SAS Evaluation Project, we continued to connect with our primary intended audiences of NIFA and USDA staff and team leaders of research projects in the Sustainable Agriculture Systems program, including project directors, project managers, and evaluators. We also connected with our national network of program evaluators through the American Evaluation Association, including their topical interest groups in environmental and extension evaluation. We reached these groups through virtual community of practice meetings from November 2022 through October 2023, through group meetings, informational interviews with academic researchers and NIFA staff, and through our webinar series on the SAS evaluation project. A detailed list of these audiences is below. We connected in our second project year with:
  • NIFA staff: NIFA Institute Leads, Division Directors (or Acting Division Directors), National Program Leaders (NPLs), National Science Liaisons (NSLs), and other NIFA staff (10+)
  • SAS project teams: project directors (39) and project managers (30)
  • Evaluation community: SAS project evaluators and students (38), evaluation colleagues at Extension (15+), and evaluation colleagues through the American Evaluation Association (40+)
Changes/Problems: The primary challenge we experienced this year was graduate student Brittany Isidore's expected departure from the project in September 2023. While graduate student May Pannchi has stepped up to fill the role and the research assistantship, we fell significantly behind on our data analysis plans during the transition. We've hired former student Maggie Afshar to help support the planning of the 2024 SAS program meeting in Madison.
What opportunities for training and professional development has the project provided? The second year of the project supported professional development opportunities for our team members, first by building our professional evaluation network through the SAS evaluation community of practice. Presenting our emerging evaluation findings at national meetings and through our webinar series allowed us to build connections and rapport with more professionals in the field. We provided technical expertise through one-on-one coaching calls and messages with other evaluators, sharing resources including sample surveys and interview scripts, activity plans, evaluation papers, and other methodologies. Our team also built our data analysis techniques through the thematic coding of interview transcripts and project documents. Our two graduate students in particular, May Pannchi and Brittany Isidore, took on leadership roles building a searchable dataset of over 1,400 coded excerpts, using Dedoose qualitative software and a two-person system to verify their coding. How have the results been disseminated to communities of interest? The second year of the EvalSAS project brought emerging findings on the common strengths, challenges, best practices, and other attributes of the SAS program to a wider audience. The emerging results of the analysis of project director interviews and document reviews were shared through a public webinar series, and the recordings were shared with over 200 people across more than 75 institutions and organizations at 39 different SAS projects. These audiences included NIFA staff, members of the SAS Evaluators' Community of Practice, the Project Managers' Community of Practice, and the project directors who contributed interview data. We also presented lessons learned from evaluating SAS projects at different stages of their development to our professional evaluation network, the American Evaluation Association.
We will build out this audience in the coming year with the synthesis of the SAS program meeting in Madison, WI. What do you plan to do during the next reporting period to accomplish the goals? The third year of the EvalSAS project will have three main activities to address our four objectives:
  • Hosting and facilitating the 2024 SAS program meeting in Madison, WI. We will bring over 150 people from 40+ SAS project teams to Madison to discuss best practices, celebrate program accomplishments, and learn about equitable data governance and management strategies for collaborative projects (Obj. 1, 2, 3, and 4);
  • Report-writing on the content distilled from the SAS program meeting notes, presentations, and project director interviews on best practices, to share back with NIFA project liaisons and leadership from both current and incoming SAS project teams (Obj. 1);
  • Drafting evaluation tools based on the emerging principles of the SAS program, working with the evaluators' community of practice to pilot principles-focused evaluation resources (Obj. 2 and 4), including the first draft of a SAS program-wide stakeholder survey (Obj. 3).

Impacts
What was accomplished under these goals? In year 2, the EvalSAS team analyzed and reported initial findings on the common strengths, challenges, and best practices of the SAS program from 2019-2022, coordinated capacity-building activities for evaluators including leading a community of practice, and began planning the 2024 SAS program meeting for all project teams and NIFA staff. We focused on three primary activities across our four objectives: Data analysis and reporting: We completed thematic coding of our 28 key informant interviews with 42 project directors and their team members, and further developed our document review of SAS program materials. The resulting datasets will be used as the basis of both our summary report on SAS program impacts (Obj. 1) and our principles-focused evaluation framework based on the guiding values of the SAS projects and NIFA guidance to date (Obj. 2). We shared back the initial summary of our work in a three-part webinar series on the common strengths, challenges, best practices, and emerging values of the SAS program. We also summarized feedback related to the 2022 SAS program meeting in Kansas City, MO, and used it as the basis for the 2024 SAS program meeting in Madison, WI (Obj. 4). We shared our findings with NIFA liaisons and staff during periodic updates. Building evaluation capacity: We established monthly calls for a learning community with a shared repository of resources and a listserv for SAS program evaluators. The community of practice discussed evaluation planning and data collection methods, evaluation utilization and preferred resources, and shared ideas for reflective practice and evaluative activities (Obj. 2 and 4). Our team also took on hosting a bi-monthly community of practice for project managers in the SAS program, filling an organizational gap when a former SAS project manager left for a new position.
We also shared evaluation learnings from the SAS program through two presentations to our community of professional evaluators at two national American Evaluation Association meetings (Obj. 4). Planning meetings and facilitated activities: Our primary area of focus in the second half of year 2 was initiating planning of an in-person SAS program-wide meeting in 2024 to engage with project team leads and evaluators on SAS project practices and support cross-program learning (Obj. 1, 2, 3, and 4). The meeting will be a discussion-based opportunity to celebrate success and distill best practices for project leadership, equitable management and data governance, and strategies for effective collaboration with different stakeholder groups.

Publications

  • Type: Conference Papers and Presentations Status: Accepted Year Published: 2022 Citation: Isidore, B., Landis, G., Mase, A., and Pratsch, S. November 2022. Applications of Principles-Focused Evaluation for Sustainability and Agroecological Contexts. Workshop for the American Evaluation Association's Annual Meeting. New Orleans, LA.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: Landis, G., Mase, A., Pratsch, S., Isidore, B., and Pannchi, M. June 2023. SAS Summary: Understanding the Scale, Scope, and Strengths of the SAS Program. Webinar presentation.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: Landis, G., Mase, A., Pratsch, S., Isidore, B., and Pannchi, M. July 2023. SAS Values: Using Project Principles to Evaluate Agroecosystems Transformation. Webinar presentation.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: Landis, G., Mase, A., Pratsch, S., Isidore, B., and Pannchi, M. September 2023. SAS Strategies: Emerging best practices for SAS Project Management and Coordination. Webinar presentation.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: Landis, G., Mase, A., Pratsch, S., Isidore, B., and Pannchi, M. October 2023. Farming by our Principles: Measurements for Principles-Focused Evaluation in Agroecosystems Research. Poster presentation. American Evaluation Association's Annual Meeting. Indianapolis, IN.
  • Type: Other Status: Published Year Published: 2023 Citation: Landis, G. April 2023. Principles in Practice: Using Mixed Methods Measurements for Principles-focused Evaluation in Agroecosystems Research. Blog Post. American Evaluation Association's AEA365 Blog.


Progress 11/01/21 to 10/31/22

Outputs
Target Audience: Throughout the first year of the SAS Evaluation Project, we connected with our primary intended audiences of NIFA and USDA staff and team leaders of research projects in the Sustainable Agriculture Systems program, including project directors, project managers, and evaluators. We also connected with our national network of program evaluators through the American Evaluation Association, including their topical interest groups in environmental and extension evaluation. We reached these groups through virtual group meetings from December 2021 through October 2022, through key informant interviews with project teams, informal calls with other project evaluators, and at the SAS Program Project Directors' Meeting, held April 18-20 in Kansas City, Missouri. The Kansas City meeting increased our reach, including 55 online attendees and 155 in-person attendees. A detailed list of these audiences is below. We connected in our first year with:
  • NIFA staff: NIFA Institute Leads, Division Directors (or Acting Division Directors), National Program Leaders (NPLs), National Science Liaisons (NSLs), and other NIFA staff (35+)
  • SAS project teams: project directors (28), project evaluators (22), and project managers (25)
  • Evaluation community: SAS project evaluators (24), evaluation colleagues at the University of Wisconsin Division of Extension (10), and evaluation colleagues through the American Evaluation Association (25+)
Changes/Problems: We did not make major changes in the project plans this year, though we traveled less than we had allotted for the project. We intend to use our travel budget to support additional professional development opportunities and conduct site visits with SAS projects in years 3-4 of the project. What opportunities for training and professional development has the project provided? The first year of the project supported professional development for our project team members.
These development opportunities included building facilitation skills and data collection and analysis techniques. Our two graduate students in particular, May Pannchi and Brittany Isidore, took on leadership roles, leading group interviews with our project personnel using a semi-structured script, and built their thematic coding skills using Dedoose qualitative software and a two-person system to verify their coding. Year 2 of the project will support broader professional development opportunities for the SAS evaluation community and for our project team in the evaluators' community of practice. How have the results been disseminated to communities of interest? The first year of the project primarily focused on set-up and background research, with limited results to disseminate thus far. While results of the analysis of project director interviews and document reviews are still emerging, we reported back on emerging themes from the project director interviews to our NIFA staff liaisons and NPLs in October 2022. We plan to share our findings in the coming year with a report of SAS program reach, best practices, and emergent impacts, getting feedback from the SAS Evaluators' Community of Practice, the Project Managers' Community of Practice, and the project directors who contributed interview data. We also plan to present initial findings and lessons learned to our colleagues at the American Evaluation Association. What do you plan to do during the next reporting period to accomplish the goals? The second year of the project will build out three main activities to address our objectives: Data analysis and reporting (Obj. 1 and 3): We will finalize the thematic coding of the year 1 interviews with project directors, our document review, and our stakeholder inventory from project narratives, and synthesize our findings into a report for NIFA and the project teams.
We will provide multiple opportunities for feedback and input on the report from project managers, project evaluators, and our NIFA liaisons before submitting our findings to SAS program leaders and administrators at the end of year 2. Building the evaluation community of practice (Obj. 2 and 4): We will establish a monthly learning community, a shared repository of resources, and a listserv for SAS program evaluators. We will also work with this group to develop the first draft of the principles evaluation framework for the SAS program and strategies for implementation in year 3. Planning meetings and facilitated activities (Obj. 1, 2, 3, and 4): We will plan a virtual SAS program conference in 2023 and an in-person conference in 2024 to engage with project team leads and evaluators and support cross-program learning.

Impacts
What was accomplished under these goals? In year 1, the EvalSAS team established project organizational and data management structures while building relationships with SAS project teams and NIFA staff. We focused on three primary activities across our four objectives: Document review: conducting a review of SAS program documents and peer-reviewed principles-focused evaluation literature; Meetings and facilitation: supporting planning and facilitation of a SAS program-wide meeting in Kansas City; Key informant interviews: conducting key informant interviews with project directors and team members to analyze common strengths, challenges, best practices, and motivating values of the SAS program to date. Within each of these activities, we worked to address our four key objectives. Objective 1: Synthesize program activities. Document review: To synthesize the activities of the SAS program to date, our team is conducting a document review. We began reviewing the 2022-2026 USDA strategic plan, individual SAS project narratives, and project progress reports from the 32 current projects funded from 2019-2021. This review and draft summary are still underway and will be incorporated as background information for the analysis of project director interviews. Meetings and facilitation: We met virtually with NIFA National Program Leaders and staff to introduce our team and our plans, to orient ourselves to the SAS program and discuss the current program goals and challenges, and to provide updates on our progress and solicit feedback. At the SAS program directors' meeting in Kansas City, we designed and facilitated a hybrid strengths-mapping exercise with the project leads present to support cross-project discussion of key activities and the underlying values of those activities. In addition, we collected feedback on the structure of the program and future directions from 50 short written surveys and a follow-up discussion.
Key informant interviews: We conducted key informant interviews with project directors and some team members (a total of 28 hour-long interviews with 42 individuals) to better understand the scope, structure, and function of successful SAS projects at different stages of development. The interview guide centered on key project activities, project strengths, challenges, stakeholders, motivating values, and feedback for the SAS program. We began analysis and thematic coding of the interview transcripts at the start of year 2, and the summary report will be completed in project year 2. Objective 2: Develop principles-focused evaluation framework. Document review: We conducted a review of peer-reviewed principles-focused literature to update our theoretical grounding on the latest applications of evaluation methodologies for systems-change projects. Graduate student Brittany Isidore compiled an annotated bibliography of the 20 most relevant papers related to principles-focused evaluation, supplementing the literature search we conducted for our proposal. This document provided the background for a workshop on principles-focused evaluation that our team led at the American Evaluation Association's annual meeting. Meetings and facilitation: We conducted several exercises at the meeting in Kansas City to get project teams discussing the values and principles that guide their work. These facilitated sessions were designed with the goal of helping project directors find common ground across agricultural and scientific sectors, and providing structured space for reflection about team dynamics, motivation, and strengths. We heard themes including adaptability, collaboration and participation, inclusion, sustainability and stewardship, service, and fun. The results of this activity were compiled in our post-meeting feedback and will provide additional background for the summary of SAS program reach and the draft principles-based evaluation framework.
Key informant interviews: We've centered the values and motivations behind SAS projects in our semi-structured, open-ended interviews with project directors (see interview script). These will be included in the thematic analysis of interview transcripts and highlighted in our SAS program emerging impacts report in year 2. Objective 3: Assess stakeholder engagement. Document review: While the majority of our activities to assess stakeholder engagement in the SAS program will take place in years 3-4 of the project, we began our initial review of the stakeholders currently reached by the program in our review of project narratives. We've begun compiling these main groups of growers and ranchers, students, junior scientists, and agricultural processing or business representatives in our database of project narratives. Meetings and facilitation: We included questions about stakeholder engagement with the SAS program in our facilitated project exercises at the NIFA Project Directors Meeting in Kansas City, but otherwise did not explicitly address this area in our first year. Key informant interviews: We included questions about primary stakeholders in our series of interviews with project directors. We learned more about how these groups are involved in projects through partnerships including on-farm trials, field days, and advisory boards. The analysis of interview transcripts is ongoing, and we expect to build out this list of stakeholders in the summary report of common strengths, challenges, and best practices that our project will produce in year 2. Objective 4: Build evaluation capacity in the SAS program. Document review: The document review of principles- and agroecosystems-focused evaluation conducted by evaluation student Brittany Isidore identified key practices and gaps in the principles-focused evaluation literature, as well as key areas for our team to contribute to the evaluation literature.
This literature review was the basis for topics in the evaluation community of practice and an accepted workshop proposal for the American Evaluation Association's annual meeting. Meetings and facilitation: We addressed evaluation capacity-building in our meetings, working with the NIFA Project Directors Meeting planning team weekly from February until April 2022 to plan the hybrid meeting in Kansas City and help the team align the agenda and meeting activities with the meeting goals and the program priorities of sharing best practices and building momentum in the program. We collected evaluative feedback both at the meeting, through written paper surveys and an informal feedback station, and again in key informant interviews. In addition, we organized an evaluators' forum at the meeting to highlight strategies for evaluators to support and document SAS project success. We used the feedback from evaluators present at the meeting both to plan a SAS program-wide evaluation community of practice and to design a workshop for the American Evaluation Association. Key informant interviews: We had a few informal informational interview calls with evaluators in the SAS program between April and September of 2022, and used those calls to exchange methodologies and share project challenges and organizational tools.

Publications

  • Type: Conference Papers and Presentations Status: Published Year Published: 2022 Citation: Landis, G., Mase, A., Pratsch, S., and Isidore, B. April 2022. Evaluating progress and learning in complex systems: Introduction to Sustainable Agricultural Systems Project 2021-10219. Hybrid presentation at USDA-NIFA SAS Project Directors' Meeting.
  • Type: Conference Papers and Presentations Status: Accepted Year Published: 2022 Citation: Isidore, B., Landis, G., Mase, A., and Pratsch, S. June 2022. Applications of Principles-Focused Evaluation for Sustainability and Agroecological Contexts. Workshop proposal for the American Evaluation Association's Annual Meeting.