
Environmental Education Workshop Series: Exploring Program Outcomes and Shared Metrics


The San Diego STEM Ecosystem Environmental Education Working Group hosted an "Exploring Program Outcomes and Shared Metrics" workshop on Tuesday, November 20, at the Fleet Science Center. The workshop centered on bringing local, cross-county organizations into conversation with one another to identify common outcomes desired by environmental education providers, share methods of data collection and program evaluation, and identify ways data is being used.

A previous Working Group workshop at the Living Coast Discovery Center, facilitated by Jim Marshall, Ph.D., of the Department of Educational Leadership at San Diego State University, focused on different types of evaluation, case studies, methods of data collection, and conducting surveys from the initial stages to the end. The most recent workshop acted as a follow-up, giving participants an opportunity to dig deeper and share best practices and methods in a more discussion-based setting.

Led by Tara Fuad (San Elijo Lagoon Conservancy) and Amy Whitehouse (The Energy Coalition), the workshop started with a simple, introductory activity: the famous “inner circle/outer circle” icebreaker. Giving everybody in the group an opportunity to meet somebody new, it helped set the stage for the rest of the workshop and asked the questions, “Who are you?” “What do you do?” and “What are your goals from your work and this workshop?”


Afterward, as a larger group, participants identified three general, common outcomes they hoped to achieve through their work and around which they could collect shared metrics: behavioral change, knowledge gain, and an understanding and appreciation of nature.

These goals present a number of challenges:

  • How can we measure behavioral change, particularly over the long term?

  • How can changes in attitude toward nature be measured, especially when students and visitors start out with relatively positive attitudes?

  • What response rate is suitable or sufficient for measuring program effectiveness? And once data is gathered, how exactly should it be used?

Together, the group discussed ways to approach each of these challenges.

Surveys were the most popular way to evaluate a program immediately and efficiently—usually administered electronically (via Google Forms, SurveyMonkey, etc.).

Beyond surveys, there was also discussion of speaking directly to participants and educators—whether through "walk and talks" with students, teacher-specific focus groups, or even working with educators to implement behavioral change projects with their students.

Finding ways to standardize survey questions, and to move from static response formats toward more "testimonial" types of responses, were hot topics of discussion. One measure of the value of environmental education efforts is sustained interest in an individual program, which can be tracked through documented return visits and re-bookings. Communicating directly with the communities served was seen as a more pragmatic way to evaluate variables that are not easily quantified, most notably behavioral change.

Once all of this data has been gathered, determining and carrying out the next steps can feel almost overwhelming. Evaluating large amounts of data can seem daunting, but the group offered many suggestions for approaching it, such as hiring third-party organizations to analyze large datasets or collaborating with research institutions to publish important findings. The value of working with this data extends beyond the monetary: comprehensive data, at its core, can help us improve our programming and how we work with the communities we serve.

Working with metrics and data can feel intimidating, but through these conversations we found paths for navigating these processes. As a Working Group, the hope is that these common outcomes can lead to shared metrics through which we can develop an understanding of promising practices in environmental education in San Diego County.

Additional Literature

Russ, A. (Ed.). (2014). Measuring Environmental Education Outcomes.