Curriculum

What the doctor ordered

What should an Extension program team have on hand to draft a public value message that secures a skeptical stakeholder’s support? Here’s my prescription:

[Image: prescription]

What’s yours? 

Announcing March 2010 train-the-trainer course

You know how your Extension programs benefit your participants, but your programs also create public value when they benefit the rest of the community. Nationwide, participants in “Building Extension’s Public Value” workshops have learned how their programs create public value and how to communicate this value to stakeholders whose support is crucial to Extension.

Now, you have an opportunity to learn how to conduct these workshops for Extension scholars at your own institution by participating in an online train-the-trainer program for “Building Extension’s Public Value.”

With your registration fee, you get:

• Four hours of instruction in how to conduct “Building Extension’s Public Value” workshops from the creator of the workshops, Dr. Laura Kalambokidis, Associate Professor in the Department of Applied Economics at the University of Minnesota.
• Access to the Building Extension’s Public Value Presenter’s Guide, the Building Extension’s Public Value Workbook, and the accompanying PowerPoint™ presentation to download and print for your use in conducting workshops for University and Extension scholars at your institution.

* To register, go here. The registration fee is $100 per participant. To encourage institutions to send teams of staff to the training, the maximum total registration fee for any institution is $500.

* The training will be conducted online, via UMConnect, and will consist of two two-hour sessions, with all participants attending both sessions. The training sessions will be Monday, March 29, and Wednesday, March 31, 2010, at 2:00-4:00 Eastern; 1:00-3:00 Central; 12:00-2:00 Mountain; 11:00-1:00 Pacific; 9:00-11:00 Hawaii.

* Prior to the beginning of the sessions, participants will receive an email notifying them of how to participate in the two online sessions and how to download the training materials, including the Building Extension’s Public Value Presenter’s Guide, the Building Extension’s Public Value Workbook, and the accompanying PowerPoint™ presentation.

* Questions about registration? Contact our help desk at [email protected] or 800-876-8636.

* Questions about program content and relevance to your work? Contact Laura Kalambokidis at [email protected].

* Other questions? Contact Diane McAfee at [email protected].

This I believe to be true today

Substantiating the claims that we make about Extension programs’ public value is crucial to Extension’s credibility. However, we don’t always have enough time in a “Building Extension’s Public Value” workshop to assemble the documentation (journal articles, program evaluation reports, etc.) to support the claims embedded in a newly drafted public value message. The purpose of the “Research Agenda” workshop module is to list those claims and create a plan for assembling the supporting documents, or even for conducting new program evaluations or research.

[Image: research agenda]

Sometimes, a workshop group is torn between wanting to draft a public value message that is persuasive–but, maybe a bit aspirational–and one that contains only claims for which the team has strong supporting evidence. I usually encourage groups to be creative and persuasive during the workshop and worry about the documentation later, but not to publicly use a public value message until they are sure it is defensible. Understandably, this guidance occasionally leads to draft public value messages that include some pretty far-fetched claims.

Cynthia Crawford, Family Financial Education Specialist and County Program Director for University of Missouri Extension in Saline County, MO, has a suggestion for helping workshop groups stay creative while not veering too far off into “aspirational” territory. Cynthia suggests telling teams drafting public value statements that they don’t have to have the documentation to substantiate their claims today (during the workshop), but they do have to believe the statements are true today. Cynthia reports that this bit of direction has led to remarkably strong–and credible–draft public value messages in short amounts of work time.

I will definitely adopt Cynthia’s “you have to believe it today” guidance the next time I teach a BEPV workshop. Do you have any other suggestions for helping teams “think big” while staying grounded?

Reporting by topic, not by table

In the Building Extension’s Public Value Presenter’s Guide, a small-group activity follows the presentation of the various ways a program creates public value. It is a brainstorming exercise, during which groups record as many ways as they can that their program satisfies any of the public value criteria, listed below.

[Image: summary of public value criteria]

How I handle reporting back at the end of this activity depends on time constraints. If I have plenty of time, I ask groups to report back any number of the ways their program meets any of the criteria. When time is tight, I ask them only to share their reactions to the activity itself (what worked, what didn’t, what questions came up), noting that they will use all of their notes from the exercise later in the workshop.

I tried something different last week when I taught the workshop for the LSU AgCenter in Baton Rouge, LA. Instead of asking each table to report one by one, I went down the list of criteria. First, any group was welcome to share ways their program satisfied the information criterion; next, any group could report how their program addressed fairness; and so on. This approach takes a bit of time, but I think it might help to break up the workshop structure a little bit.

What do you think? Have you taught the BEPV workshop? Have you tried different ways of having groups report back? Were you at the LSU workshop? How do you think it worked there, aside from the fact that I gave confusing directions to start?

Extension, Show me the money! Or not.

While the objective of the “Building Extension’s Public Value” workshop is to draft a qualitative message about a program’s public value, many of our stakeholders are concerned about programs’ financial impacts. For example, county commissioners and state legislators want to know how much a program will cost, and whether its impacts will reduce strain on the county or state budget. A lot of us, therefore, are eager to quantify the impacts of Extension programs and, wherever possible, convert those impacts into dollars and cents.


Some exciting work is being done on monetizing Extension program impacts. These economic impact briefs from Texas AgriLife Extension are a strong example, and I know there are many more studies.

In future blog entries, I’ll write more about ways researchers and program evaluators are quantifying and monetizing Extension program impacts. However, as persuasive as a dollars-and-cents case can be with some stakeholders, I can think of two reasons to proceed with caution as we pursue more financial and fiscal impact studies.

First, Cooperative Extension does not yet have all the resources and tools necessary to estimate the financial and fiscal benefits of all of our programs. To do a credible job, applied economists, program evaluators and others would need to devote many more hours to this effort than are currently available. Data must be collected and analyzed, models built and tested, reports written and vetted. The likely result of pressuring program teams to estimate financial impacts while providing them with inadequate resources is a collection of poor quality analyses that erode Extension’s credibility.

Second, some programs’ public value lends itself more readily to monetization than others. For example, a program that helps reduce a county’s cost of managing its waste can make a strong, straightforward, dollars-and-cents case. On the other hand, methodologies for estimating the fiscal impact of social capital improvements are less well-developed.

Because so many of Extension’s stakeholders are concerned about monetary value, I am concerned that those programs whose public value is more easily monetized will rise to the top of the priority list–not because they contribute more public value, but because their value is easier to translate into currency.

The objective of the BEPV workshop is to make strong qualitative cases for all Extension programs that create public value. I hope we can keep doing this, even while we seek the resources necessary to estimate the financial and fiscal impacts of those programs for which that is possible.  

Using cultural diversity to narrow an information gap

One of the ways Extension and other outreach programs can build public value is by providing information that allows consumers and business owners to make better choices. In other words, Extension programs help to close the “information gap” that prevents people from doing the best they can for themselves and their businesses.


In the “Building Extension’s Public Value” workshop, I caution participants about over-using the information gap as an argument for Extension program funding. All of our programs provide information, but if we try to use a single argument to justify all of our programs, we are unlikely to be successful. I suggest that we reserve the information gap argument for cases where it is likely to be strongest. Which cases are those? I think that when you can answer “yes” to at least some of the following questions, you can make a strong case with the information gap.

[Image: list of information gap questions]


Take a look at the fifth question on the list, providing information to people who would not otherwise have access. Extension programs address access in a variety of ways, including providing information at low or no cost, bringing programs to geographically isolated areas, giving people materials written in their native language, and delivering information in ways other than through written materials.

At a recent BEPV workshop for the University of Hawaii Cooperative Extension Service, a participant suggested that cultural norms could also create a barrier to accessing and using Extension’s information. For example, traditional gender or generational roles might prevent some members of a community from participating in a program. When we identify and address such barriers–indeed, when we allow cultural differences to inform and enrich our programs–we can be more successful in closing the information gap and building public value.

[Image: Hawaii license plate]
Incidentally, it is not at all surprising that the suggestion to consider cultural norms arose at the Hawaii workshop. Culture is deeply valued by the residents of the 50th state, which has the highest ethnic minority population in the nation. Mahalo, Hawaii, for reminding us to consider both cultural barriers and cultural contributions to Extension programs. 

Closing the loop between research and Extension

When I ask Extension professionals to name Extension’s strengths relative to other providers of outreach education, the connection between Extension programs and university research is inevitably the first item on the list. We build on that key strength when we deliver programs that are based on the best research, and the community’s needs inform the research agenda: that is, when we close the loop between research and Extension. I focused on this relationship–substituting “engagement” for “Extension”–at the Purdue Scholarship of Engagement Workshop last week.

[Image: loop diagram]

Here’s how I think an Extension team can close that loop: They (or someone else) conduct research that leads to a discovery (knowledge creation) that could help address a condition of concern in a community (middle left box in the diagram). The team designs their Extension or engagement program with a curriculum that is based on the new knowledge, as well as existing best practices regarding program design and delivery (middle box). If the team is truly “engaged” with their community partner, then the partner’s needs and strengths will also inform the design of the program. The team conducts their program (middle right box) while also collecting data and observations that can be used to inform the research agenda (top box). This way, what is observed and learned “in the field” makes its way back to the lab to influence the direction of future research. The team also implements their program evaluation plan, which helps them evaluate the impact of the Extension or engagement program (lower right box). The results of the evaluation help them improve the program design (lower middle box), so greater impact will result next time.

[Image: detailed loop diagram]

Where does public value come into this scheme? I can think of at least two places: First, in the design phase, the team will plan how they expect the program to create public value. What are the expected impacts and outcomes, and how do they create benefits for stakeholders who are not the program’s direct beneficiaries? Second, in the evaluation phase, team members will assess whether those expected outcomes were generated: whether public value was created.

[Image: public value loop diagram]

I can think of a few ways a team can increase their success at closing the loop:

* Form a team that includes researchers, Extension educators, and program evaluators.
* Embed the program evaluation plan into program design.
* Develop and implement a plan for collecting observations and data arising from the Extension or engagement program.
* Keep up to date on relevant research developments.
* Plan for steps to take once the program ends (e.g., analyzing data and revisiting the program design).

Do you think closing the loop between research and engagement is crucial? Can you suggest ways to make it happen more systematically in Extension? 

Revise and rewrite

During a typical public value workshop, participants draft a public value message for an Extension program and the presenter and other participants provide feedback. Most groups will need to revise the messages post-workshop before they can be used in publications, websites, or grant proposals.

To help with the revising step, I cobbled together a list of criteria for evaluating messages. Some of the criteria came from University of Minnesota Extension’s Aimee Viniard-Weideman, and I thought up some myself. Recently, I have incorporated the checklist into workshops for Texas, Nebraska, and Missouri Extension. With University of Missouri Extension, we went a step further and developed an exercise using the checklist. Workgroups started with a message they had drafted earlier and critiqued and revised it according to the criteria on the list–and any other criteria they thought were important. Each group received some feedback from a colleague from a different program area, and they revised the messages a second time. Some really strong messages emerged!

[Image: message checklist]

The checklist–and the accompanying revise-and-rewrite exercise–are not yet part of the public value curriculum, but I am thinking about including them in the next revision. Do you think your organization would find the exercise useful? If so, how would you change or add to the list? What other criteria do you think drafters should consider when they are writing messages for use in their work?

To get the ball rolling, here are some thoughts I’ve had about the criteria:

* This is not an exhaustive list: Workgroups may have other criteria that are important for a particular program, stakeholder, or delivery method. For example, some messages will be very effective in print, but should be composed differently for legislative testimony.
* Revising a message will involve balancing these criteria; some will be more important than others in a particular case. For example, there is a natural tension between the objectives of brevity and credibility, and a group might opt for a slightly longer message in order to present some evidence in support of their case. Additionally, there will be some instances where negative framing will make a better case than positive framing.
* The first three items–all about focusing on the stakeholder–could be combined.

Other ideas? Or can you suggest a completely different direction? 

Learn first, then do

The behavior changes that we seek from Extension’s interventions arise only once program participants learn something new: through our programs they gain knowledge, skills, or awareness. For example, the Alaska Extension client below is learning how to plant a community garden. (Photo by Edwin Remsberg USDA/CSREES.)

[Image: planting a community garden]

The diagram that I usually use to illustrate a public value message leaves out this “learning” step. In two recent public value workshops–for Texas AgriLife Extension and for Missouri Extension–I presented the public value message diagram slightly differently than I have done before.

[Image: learning diagram]

Many of us document the learning step in our logic models. End-of-workshop evaluations and follow-up evaluations often measure the increases in knowledge, skills, and awareness. And for program evaluation, that step will continue to be crucial. For a public value statement, however, I tend to de-emphasize the learning step. Because I think stakeholders are more interested in what happened as a result of the learning, I like to move quickly to the behavior changes, outcomes, and public value a program generates. Learning is part of the mechanism that gets us to public value, but it is not an end in itself.

What do you think? Should a public value message keep the learning step implicit, or should it receive more emphasis when we communicate with stakeholders? Do you think the (not very dramatically) altered public value message diagram is a helpful tool or an unnecessary distraction?  

Announcing March 2009 train-the-trainer course for BEPV

You know how your Extension programs benefit your participants, but your programs also create public value when they benefit the rest of the community. Nationwide, participants in “Building Extension’s Public Value” workshops have learned how their programs create public value and how to communicate this value to stakeholders whose support is crucial to Extension.

Now, you have an opportunity to learn how to conduct these workshops for Extension scholars at your own institution by participating in an online train-the-trainer program for “Building Extension’s Public Value.”

With your registration fee, you get:

• Four hours of instruction in how to conduct “Building Extension’s Public Value” workshops from the creator of the workshops, Dr. Laura Kalambokidis, Associate Professor in the Department of Applied Economics at the University of Minnesota.
• Access to the Building Extension’s Public Value Presenter’s Guide, the Building Extension’s Public Value Workbook, and the accompanying PowerPoint™ presentation to download and print for your use in conducting workshops for University and Extension scholars at your institution.

* To register, go here. The registration fee is $100 per participant. To encourage institutions to send teams of staff to the training, the maximum total registration fee for any institution is $500.

* The training will be conducted online, via UMConnect, and will consist of two two-hour sessions, with all participants attending both sessions. The training sessions will be Tuesday, March 3, and Thursday, March 5, 2009, at 11:00-1:00 Eastern; 10:00-12:00 Central; 9:00-11:00 Mountain; 8:00-10:00 Pacific.

* Prior to the beginning of the sessions, participants will receive an email notifying them of how to participate in the two online sessions and how to download the training materials, including the Building Extension’s Public Value Presenter’s Guide, the Building Extension’s Public Value Workbook, and the accompanying PowerPoint™ presentation.

* Questions about registration? Contact our help desk at [email protected] or 800-876-8636.

* Questions about program content and relevance to your work? Contact Laura Kalambokidis at [email protected].

* Other questions? Contact Diane McAfee at [email protected].