
Extension, Show me the money! Or not.

While the objective of the “Building Extension’s Public Value” workshop is to draft a qualitative message about a program’s public value, many of our stakeholders are concerned about programs’ financial impacts. For example, county commissioners and state legislators want to know how much a program will cost, and whether its impacts will reduce strain on the county or state budget. Many of us, therefore, are eager to quantify the impacts of Extension programs and, wherever possible, convert those impacts into dollars and cents.


Some exciting work is being done on monetizing Extension program impacts. These economic impact briefs from Texas AgriLife Extension are a strong example, and I know there are many more studies.

In future blog entries, I’ll write more about ways researchers and program evaluators are quantifying and monetizing Extension program impacts. However, as persuasive as a dollars-and-cents case can be with some stakeholders, I can think of two reasons to proceed with caution as we pursue more financial and fiscal impact studies.

First, Cooperative Extension does not yet have all the resources and tools necessary to estimate the financial and fiscal benefits of all of our programs. To do a credible job, applied economists, program evaluators, and others would need to devote many more hours to this effort than are currently available. Data must be collected and analyzed, models built and tested, reports written and vetted. The likely result of pressuring program teams to estimate financial impacts while providing them with inadequate resources is a collection of poor-quality analyses that erode Extension’s credibility.

Second, some programs’ public value lends itself more readily to monetization than others. For example, a program that helps reduce a county’s cost of managing its waste can make a strong, straightforward, dollars-and-cents case. On the other hand, methodologies for estimating the fiscal impact of social capital improvements are less well-developed.

Because so many of Extension’s stakeholders focus on monetary value, I worry that those programs whose public value is more easily monetized will rise to the top of the priority list–not because they contribute more public value, but because their value is easier to translate into currency.

The objective of the BEPV workshop is to make strong qualitative cases for all Extension programs that create public value. I hope we can keep doing this, even while we seek the resources necessary to estimate the financial and fiscal impacts of those programs for which that is possible.  

Revise and rewrite

During a typical public value workshop, participants draft a public value message for an Extension program and the presenter and other participants provide feedback. Most groups will need to revise the messages post-workshop before they can be used in publications, websites, or grant proposals.

To help with the revising step, I cobbled together a list of criteria for evaluating messages. Some of the criteria came from University of Minnesota Extension’s Aimee Viniard-Weideman, and I added others myself. Recently, I have incorporated the checklist into workshops for Texas, Nebraska, and Missouri Extension. With University of Missouri Extension, we went a step further and developed an exercise using the checklist. Workgroups started with a message they had drafted earlier, then critiqued and revised it according to the criteria on the list–and any other criteria they thought were important. Each group received feedback from a colleague in a different program area and revised its message a second time. Some really strong messages emerged!

[Image: checklist of criteria for evaluating public value messages]

The checklist–and the accompanying revise-and-rewrite exercise–are not yet part of the public value curriculum, but I am thinking about including them in the next revision. Do you think your organization would find the exercise useful? If so, how would you change or add to the list? What other criteria do you think drafters should consider when they are writing messages for use in their work?

To get the ball rolling, here are some thoughts I’ve had about the criteria:

- This is not an exhaustive list: workgroups may have other criteria that are important for a particular program, stakeholder, or delivery method. For example, some messages will be very effective in print but would need to be composed differently for legislative testimony.
- Revising a message will involve balancing these criteria; some will be more important than others in a particular case. For example, there is a natural tension between the objectives of brevity and credibility, and a group might opt for a slightly longer message in order to present some evidence in support of its case. Additionally, there will be some instances where negative framing will make a better case than positive framing.
- The first three items–all about focusing on the stakeholder–could be combined.

Other ideas? Or can you suggest a completely different direction? 

Learn first, then do

The behavior changes that we seek from Extension’s interventions arise only after program participants learn something new: through our programs they gain knowledge, skills, or awareness. For example, the Alaska Extension client below is learning how to plant a community garden. (Photo by Edwin Remsberg, USDA/CSREES.)

[Photo: an Alaska Extension client planting a community garden]

The diagram that I usually use to illustrate a public value message leaves out this “learning” step. In two recent public value workshops–for Texas AgriLife Extension and for Missouri Extension–I presented the public value message diagram slightly differently than I have before.

[Image: public value message diagram with the learning step added]

Many of us document the learning step in our logic models. End-of-workshop evaluations and follow-up evaluations often measure the increases in knowledge, skills, and awareness. And for program evaluation, that step will continue to be crucial. For a public value statement, however, I tend to de-emphasize the learning step. Because I think stakeholders are more interested in what happens as a result of the learning, I like to move quickly to the behavior changes, outcomes, and public value a program generates. Learning is part of the mechanism that gets us to public value, but it is not an end in itself.

What do you think? Should a public value message keep the learning step implicit, or should it receive more emphasis when we communicate with stakeholders? Do you think the (not very dramatically) altered public value message diagram is a helpful tool or an unnecessary distraction?