Think about what the League of Women Voters does, in particular its citizen-education efforts, to help the public cut through distorted and confusing campaign rhetoric and make better-informed decisions when voting.
I liken the League’s efforts to the way Consumer Reports has helped me and others make decisions about purchasing appliances and cars without the “assistance” of advertising or impression-management marketing. Essentially, what the League and Consumer Reports offer is a set of evaluation criteria and a simple framework that can be used to compare apples to apples and oranges to oranges.
If consumers and voters had a powerful Chamber of Consumption and Public Participation (CCPP), the way businesses have their local, regional, and national Chambers of Commerce (yeah, right: the day after hell freezes over), there would be no need for the League of Women Voters or Consumer Reports. Well-funded CCPP educational campaigns might, over time, exercise and develop our technical and socio-cultural decision-making competencies such that the practice of creating and disseminating fallacious, ridiculous, and offensive ads and campaigns might begin to taper off and the public appetite for better and more usable information might begin to grow.
Some would argue that the media already serves the public the way I am dreaming the Chamber of Consumption and Public Participation would. I would argue that the public can only dream about the media ever serving its needs in this way, given the social scale and cultural depth of the educational challenge. However, the Des Moines Register was right to ask the Branstad administration to share its “plan” for reforming schooling throughout Iowa (see “Start selling schools plan now,” 09/11/11, http://www.desmoinesregister.com/article/20110911/OPINION03/309110017/Start-selling-schools-plan-now). The public is, in fact, only getting bits and pieces of the plan; but who knows, maybe all they have are “ideas” right now (see “Iowa officals (sic) unveil ideas for education reform,” 09/06/11, http://www.desmoinesregister.com/article/20110906/NEWS/110906034/Iowa-officals-unveil-ideas-education-reform).
In the editorial from 09/11/11, the paper rightly states, “There are few specifics right now… The administration does not plan to release a comprehensive plan with details, including costs, for about a month.” But then the editorial goes on to say, “That makes it difficult for Iowans to fully evaluate and debate what is being proposed” (my emphasis).
It was at this particular point that my “evaluator” ears went up. The paper’s assumption here is that, once the administration releases the plan, Iowans can go to town evaluating and debating it. I think it does a disservice to treat evaluation so superficially. Granted, as an evaluator, I would be professionally hesitant to expect that anybody could just commence to evaluatin’ without some sort of preparation, if not formal training. I don’t see much evidence that the public is accustomed to engaging in actual evaluation, unless, of course, you redefine evaluation to mean jumping to an opinion-driven conclusion, which, if we could harness its combustibility, would make us energy independent in a flash! Evaluation, like research, is an inquiry strategy that draws on the scientific method for its execution; and we know how some people feel about science.
Now, here’s what the Register could have said: “OK, while the administration is finishing its comprehensive school-reform plan, we have a month to formulate what evaluators call an evaluability assessment.” (For an overview of evaluability assessments, see http://tinyurl.com/5rqgo54.) “We hope that readers will become familiar with and make use of the evaluation criteria and framework that we plan to introduce in the next few weeks. And once the administration does make the plan available, we encourage you to join us in a Public Evaluability Assessment of the Comprehensive Plan so that we can collectively examine (a) the clarity of its goal, (b) the extent to which it takes into consideration all of the stakeholders’ views, and (c) the intervention strategy itself that promises to make a difference. We will be working with Paul Longo, who has graciously offered to share his framework, based on the Performance Blueprint (Longo, 2004 & 2002), for conducting an Evaluability Assessment.* We have inserted the following table as an introduction. In the next few days we will begin by outlining what we already know about the plan. In this way we will become familiar with some of the technical terminology and anticipate the purposes and potential positive consequences of conducting a Public Evaluability Assessment. The paper is also considering applying the same process, and your valuable input, to the Capital Crosswords initiative.”
* Some of you will recognize this framework as the template for formulating a fully articulated strategy.
CONDUCTING AN EVALUABILITY ASSESSMENT

Does the PLAN address the following considerations?

1. Identify the desired External (community) and Internal (organizational) Outcomes corresponding to the required “effort” and intended “effect.”

2. Identify the direct and indirect Targets of the strategic initiative (i.e., customers, clients, beneficiaries, communities, and, of course, the intended consumers and/or users of the performance information that will be generated throughout the execution of this initiative).

3. Identify the Desired Effects of the strategic initiative on the targets (i.e., gains in knowledge, awareness, attitude, skills, and conditional status), along with Measures of Quantity & Quality to provide evidence of the attainment, internalization, acquisition, or approximation of those effects among the targets.

4. Develop the work-in-progress articulation of the Strategy to outline how the strategic initiative will facilitate the attainment, internalization, acquisition, or approximation of the desired effects among the targets.

5. Identify the Direct, Indirect, and Collaborating Service Providers who are positioned to and/or capable of bringing about the attainment, internalization, acquisition, or approximation of the desired effects among the targets.

6. Identify how the Direct, Indirect, and Collaborating Service Providers are expected to perform by establishing Measures of Efficiency and Equity in resource management, collaboration, and service delivery (i.e., the “effort”), so as to envision how much service is, or how many services are, to be provided and how well those services are to be provided.

7. Set forth some idea of how the Performance Information generated by Items #3 and #6 will be collected and used, by whom, how frequently, for whose benefit, and for what internal and external purposes (e.g., mandatory reporting, voluntary reporting, resource allocation, decision-making, public education, marketing, continuous improvement, and so forth). There will be an ongoing challenge to explain the relationship(s) between OUTPUTS and OUTCOMES in an evidentiary, valid, and credible fashion.

8. Provide some articulation of the Tangible and Intangible Resources needed to execute, maintain, and grow the strategic initiative.