What is programme evaluation?

Published 11 Apr 2017

Programme process evaluation focuses on program implementation and operation. A process evaluation can answer questions regarding program effort; identify processes or procedures used to carry out the functions of the program; and address program operation and performance. Process evaluation adds a qualitative dimension to the descriptive statistics furnished by a monitoring system. From a monitoring system, one might learn, for instance, how many counseling sessions of what duration were received by participants in a drug treatment program; or how many new cases were opened for investigation by a law enforcement agency. A process evaluation would expand these indicators by providing information on particular modalities of treatment or tactics of investigation. In addition to furnishing richer data on some of the more quantitatively elusive aspects of program operations, a process assessment can also capture important information about a program’s social, legal, and political context. Since local cultures play a key role in shaping the direction and results of new public policies, understanding the local environment is crucial to understanding how a program works.
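
To make the contrast concrete, the short Python sketch below shows the kind of raw descriptive statistics a monitoring system might furnish for the drug treatment example; the participant identifiers, record layout, and figures are illustrative assumptions, not data from any real programme. A process evaluation would then probe what actually happened inside those sessions and in what context.

    # Minimal sketch: descriptive statistics a monitoring system might report.
    # Participant IDs and durations below are invented for illustration only.
    from statistics import mean

    # Each record is (participant_id, session_duration_minutes).
    session_records = [
        ("P001", 45), ("P001", 60), ("P002", 30),
        ("P002", 45), ("P003", 60), ("P003", 50),
    ]

    sessions_by_participant = {}
    for participant, duration in session_records:
        sessions_by_participant.setdefault(participant, []).append(duration)

    for participant, durations in sorted(sessions_by_participant.items()):
        print(f"{participant}: {len(durations)} sessions, "
              f"mean duration {mean(durations):.0f} minutes")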

CRITERIA FOR JUDGING PROGRAMME PROCESS

The first criterion for judging a programme process is a sound understanding of how the programme operates and how it may be influenced by other factors; the foundational knowledge of the person assessing the programme process is therefore essential. Secondly, qualitative data are also essential for interpreting the results of a quantitative inquiry. It does not make sense, for instance, to declare that a programme has had no discernible effect on its target population or area if observation has revealed that the activities described by the available statistics either failed to take place or were implemented in a qualitatively different form.

Also, in cases where program impacts cannot be empirically measured, a case study may serve as a surrogate source of information about the nature of the results achieved in the particular case being studied.

FORMS EVALUATION CAN TAKE

Evaluation can take any of the following forms: administration of questionnaires, soliciting feedback and opinions, field observations, reviews of media reports and legislative hearings, interviews with programme personnel and host agency staff, and conversations with key personnel in related agencies as well as other stakeholders in the programme being implemented.

TYPICAL ISSUES IN MONITORING SERVICE UTILIZATION

Typical issues include the following:

  • Description of the program environment and the supporting data.
  • Description of the process used to design and implement the program.
  • Description of program operations, including any changes in the program.
  • Identification and description of intervening events that may have affected implementation and outcomes.
  • Documentation such as meeting minutes, reports, memorandums, newsletters, and forms.

MONITORING THE PERFORMANCE OF THE PROGRAMME DELIVERY SYSTEM

To monitor the performance of a programme delivery system, the person doing the assessment should keep the following questions in mind:

  • What problems were encountered in implementing objectives? How were they resolved?
  • Have all planned activities been implemented, and were they accomplished on schedule? If not, what remains to be done?
  • If objectives, plans, or timetables were revised, why was this necessary?
  • What new objectives were added and why?
  • What changes occurred in leadership or personnel? Did they have any effect?
  • What costs were incurred? Did they exceed initial projections?
  • What was the level of resident support in targeted neighborhoods? How did this affect the overall enforcement effort?
  • What lessons have been learned that might be useful to other jurisdictions?
  • Does the process proceed smoothly, or are communications and relations difficult and strained?
  • Do participants work together to identify a range of potential strategies?
  • Do the status and hierarchy of involved personnel interfere with communications?

THREE TYPES OF IMPLEMENTATION FAILURE

  • Inability to clearly define the project goals, scope, and overall approach.
  • Were the project's chances of success well articulated? What were the key expectations? Were they realistic or utopian?
  • Were the tools and other variables for measuring expectations simple or complex?

REFERENCE:

  • Ginzberg, M. J. (1981). Early diagnosis of MIS implementation failure: Promising results and unanswered questions. Management Science, 27(4), 459-478.