OMD/Evaluation
Work in progress; expect frequent changes. Help and feedback are welcome. See the discussion page.
Welcome to the Organization Management and Development Page
Featuring Useful OD Resources, Readings & Strategies
Indicators
Some indicators that are applicable to both female and male workers might be (a rough coding sketch follows the list):
- opportunities for advancement
- feeling their role and/or work was respected
- lack of mentoring
- equality in payment for work of equal value
- ability to take leave for family reasons
- reasonable expectations with respect to time spent working (i.e. acceptance of the need for a work-life balance)
- perceptions of bullying in the workplace
- lack of action in relation to complaints, e.g. about bullying or dishonesty
- flexible hours to allow for child care
- no child care close to the workplace
- sexual harassment
- Other: Explain: _________________
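Below is a rough sketch of how these indicators might be coded as yes/no survey items and tallied by gender. The item keys, the yes/no scoring, and the response format are all illustrative assumptions for the sketch, not part of the checklist above.

```python
# Illustrative only: codes the workplace indicators above as yes/no survey
# items and tallies the share of favourable answers by gender. Item keys and
# the response format are assumptions made for this sketch.
from collections import Counter

INDICATORS = [
    "opportunities_for_advancement",
    "role_or_work_respected",
    "mentoring_available",
    "equal_pay_for_equal_value",
    "family_leave_possible",
    "reasonable_working_hours",
    "no_bullying_perceived",
    "complaints_acted_on",
    "flexible_hours_for_child_care",
    "child_care_near_workplace",
    "no_sexual_harassment",
]

def tally_by_gender(responses):
    """responses: dicts like {"gender": "F", "mentoring_available": True, ...}.
    Returns, per gender, the share of respondents answering True per indicator."""
    yes = {}
    totals = Counter()
    for r in responses:
        g = r["gender"]
        totals[g] += 1
        group = yes.setdefault(g, Counter())
        for item in INDICATORS:
            if r.get(item):
                group[item] += 1
    return {g: {item: yes[g][item] / totals[g] for item in INDICATORS} for g in yes}

# Tiny example with two hypothetical respondents:
print(tally_by_gender([
    {"gender": "F", "equal_pay_for_equal_value": False, "mentoring_available": True},
    {"gender": "M", "equal_pay_for_equal_value": True, "mentoring_available": True},
]))
```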
Unobtrusive Measures
Absolutely! We in museum and cultural evaluation have long employed them. In fact, some of us AEAers who are also Visitor Studies Association (VSA) members did a demo at last year's AEA conference in San Antonio, using tracking and timing to evaluate the reception/poster session. I usually recommend it as one method in a multi-method design, though, not the lone data source. (PS: More visitor studies will be profiled at this year's conference, if you're planning to attend.)
-Kathleen
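For readers new to tracking and timing, the raw observations usually reduce to dwell times per stop (exhibit, poster, etc.). A minimal sketch, assuming simple timestamped enter/exit records; the record format here is hypothetical, not a VSA standard.

```python
# Illustrative sketch: mean dwell time per stop from timestamped enter/exit
# observations. The (visitor, stop, enter, exit) record format is an
# assumption for this example, not a published tracking-and-timing standard.
from collections import defaultdict
from datetime import datetime

def mean_dwell_times(observations):
    """observations: iterable of (visitor_id, stop, enter_iso, exit_iso).
    Returns mean seconds spent at each stop."""
    per_stop = defaultdict(list)
    for _visitor, stop, enter, exit_ in observations:
        seconds = (datetime.fromisoformat(exit_) - datetime.fromisoformat(enter)).total_seconds()
        per_stop[stop].append(seconds)
    return {stop: sum(t) / len(t) for stop, t in per_stop.items()}

obs = [
    ("v1", "poster_A", "2012-10-24T10:00:00", "2012-10-24T10:02:30"),
    ("v2", "poster_A", "2012-10-24T10:05:00", "2012-10-24T10:05:40"),
    ("v1", "poster_B", "2012-10-24T10:03:00", "2012-10-24T10:03:20"),
]
print(mean_dwell_times(obs))  # {'poster_A': 95.0, 'poster_B': 20.0}
```

As Kathleen notes, numbers like these are best read alongside other data sources (interviews, surveys) rather than on their own.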
Malcolm Baldrige National Quality Award
- Malcolm Baldrige National Quality Award
- National Institute of Standards and Technology
- The Cure for the Non-Profit Crisis
Outcomes Mapping
- See post - Justifying OM for Sceptics (sic)
- "How do we get the backing to implement these bottom up approaches i.e. OM, participative techniques etc.? Explaining the process of OM and how each stage serves a purpose, is okay for people who are going to apply OM, but takes too long for most decision makers! "
- application and integration of (participatory) action research
Logical Framework Analysis
The concept of the Logical Framework stems from Operational Research during WW2 and was developed by the RAND Corporation, with the following definition: "...the mathematical or logical framework or set of equations showing the interdependencies of the objectives, the techniques and instrumentalities, the environment, and the resources." (see P. Checkland, Systems Thinking, Systems Practice, p. 136).
In the 1960s this concept was further developed into the Logical Framework Matrix by Leon Rosenberg, the founder of Practical Concepts Inc., who was hired by USAID to improve its planning system.
The big jump ahead was made in 1974, when Ludwig Zils and Peter Siebenhuhner from GTZ teamed up with Moises Thompson from Practical Concepts Inc. and together designed a procedure that allowed stakeholders to build plans starting from an interactive problem analysis and to negotiate a workable solution, which found its form in the logical framework matrix. They integrated Metaplan communication techniques to build a consistent facilitation procedure. Thus, for the first time, a stakeholder-based interactive planning system was devised, of which the Logical Framework was the end product. This method was called ZOPP in German, or Goal Oriented Project Planning (GOPP) in English. Later, other names such as Logical Framework Analysis or Logical Framework Approach appeared, adding to the confusion. But Harry is right that GOPP was the first planning system that allowed stakeholders to make their own plan together, and this was indeed a 'revolution', because until that moment plans could only be made by 'experts'. This is why it was seen by some optimists as a step towards the democratisation of development planning.
Well, this did not happen at all, as we all know, because you may have a nice interactive planning tool, but if the way the project cycle is managed by aid agencies (or any other policy agent) is as vertical as ever, participation remains window dressing. In fact, recently in a discussion list of LFA moderators we made an inventory of all the complaints about the Logical Framework, and it appeared that most of them related to the way the Project Cycle is managed by aid agencies, not the planning tool as such. However, no tool is perfect, and in the course of time practitioners continue to develop it further: jargon like 'results' and 'purpose' is being replaced with more practical terms like 'services' and 'benefits', and elements from other interactive methods (including OM) are being integrated. - Charles
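For readers who have not seen the end product Charles describes, the logical framework matrix is conventionally four levels (goal, purpose, outputs, activities) crossed with four columns (narrative summary, indicators, means of verification, assumptions). Below is a minimal sketch of that structure; the example project and its contents are invented for illustration, and the naming follows common logframe usage rather than any one agency's template.

```python
# Illustrative sketch of a logical framework matrix as a data structure:
# four levels by four columns, following common logframe usage rather than
# any specific agency's template. The example rows are invented.
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogframeRow:
    level: str                      # "goal", "purpose", "output", "activity"
    narrative_summary: str
    indicators: List[str] = field(default_factory=list)
    means_of_verification: List[str] = field(default_factory=list)
    assumptions: List[str] = field(default_factory=list)

logframe = [
    LogframeRow("goal", "Improved household income in the target district",
                ["average household income"], ["national household survey"],
                ["macro-economic conditions remain stable"]),
    LogframeRow("purpose", "Farmers adopt improved irrigation practices",
                ["share of farms using drip irrigation"], ["project field survey"],
                ["water user associations remain active"]),
    LogframeRow("output", "Extension workers trained in irrigation methods",
                ["number of extension workers certified"], ["training records"],
                ["trained staff stay in post"]),
    LogframeRow("activity", "Run regional training workshops",
                ["workshops delivered"], ["attendance sheets"],
                ["trainers and venues available"]),
]

for row in logframe:
    print(f"{row.level:8s} | {row.narrative_summary}")
```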
Programs of Study
Articles, Resources
- Program Evaluation (Wikipedia)
- Evaluation Resources - North Dakota State University
- Evaluation Careers
- Somali Advocacy Centre
- Evidence-Based Evaluation Checklist
- Another checklist - http://nrepp.samhsa.gov/ReviewQOR.aspx
- Phil Bartle's Monitoring & Evaluation page
- Community Empowerment website
- TriGrammic, South Africa (Jonathan Miller)
Community-Based Evaluation (CDC)
CDC's Building Our Understanding: Key Concepts of Evaluation
- Tools for Community Action - What is it, and How do you do it?
- Community-Based Project Evaluation Guide - http://ag.arizona.edu/sfcs/cyfernet/cyfar/evalgde.htm
- Curriculum (PDF) - http://www.sohe.wisc.edu/hdfs/undergrad/syllabi/HDFS766_CommunityBased_Research_Evaluation.pdf
- Certificate Programs & Training
Guides, Toolkits
Baldrige
- Landmark Case Study, 2005
Public Health
- Implementing Practice-Based Evidence: Success Stories
- "Community Engagement, Organization and Development for Public Health Practice." Frederick Murphy, editor. Springer Publishing, 2012. (ISBN: 9780826108012)
Wishlist
Other
- Canadian Evaluation Society
- uOttawa Promotions list
- WooBoard
- Most Significant Change (MSC), Rick Davies - http://mande.co.uk/special-issues/most-significant-change-msc/
- Also see: http://www.mande.co.uk/docs/MSCGuide.pdf
- Project Types
- CDC's National Asthma Control Program. Module 2 of their evaluation manual, Learning and Growing Through Evaluation, has a short section on evaluation anxiety that covers both client and evaluator perspectives. You can find it in Appendix D, here: http://www.cdc.gov/asthma/program_eval/LG-Mod2_DraftFinal_Allsections_Wordaym.pdf
Journals
- Free Online Journals - http://gsociology.icaap.org/methods/resrch.htm
- Sage Journals is offering free access to all of its journals during October 2012. The journals include:
- American Journal of Evaluation
- Journal of Mixed Methods Research
- Sociological Methods & Research
- Sociological Methodology
Other 2
- Excellent Evaluation Resources Site
- What to look for in an evaluator - 1
- What to look for in an evaluator - 2
- What to look for in an evaluator - 3 (page 47) - Kellogg Foundation Evaluation Handbook
Here are some great resources - and discussions! - on preparing an RFP:
- http://pdf.usaid.gov/pdf_docs/PNADO824.pdf
- http://www.chillibreeze.com/articles_various/RFP-writing.asp
- http://www.spectrumscience.com/blog/2011/09/30/top-10-tips-for-writing-a-great-rfp/
- https://bama.ua.edu/cgi-bin/wa?A2=ind9706&L=evaltalk&P=R10930&I=1&X=42F9843D00CB556DC5
- https://bama.ua.edu/cgi-bin/wa?A2=ind1112C&L=EVALTALK&P=R1276&1=EVALTALK&9=A&J=on&X=42F9843D00CB556DC5&Y=leah%40theimprovegroup.com&d=No+Match%3BMatch%3BMatches&z=4 - a very heated (!) discussion on the merits/demerits of RFPs
Surveys & Questionnaires
Methods
- Global Empowerment Evaluation blog
- CDC produced an excellent and comprehensive process evaluation of tobacco control initiatives.
Bibliography
Impact Analysis (quantitative)
- General Elimination Methodology (GEM) approach to evaluating impacts of a particular international aid intervention in 20+ countries. The model is explained in: Scriven, M. (2008). A summative evaluation of RCT methodology: An alternative approach to causal research. Journal of MultiDisciplinary Evaluation, 5(9), 11-24.
- Henry, Julnes & Mark (Eds.) (1998). Realist Evaluation: An Emerging Theory in Support of Practice. Jossey-Bass.
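For orientation, the conventional quantitative impact estimate that these references debate is, at its simplest, a comparison of outcomes between a treatment group and a comparison group. Below is a minimal sketch with invented numbers; it is a plain difference-in-means with a Welch t-statistic, not Scriven's GEM procedure or a realist analysis.

```python
# Illustrative only: a difference-in-means impact estimate with a Welch
# t-statistic. This is the conventional quantitative comparison, not the
# General Elimination Methodology (GEM) described in Scriven (2008).
from statistics import mean, variance
from math import sqrt

def impact_estimate(treatment, control):
    """Return (difference in means, Welch t-statistic) for two outcome samples."""
    diff = mean(treatment) - mean(control)
    se = sqrt(variance(treatment) / len(treatment) + variance(control) / len(control))
    return diff, diff / se

treated = [12.1, 13.4, 11.8, 14.0, 12.9]    # hypothetical outcome scores
comparison = [10.2, 11.1, 10.8, 9.9, 11.5]  # hypothetical outcome scores
diff, t = impact_estimate(treated, comparison)
print(f"estimated impact = {diff:.2f}, t = {t:.2f}")
```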
Stakeholder Analysis
Alexander, I. F. (2005). A Taxonomy of Stakeholders: Human Roles in System Development. International Journal of Technology and Human Interaction, 1(1), 23-59.
Brugha, R., & Varvasovszky, Z. (2000). Stakeholder analysis: a review. Health Policy and Planning, 15(3), 239-246.
Nowell, B. (2009). Profiling Capacity for Coordination and Systems Change: The Relative Contribution of Stakeholder Relationships in Interorganizational Collaboratives. American Journal of Community Psychology, 44, 196-212.
Williams, Bob. Systems Concepts in Action: A Practitioner's Toolkit. http://www.sup.org/book.cgi?isbn=080477062X (Kindle and Adobe Digital Edition versions available)
Williams, Bob. Making Evaluations Matter: A Practical Guide for Evaluators. http://www.cdi.wur.nl/UK/resources/Publications/
Williams, Bob - http://www.bobwilliams.co.nz
Stakeholders vs. Stakes (Bob Williams) - Not only does the same set of stakeholders contain different stakes; different stakeholder groupings will share the same stakes. That latter idea has been the basis of conflict resolution and mediation processes for years. Indeed, we all have different stakes in a single endeavour, whatever our stakeholder role or position. I believe that understanding how people juggle the contradictions in their own stakes (e.g. employability vs quality) is at the core of program behaviour. In contrast, identifying stakeholders tells you less than half the story - and may indeed lead you up the wrong path.
- Making sense of what "objective data" means happens via our own values, beliefs, experiences and motivational (and cultural) drivers. Indeed, this post is based on the fact that I am interpreting, making meaning of, your "objective" story in a different way to you.
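One way to make the stakeholders-versus-stakes distinction above concrete is to record the analysis as a mapping from stakeholders to stakes and then invert it: the stakes shared across groups and the contradictory stakes within one group both become visible. A small sketch with hypothetical groups and stakes, none of them drawn from the references above.

```python
# Illustrative sketch: stakeholder analysis recorded as stakeholder -> stakes,
# then inverted to show which stakes are shared across groups. The groups and
# stakes are hypothetical examples, not taken from the texts cited above.
from collections import defaultdict

stakes_by_stakeholder = {
    "funders":       {"accountability", "value for money", "program reputation"},
    "program staff": {"employability", "service quality", "program reputation"},
    "participants":  {"service quality", "being heard"},
}

# Invert the mapping: which stakeholder groups share each stake?
stakeholders_by_stake = defaultdict(set)
for stakeholder, stakes in stakes_by_stakeholder.items():
    for stake in stakes:
        stakeholders_by_stake[stake].add(stakeholder)

shared = {s: groups for s, groups in stakeholders_by_stake.items() if len(groups) > 1}
print(shared)  # e.g. 'service quality' is shared by program staff and participants
```

The inverted view surfaces the stakes that cut across groups, which is the material Bob suggests matters more than the stakeholder labels themselves.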