
Creating a Culture of Collaboration:

The International Association of Facilitators Handbook

Sandor Schuman, Editor

Collaboration Assessment Tools and Resources

From: Executive Decision Services

The Collaborative Decision Process Questionnaire

A Conceptual Framework: an introductory chapter explaining the conceptual framework underlying the Collaborative Decision Process Questionnaire

Summary of Questions

Sample Questionnaire

Index of Interdisciplinary Collaboration

From: Consensus Building Institute

Community-based Collaboration on Federal Lands Across the Intermountain West: An Evaluation of Participant Satisfaction

“The specific objectives of this research project are to: 1) determine the effectiveness of community-based collaboration from the perspective of the participants; 2) identify which criteria or indicators of success are important to participants; 3) move beyond the use of case studies and conduct an aggregate analysis of data from many cases; and 4) test and refine the Participant Satisfaction Scorecard as a state-of-the-art instrument.” (The Participant Satisfaction Scorecard can be found at the end of the paper.)

An Evaluation of Community-Based Collaborative Approaches

“Evaluating these (collaborative approaches to public participation and resource management) is an important step in drawing lessons to improve collaborative processes. In fall 2001, The Hewlett Foundation awarded a grant to the Montana Consensus Council (MCC) and CBI to test and refine the ‘participant satisfaction scorecard,’ a tool used to evaluate the success of community collaborations. The term community-based collaboration (CBC) was defined as the use of partnering, facilitation, mediation, consensus building, and other ‘alternative dispute resolution’ techniques to prevent and resolve public interest conflicts (i.e., conflicts involving local, state, and/or federal governments as at least one party to the conflict). The goals of the project were to: 1) evaluate how successful CBCs are from the perspective of participants; 2) determine which criteria or indicators of success are most important to participants; and 3) refine the scorecard as an evaluation tool.”

From: U.S. Environmental Protection Agency

Public Involvement – Feedback and Evaluation

“Evaluation, both formal and informal, helps to define, measure, and improve public involvement effectiveness. Feedback from people who participate in public involvement events or processes often points out what works and what does not. Knowing that can lead to events and processes that are more meaningful for participants and contribute more to EPA's decision-making processes.”

This site includes, among other things, Public Involvement Activities Questionnaires for: Community Advisory Groups, Federal Advisory Committees, Listening Sessions, Small Discussion Groups, Public Meetings, Public Hearings, and Stakeholder Negotiations.

From: U.S. Institute for Environmental Conflict Resolution

Program Evaluation System

“The U.S. Institute is firmly committed to evaluating environmental conflict resolution and collaborative problem solving projects and services. The evaluation system is necessary (a) to measure and report on performance and (b) to facilitate continual learning and improvement when evaluation feedback is gathered, analyzed, and shared with appropriate audiences.

“The evaluation system focuses on six areas: Situation/Conflict Assessment Services, Mediation/Facilitation Services, Training and Workshop Services, Facilitated Meeting Services, Roster Program Services, and Program Support and System Design Services.”

From: The Evaluation Center

Deliberative Democracy Evaluation Checklist

Ernest R. House and Kenneth R. Howe, October 2000

“The purpose of this checklist is to guide evaluations from a deliberative democratic perspective. Such evaluation incorporates democratic processes within the evaluation to secure better conclusions. The aspiration is to construct valid conclusions where there are conflicting views. The approach extends impartiality by including relevant interests, values, and views so that conclusions can be unbiased in value as well as factual aspects. Relevant value positions are included, but are subject to criticism the way other findings are. Not all value claims are equally defensible. The evaluator is still responsible for unbiased data collection, analysis, and arriving at sound conclusions. The guiding principles are inclusion, dialogue, and deliberation, which work in tandem with the professional canons of research validity.”

From: Communities Committee

Criteria and Indicators: Finding Meaning for Communities

by Ian Leahy, Editor, and Gerry Gray. Communities and Forests, Fall 2004, Volume 8, Number 3.

“In community forestry circles, we often talk about ‘Criteria and Indicators,’ a shorthand term for an assessment and learning process that is still evolving. From a practical perspective, what are criteria and indicators? More importantly, what, if anything, do they mean to community forestry practitioners more concerned about on-the-ground projects than abstract international agreements? Now, as a three-year project focused on connecting communities into the Montreal Process framework comes to its conclusion, we may finally have an answer.”

Forest Sustainability Indicator Tools for Communities: Indicator ToolKit

“This document is intended to serve as an indicator ‘tool kit’ for forest-based communities that are working on maintaining and enhancing their natural resources as a basis for long-term economic, social, and environmental health. A key component of the tool kit is the Montreal Process Criteria and Indicators, a framework that helps assess ecological, economic, and social aspects of forest resources. Although originally developed to evaluate national progress toward sustainable forests, the framework can be adopted at the local level. Three communities tested this tool kit as part of a three-year pilot project funded by the USDA Forest Service. Appendix D describes the process each community went through and some of the key lessons learned.”

From: Study Circles Resource Center

Evaluation: How Are Things Going?

“Evaluation is a key part of the overall plan to train and support facilitators. Evaluating the facilitation component of your program can help you: learn what is and isn’t working well; monitor how the facilitators are doing and respond to their needs; come up with new strategies to improve your program; explore the impact of the study circle process on the facilitators.”

From: Deliberative Democracy Consortium

Report to the Deliberative Democracy Consortium: Building a Deliberative Measurement Toolbox, by Peter Muhlberger.

“In addition to describing a promising toolbox of measures, this report also introduces a theoretical framework that makes sense of the findings regarding these measures and may help researchers and practitioners better focus their efforts to understand deliberation. Finally, it makes a variety of practical suggestions regarding how practitioners can improve their efforts to demonstrate the value of deliberations.”

From: Center for the Advancement of Collaborative Strategies in Health

Partnership Self-Assessment Tool

The Partnership Self-Assessment Tool was designed to help partnerships:
  • Understand how collaboration works and what it means to create a successful collaborative process;
  • Assess how well their collaborative process is working;
  • Identify specific areas they can focus on to make their collaborative process work better.

From: Sustainable Northwest

Performance Measures Issue Paper

“Performance measures are emerging among federal land management agencies as a new approach to gauge agency progress towards goals, as a basis for funding allocations, and to provide accountability to the Administration, Congress, and the public. … Key Recommendations: 1. Include performance measures related to collaboration, community benefit/capacity building, and change over time in land conditions (forest and watershed resilience; risk reduction from catastrophic wildland fire) in all future agency strategic planning initiatives.”

From: Amherst H. Wilder Foundation

Collaboration: What Makes It Work (Second Edition):

A Review of Research Literature on Factors Influencing Successful Collaborations, by Paul Mattessich, Barbara Monsey, and Marta Murray-Close, June 2001.

“This new, revised edition reaffirms the success factors found in the first edition and adds a new success factor. It also identifies ways in which the previous report has been put to use to improve the practice and research on collaborations. Also included are: a working definition of collaboration, summaries of the major findings, detailed descriptions of each factor, and an extensive bibliography. Finally, and perhaps most important for organizations involved in or considering collaboration, it includes a Collaboration Factors Inventory, a self-guided assessment tool that potential or current collaborators can take to assess the presence of each of the twenty factors.”

Forest Service's Partnership Capacity Assessment Tool

Additional Publications

Assessing Your Collaboration: A Self Evaluation Tool

by Lynne M. Borden and Daniel F. Perkins. Journal of Extension, April 1999, Volume 37, Number 2.

“The tool is a self-assessment exercise allowing groups to rate their collaboration on key factors … goals, communication, sustainability, evaluation, political climate, resources, catalysts, policies/laws/regulations, history, connectedness, leadership, community development, and understanding community. With this tool, collaborative groups identified … factors that need to be worked on. … In all cases, the self-evaluation tool can be used to strengthen the collaborative group.“

Communicating Successes of Public-Private Partnerships:

A Primer on How to Develop Metrics for Sharing Your White Water to Blue Water Partnership Successes. PricewaterhouseCoopers, January 2005.

“Now that you have formed a White Water to Blue Water Public-Private Partnership, how will you communicate the successes and challenges of the Partnership to your various and diverse stakeholders? How do you construct metrics so that they can be clearly understood by the WW2BW PPP management, employees, and other stakeholders, and how do you ensure that the metrics developed serve the WW2BW PPP in meeting its objectives? What information is needed to share your metrics with internal and external stakeholders and to demonstrate progress?”

The Partnership Quiz: How Well Do We Know Each Other?

Other Organizations

The Community Based Collaboratives Research Consortium

National Coalition for Dialogue and Deliberation

National Network for Collaboration

Community Based Collaboration

Meta Collab


Creating a Culture of Collaboration:

The International Association of Facilitators Handbook

Sandor Schuman, Editor

ISBN: 0-7879-8116-8

Hardcover, 498 pp., 2006

Jossey-Bass, an imprint of John Wiley & Sons, Inc.