Are Your Programmes Achieving Intended Outcomes?
Community engagement is not a one-off effort. To achieve programme objectives and sustainable impact, it is crucial to evaluate whether outreach efforts have met stakeholders’ needs.
How do you identify relevant impact indicators?
What tools can you use to collect useful data?
How do you analyse data to evaluate actual effectiveness?
How do you present data in a user-friendly manner?
Define Success Indicators, Evaluate Social Impact
Join this 2-day practical workshop to learn the different approaches to community engagement measurement – formative, summative, process and outcomes. Discover how to set evaluation objectives based on your programme goals. Follow a step-by-step process for developing evaluation questions and success indicators. Acquire a data collection and analysis framework covering both quantitative and qualitative methods. Learn how to present data effectively to internal and external stakeholders.
- Led by Matt Healey from Australia, with 10 years’ consulting experience for government and NPOs
- In-depth analysis of real-life case studies
- Hands-on exercises for application to your organisation’s context
- Receive data collection templates including surveys and interview guides
- Take home a guide on ‘Rapid Data Collection for Events’
- Develop a programme logic to outline programme outcomes and goals
- Map stakeholders and information needs to set evaluation objectives
- Draft evaluation questions and indicators for programme outcomes
- Build a data collection and analysis framework for your programme
Benefits of Attending
- Identify what your programmes are seeking to achieve
- Develop a programme logic to outline outcomes and goals
- Understand stakeholder needs to inform results reporting
- Determine what ‘success’ looks like to stakeholders
- Develop different types of evaluation questions
- Identify indicators for outcomes and programme success
- Classify data sources and prioritise the types of data you need
- Examine different types of data collection tools and approaches
- Assess strengths and weaknesses of quantitative and qualitative data
- Acquire data analysis and data visualisation techniques
- Adopt various reporting formats for different stakeholders
Matt has over 10 years’ experience in the not-for-profit sector and international education. Matt co-founded the social research and evaluation firm First Person Consulting in early 2015 and has provided research and evaluation services to all types of organisations, including community organisations, large government agencies and international not-for-profits. He is recognised for his adaptable and valued services which support his clients in delivering projects and programmes to improve social and environmental outcomes.
Engaging the community is a key element of many of the projects and programmes that Matt has evaluated and these projects cover a range of content areas including health promotion, mental health, education, financial literacy, sustainability and climate change.
Recent evaluation experience includes:
- Evaluation of the Ride or Walk to School Programme, ACT Health: Focused on increasing rates of active travel (riding or walking) to school among primary school aged children
- You’re the Boss Financial Literacy Programme Evaluation, The Salvation Army: Engaging community members from low socio-economic backgrounds to build their skills in financial literacy
- Evaluation of the Social Innovator’s Challenge, Movember Foundation: Seed funding provided to 12 projects to test different approaches to building social connectedness amongst different groups of men
- Evaluation of the Community Engagement Programme, Victorian Responsible Gambling Foundation: A place-based programme engaging with communities to identify interventions that best meet their needs
- Co-Design Learning Project, Australian Red Cross: A co-design programme to design three new services for Indigenous and other groups that access Red Cross services
- Design of a Community Education Programme to support the Wildlife Biodiversity Reforms, Office of Environment and Heritage: Consulting a range of community members and stakeholders to inform the design of a new community education programme
Matt is Secretary of the Australasian Evaluation Society (AES) Special Interest Group (SIG) on Human Centered Design and Evaluation. He also presents regularly at the annual AES conference on a range of topics suitable for attendees of all types. Matt is on the convening committee for the 2018 annual conference for the AES.
Past Delegate Testimonials
I liked that Matt uses a lot of real-life case studies to illustrate the application of the concepts shared.
Matt is high energy, very knowledgeable and thorough. The programme is comprehensive and he explains it so that the material is easy to apply immediately.
Very informative session, good templates.
A lot of useful content.
Who Should Attend
Senior-level executives responsible for Community Engagement, Public Outreach and Programme Development
Registration: 8.30am • Workshop: 9.00am – 5.00pm
Morning and afternoon refreshments and lunch will be served at appropriate intervals.
Session 1: What are we evaluating? Understanding your programme
Evaluations are about understanding the outcomes and achievements of programmes and learning how those programmes can be improved in the future. With an introduction to the evaluation planning process using a pre-prepared case study, this session explores the first step in evaluating your programme – identifying what your programmes are hoping to achieve.
- Introduction to evaluation: What it is, when you do it, why it is useful
- What is the problem or need being addressed by the programme?
- What are the outcomes and goals of the programme?
- Who in the community will be engaged through the programme?
- What resources are available for the programme and the evaluation?
Exercises: Develop a programme logic to describe the programme activities, outcomes and goals
Session 2: Why are we evaluating? Understanding your audience and defining objectives
Once you understand your programme with a programme logic, it is then important to set objectives for the evaluation. These can relate to a range of priorities, with one of the most common being the ‘audience’ – usually sponsors / funders, the community and other organisations. Having clear objectives and understanding the needs of your audience allow the evaluation to directly inform future decision making, as well as inform how the results will be reported.
- Who are the key stakeholders in the programme?
- What do they need to know, and when?
- What does ‘success’ look like to these stakeholders?
Exercises: Stakeholder mapping and information needs template, develop evaluation objectives
Session 3: What do we want to find out? Focusing on the measurement effort
With a programme logic, an understanding of stakeholders and clear objectives in place, the next step is to focus the measurement effort. This stage identifies the specific areas of interest, as well as the indicators that will determine whether your programme is successful.
- Identifying evaluation questions (what we want to know)
- Developing different types of evaluation and measurement questions for the case study
- Identifying indicators for outcomes and programme success
Exercises: Drafting evaluation questions and indicators for programme outcomes
Session 4: Data and reporting
The next stage in the evaluation planning phase requires us to think about data and how data is communicated. Community engagement programmes produce a range of data, and understanding what data is available and where it comes from is important for effective evaluation and measurement. Another consideration is prioritising the types of data you will need to avoid burdening community members, as well as the various approaches to data analysis and data visualisation.
- Identify data sources and their role in the evaluation
- Review different types of data collection tools and approaches, such as surveys (in-person and online), interviews, vox pops, rapid data collection methods and observations
- Strengths and weaknesses of different data types including quantitative, qualitative and mixed methods
- Data analysis and data visualisation including how to best present data to a variety of audiences (including internal and external stakeholders)
- Formats for reporting depending on different stakeholders
Exercises: Completion of data collection and analysis framework, planning for reporting
Templates: Delegates will receive worked examples of data collection templates (e.g. surveys, interview guides) and a guide on ‘Rapid Data Collection for Events’
Session 5: Evaluation planning for your context
Drawing on our learning from sessions 1-4, this session will focus on understanding how evaluation works in the context of delegates’ own programmes, projects or services. This includes:
- Why is evaluation done in your organisation?
- Who uses the evaluation results?
- What sorts of projects or programmes typically require evaluation?
Exercises: Identify a relevant programme, project or service for evaluation planning
Session 6: Logic models and evaluation objectives
This session takes the exercises from sessions 1-5 and applies them to delegates’ own case studies. This provides an opportunity to further apply the skills and tools learned, and ensures that everyone has an opportunity to practise and learn from each other.
Exercises: Develop a programme logic model and a set of objectives that help to identify what delegates are interested in learning about their programme
Session 7: Evaluation questions, indicators and data
Once we have developed a logic model and objectives for evaluation, we will walk through a step-by-step process on:
- Developing evaluation questions and indicators
- Identifying the appropriate methods for data collection and analysis
- Understanding how ‘success’ will be measured for your programme, project or service
- Exploring different opportunities for reporting
Session 8: Are you on the right track?
To help consolidate learning, the final session will be a reflective session for delegates to discuss what they have learned, what they still need to know, and opportunities for applying evaluation in their work.