Evaluation and Youth Work
Public Group
How can evaluation support youth programs to share the REAL story of their programs? We invite you...
Special Topics
Posted by REX on November 5, 2021 at 1:40 pm
A collection of thoughts, reflections and resources from the Evaluation and Youth Work community.
Saima replied 4 months ago · 13 Members · 18 Replies
18 Replies
-
I feel deeply disappointed by the limited availability of internship and apprenticeship opportunities in the youth sector, especially in the skilled trades. Many young people graduating from competitive college programs, such as those for electricians, struggle to find apprenticeships in their field, which are essential for entering the job market. After repeatedly applying and receiving rejections, they often become discouraged and stressed, may turn to unhealthy coping mechanisms, and can develop a negative outlook on life. This is harmful not only to youth but to society as a whole.
Is there any ongoing evaluation or research being conducted to understand and address this issue? Will there ever be a system that guarantees or at least increases the certainty of securing apprenticeships after graduation, helping youth transition into the workforce meaningfully and responsibly? Many young workers who do find placements, particularly in smaller enterprises, report experiencing mistreatment or unprofessional behavior from employers.
I believe this area urgently needs attention, and evaluations of such issues could help identify gaps and lead to the development of more supportive, structured, and positive strategies for youth employment in skilled trades.
-
This is a great resource. Really helped me wrap my head around what counts as process evaluation. Thanks for creating such a helpful, clear glossary!
-
Thanks, Nads, for posing this question! Great to see the discussion and resources shared by folks; I’m happy to provide a few summarizing points and additional resources!
Program evaluation and performance measurement are distinct yet complementary methods for learning about and improving organizations or program activities. To understand the differences between the two, we can explore their definitions, some guiding questions, and the frequency with which you would engage in these inquiry activities:
Program evaluation (PE): PE offers a systematic approach through which we collect, analyze, and make use of information. The intention of evaluation is to assess the effectiveness and efficiency of a program or organization, and to improve and inform future activities. PE is conducted as a discrete activity to assess whether a program is working, why, and for whom.
Performance measurement (PM): In contrast, PM is an ongoing process that aims to monitor and report on a program’s progress towards its pre-established goals. As a continuous process, PM uses key indicators that are reported on regularly and used to identify the need for adjustments in order to achieve stated objectives or goals.
Gary provided a great quote that captures an important difference between the two approaches! To add to this, each approach can be guided by different questions. Program evaluation addresses the questions “why did it happen?”, “how did it happen?”, and “where do we go from here?” When using PM, by contrast, we may aim to answer “what occurred?” or “what is occurring?”
Given the inter-related nature of these two approaches, they are often confused, making it challenging to determine when, where, and how to use each tool. The CDC offers a great resource that delves into how program evaluation and PM can complement and support one another, and provides considerations for employing PE or PM.
For a comprehensive comparison of these methods, including two case studies examples, I also recommend exploring: https://www.evaluation.gov/assets/resources/Performance-Measurement-and-Evaluation.pdf
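To make the PM side concrete: ongoing monitoring often boils down to comparing a few key indicators against pre-established targets each reporting period and flagging where adjustments are needed. Here is a minimal sketch; the indicator names, targets, and numbers are all hypothetical, not drawn from any real program.

```python
# Minimal performance-measurement sketch: compare hypothetical key
# indicators for a youth program against pre-established targets.

def check_indicators(actuals, targets):
    """Return (indicator, actual, target, on_track) for each target."""
    report = []
    for name, target in targets.items():
        actual = actuals.get(name, 0)  # missing indicator counts as 0
        report.append((name, actual, target, actual >= target))
    return report

# Hypothetical quarterly targets and actuals
targets = {"youth_enrolled": 50, "sessions_delivered": 12, "mentor_matches": 20}
actuals = {"youth_enrolled": 46, "sessions_delivered": 13, "mentor_matches": 20}

for name, actual, target, on_track in check_indicators(actuals, targets):
    status = "on track" if on_track else "needs adjustment"
    print(f"{name}: {actual}/{target} ({status})")
```

The point of the sketch is the cadence, not the code: the same small set of indicators is checked every period, which is what distinguishes PM from a discrete evaluation study.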
-
Check Google for a doc called Performance Measurement to Evaluation from the Urban Institute. It’s a great read.
“Performance measurement tells what a program did and how well it did it. Evaluation tells the program’s effect on the people, families, or communities it is serving.”
-
Airtable is an excellent tool for project management. I haven’t explored the chart options, but I suspect it offers more of a 3D feel for connecting data than Excel’s 2D grid.
I hadn’t explored pivot tables before; this program introduced them to me. I still struggle to grasp how to use them, mainly because of the type of data I keep.
Perhaps when I have more quantitative data that needs to be presented quickly in a digestible manner, I can use them. I can see the pros, mainly those discussed in the video. The cons could be that pivot tables don’t work well for all types of data (they suit quantitative data best), and it can be complex to get the desired diagram or presentation unless one uses third-party design software to achieve the desired look.
In the meantime, I will keep learning and practising.
At this point, pivot tables no longer intimidate me.
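For anyone else getting comfortable with pivot tables: at their core they just group rows by one or more fields and aggregate a value column. A rough pure-Python sketch of that idea, with made-up program attendance rows (the program names and numbers are invented for illustration):

```python
from collections import defaultdict

# Toy "spreadsheet" rows: one row per monthly attendance record.
# All program names and counts below are made-up examples.
rows = [
    {"program": "Mentoring", "month": "Jan", "attendees": 12},
    {"program": "Mentoring", "month": "Feb", "attendees": 15},
    {"program": "Arts",      "month": "Jan", "attendees": 8},
    {"program": "Arts",      "month": "Feb", "attendees": 11},
]

def pivot_sum(rows, row_field, col_field, value_field):
    """Pivot: sum value_field grouped by (row_field, col_field)."""
    table = defaultdict(lambda: defaultdict(int))
    for r in rows:
        table[r[row_field]][r[col_field]] += r[value_field]
    return {k: dict(v) for k, v in table.items()}

print(pivot_sum(rows, "program", "month", "attendees"))
# {'Mentoring': {'Jan': 12, 'Feb': 15}, 'Arts': {'Jan': 8, 'Feb': 11}}
```

A spreadsheet pivot table does the same grouping and aggregation, just interactively; seeing it as "group, then sum" can make the feature feel less opaque.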
-
Has anyone tried using pivot tables to analyze program data in spreadsheets? What are some pros and cons? (Using Spreadsheets in Program Evaluation Module 3 Discussion Topic)
Sonya Howard discussion post response:
Apart from this course, I haven’t actually had a chance or need to use pivot tables in order to analyze program data in spreadsheets. I keep meaning to teach myself how to do it using online resources, and this course is an excellent opportunity to try it.
I don’t really know the pros and cons of using pivot tables, beyond those mentioned in the lectures for the YouthREX course, Using Spreadsheets in Program Evaluation. One pro seems to be that they offer a quicker way of focusing in on and analyzing the data you may be most interested in.
I am aware that there are some cloud-based tools, like Airtable, that may be more robust than a spreadsheet and that have some database-like functionality. From the minimal exposure I’ve had to Airtable, it seemed to take things a step further than pivot tables, allowing the user to customize and program it even more. As robust as Airtable is, it does seem to take a lot of work and planning on the front end to develop the table exactly how you need it. It also involves a bit of a mental shift from a 2-D spreadsheet to more of an almost 3-D view of connecting data across worksheets and databases (if that makes any sense). It also costs money with an annual subscription, I believe, to set up and use (beyond perhaps a possible free smaller version; I’m not sure), but I think data can be imported from Excel or Google Sheets into Airtable and exported back out again into a spreadsheet or possibly database format.
-
It was mentioned in Module 2 that short-term outcomes are measured within 6 months to 1 year after programme implementation, intermediate outcomes within 1-2 years, and long-term outcomes after a longer period (no definitive timeframe given). Even setting aside long-term outcomes, this would mean reaching out to participants about half a year, a year, and two years after their participation in the programme. Do any organisations actually track and measure outcomes a year or two after programme implementation, and what percentage of participants usually contributes data a year on? Also, I was wondering: if only a small percentage of participants actually contribute their data a year or two on, how accurate will the outcome results be in evaluating whether the programme meets its objectives?
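One way to see the response-rate problem concretely is to compute how the follow-up sample shrinks at each measurement wave. A small sketch, with an entirely invented cohort size and response counts (not real data from any programme):

```python
# Sketch of follow-up attrition across outcome-measurement waves.
# All counts below are invented for illustration.

def response_rates(enrolled, responses_by_wave):
    """Percent of the original cohort responding at each wave."""
    return {wave: round(100 * n / enrolled, 1)
            for wave, n in responses_by_wave.items()}

enrolled = 120  # hypothetical cohort size at programme completion
responses = {"6 months": 84, "1 year": 51, "2 years": 27}

for wave, rate in response_rates(enrolled, responses).items():
    print(f"{wave}: {rate}% responded")
```

If only a fifth or a quarter of participants respond at two years, the accuracy concern above is real: results can be biased whenever responders differ systematically from non-responders, which is why long-term follow-up findings are usually reported alongside the response rate itself.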
-
In May, @KhadijahKanji and I co-hosted a workshop for Cultural Pluralism in the Arts Movement Ontario’s Gathering Divergence Multi-Arts Festival & Conference. This session unpacked Asking About Gender: A Toolkit for Youth Workers and created space to explore strategies for inclusion and innovation for youth arts programs.
We had a great conversation with Daniel Carter from Buddies in Bad Times Theatre, Justine Abigail Yu from Living Hyphen, and Qwyn Charter MacLachlan from Community Music Schools of Toronto (expanding from Regent Park School of Music), and the workshop recording is now available for you to watch on our Knowledge Hub! 💻🏳️🌈🤝
-
In my experience working with small grassroots organizations in the GTA, finding an independent evaluator that they can afford has been a big challenge, and a topic of discussion among management that can sometimes drag on. Community non-profit organizations are underfunded as it is and have taken big hits in recent years under the current Premier; allotting the time and resources necessary to undergo a proper evaluation can place extra stress on organizations that already have their hands full keeping the doors open.
-
The recording of our April 21st webinar with LGBT YouthLine is available on the Knowledge Hub! Be sure to watch and check out the related resources. 💻📚
-
The way that we ask about gender is important. This week, we released two NEW resources developed in partnership with LGBT YouthLine:
@KhadijahKanji, @katarina and I also took to REX Blog to share six considerations for youth work when asking about gender. Check it out!
How do you confront some of the assumptions about gender that we unpack in the toolkit? How do you strive for equity (and accuracy!)? We want to hear from you!
And don’t miss our upcoming webinar, Asking About Gender: Confronting Assumptions and Challenging Transphobia, on Thursday, April 21st, from 1PM to 2:30PM ET.
-
How do you go from evaluation being the “downer” of your organizational meetings to it being the “fun” part? In this short blog post by Dr. Joanna Prout (lead evaluator for the National Center for School Mental Health), two simple tricks are provided to leverage data visualization in your next evaluation meeting.
Link: Using Data Visualization to Boost Engagement in School Mental Health Evaluation (ncs3.org)
-
Hello! One of the high schools in our community has a form of advisory committee for Black youth. I am interested in learning ways we might be able to creatively incorporate the evaluative process. Thanks for the suggestion!
-
Youth-serving organizations benefit from effective and cost-efficient solutions to build evidence and advance their impact. Evaluation advisory boards (EABs) are a low-cost solution to add evaluation capacity and can be mutually beneficial to both youth-serving organizations and evaluation experts. Check out the article (linked below) to learn more about them!
Link: bit.ly/3F06UOd
-
YouthREX is excited to partner with professional information designer Chris Lysy to offer two interactive webinars to support your youth program evaluations: User Experience (UX) Evaluation and Creative Reporting in Evaluation. To learn more about what each webinar involves, please visit https://youthrex.com/webinars/.
🔗 Register for User Experience (UX) Evaluation: https://www.eventbrite.ca/e/user-experience-ux-evaluation-registration-210022341477
🔗 Register for Creative Reporting in Evaluation: https://www.eventbrite.ca/e/creative-reporting-in-evaluation-registration-210028590167
-
Program evaluation is important for youth sector stakeholders, but it can be tricky remembering its various components. Our team at YouthREX has created a user-friendly glossary that provides a summary of all of the key evaluation terms. Check it out below!
You can also visit YouthREX’s website to learn more about our Framework for Evaluating Youth Wellbeing and online Evaluation Toolkit.
-
Definitely, Caroline! Limited evaluation capacity is a big barrier to engaging in evaluation for too many youth programs. Two key findings from YouthREX’s Beyond Measure study, which examined evaluation practices in youth organizations, were that these organizations do understand the benefits of evaluation and are enthusiastic about it, but need evaluation processes and practices that can make evaluation less burdensome.
Let’s use this Community of Practice to share any processes and practices that make evaluation less burdensome.
-
One of the biggest barriers to evaluation work can be a lack of capacity (e.g., time, resources, and staff). Check out this article by Carman and Fredericks (2010) that examines capacity for evaluation in non-profit organizations.
Article Link: https://journals.sagepub.com/doi/pdf/10.1177/1098214009352361
What are some barriers that your organization has experienced when it comes to evaluation?