Testimonials

“I think one of my biggest takeaways, and I still use this to this day, was I was able to come up with our organization’s evaluation framework, just based off a class assignment. Nonprofits generally do evaluations because our funders demand it, but we don’t necessarily have an evaluation framework. We don’t have a way that we think about things. But we were able to realize, ‘We actually do: We have a peer-based framework.’ We believe that childcare providers are uniquely situated to deliver professional development to their peers…. We had trouble communicating our model before because we just didn’t have the language. Now, thanks to the certificate program, we’re able to show just how effective this model is to advocate for this in other states.” — Amelia Vassar, Program Evaluation Online Graduate Certificate, Graduate Fall 2022

Amelia Vassar headshot

Evaluating to Support Continuous Improvement and Adaptability

Throughout her career, Amelia Vassar has demonstrated her passion for helping others. Her roles have included community organizing for local unions, advocating for human rights, and implementing an antiracism curriculum for childcare providers. When Amelia was appointed to her new role as Senior Director of Equity and Evaluation at The Imagine Institute, she recognized gaps in her knowledge of program evaluation. The Program Evaluation Online Graduate Certificate program at the University of Connecticut (UConn) proved the perfect fit to fill those gaps. Not only did Amelia build her own foundational knowledge, but she has increased her colleagues’ capacity to engage more meaningfully with the evaluation process, strengthen their programs, and dynamically adapt to the needs of diverse communities. By sharing her knowledge and insights, Amelia has empowered her organization with the language and tools to communicate the effectiveness of their peer-based model of professional development (PD) and to advocate for its implementation in other states.

For the past six years, Amelia has worked for The Imagine Institute, a grassroots nonprofit organization located in Washington State. Amelia explains the organization’s general structure: “We support early learning professionals in Washington State in a variety of ways. We support them with professional development and with business supports. We have an incubator program (Imagine U) that helps individuals open new family home businesses, a Technical Assistance program, and a Shared Services Hub to provide backend business supports for early learning professionals. We also contract with external partners in Washington State. These may include government agencies or employers interested in childcare solutions. So, we might work with a city agency that’s interested in expanding childcare access in their city. For instance, the Port of Seattle was interested in childcare solutions for the airport. With a 365-day work environment, they have to grapple with childcare for their employees who live all across the state. They wanted to work with us to understand what they could feasibly do for their workforce. So, sometimes we do internal work; sometimes we do external work, as well.”

Committed to helping others, Amelia loves a good challenge. She describes her initial role with The Imagine Institute and how that became the jumping-off point to her current role: “I was hired to implement an anti-racist curriculum for childcare providers in Washington State. It was the first time in Washington State that this type of curriculum had been invested in for early learning, and it was a 60-hour intensive curriculum. We piloted this in King County, then we expanded statewide, before eventually implementing in other states. Getting it off the ground and then expanding it was my initial project. As part of that role, I was doing evaluations for this project. My boss was like, ‘You’re kind of good at that; you should just do it for the whole organization.’ I was like, ‘I’m good at that report, but I don’t really know anything about evaluation.’”

Recognizing knowledge gaps

In spring 2021, Amelia was appointed to her current role, Senior Director of Equity and Evaluation. Recognizing the need for professional development to fill the gaps in her knowledge, Amelia chose to pursue UConn’s Program Evaluation Online Graduate Certificate program. “Once I was appointed to my current role, my boss asked me what sort of professional development I thought I would need. I was like, ‘I am not an evaluator at all.’ I had a mentor that I worked with, but I was missing a lot of foundational knowledge, and so a lot of the things that she was discussing weren’t landing. I felt I needed more background before what she told me would make sense to me. My mentor recommended a few programs, including UConn. She went to UMass, so I looked at both programs. UConn worked a little bit better for me, schedule-wise.”

Online format ideal

In tandem with pivoting to her new role, Amelia began UConn’s 4-course, 12-credit certificate program in spring 2021, and taking one course per semester, she graduated in fall 2022. Juggling many competing responsibilities, the flexibility afforded by the program’s 100% online, asynchronous format was essential for Amelia. “As a single parent, working full time at a pretty intense job, the online format was ideal for me. I loved the flexibility of it. The way that it was structured just allowed me space to be able to do the assignments in a way that made sense to me, where I didn’t feel massively overwhelmed.”

Wrestling with academic jargon

While the structure worked for her, Amelia admits to feeling intimidated when first encountering the academic jargon in her textbook, but she says her professor’s support made all the difference. “The program was structured in a way that made sense to me, but I still wanted to cry all the time. The academic jargon, especially in the evaluation program, is just a lot. My professor, Dr. Holli Bayonas, was a jewel to work with. Any time that I had questions, I could reach out to her. I was very nakedly honest and told her, ‘Reading this book makes me want to cry. I can’t even understand a paragraph of this jargon. I feel stupid reading this book.’ I was just very honest with her. She was like, ‘Okay, I get it. I get where you’re at. Don’t read it that way: That’s not working for you. Don’t read it paragraph-by-paragraph, just scan it. You have to know where things are. You have to be able to take away the key points. So just scan it, know where the stuff is located, be able to take key takeaways.’ That was helpful for me. She was amazing.”

Amelia goes on to advise others, “If you are an older head like me, and you open that textbook and your eyes start to cross and you start to feel like maybe this program isn’t for you, take a beat and just take it day by day. Don’t get discouraged. Find a way to make it interesting. Find those things that can make it interesting and relevant to you. And then also find a paradigm or a model that speaks to you as well, because not all of them will.”

Learning the language to communicate their model

For Amelia, a major highlight of the program was the immediate practical applications of her coursework. “A lot of the stuff I was learning really stuck with me because I was able to do practical things for my organization as part of the coursework. It was really helpful for me to be able to really think through what I wanted to do for the organization. It was great.”

Sharing an illustrative example, Amelia describes how identifying her organization’s evaluation framework has shifted their capacity to communicate the effectiveness of their peer-based model of professional development: “I think one of my biggest takeaways, and I still use this to this day, was I was able to come up with our organization’s evaluation framework, just based off a class assignment. Nonprofits generally do evaluations because our funders demand it, but we don’t necessarily have an evaluation framework. We don’t have a way that we think about things. But we were able to realize, ‘We actually do: We have a peer-based framework.’ We believe that childcare providers are uniquely situated to deliver professional development to their peers. We truly believe this, which is why we don’t deliver the professional development. We contract with childcare providers to deliver the PD. They’re the ones doing the work. We just stay out of the way. We provide administrative support, but they’re best positioned to do this. Because we believe this, our logic should reflect this, and our outcomes should reflect this. Everything should test this theory. So, it was great for me to be able to explain to the leadership team that we should be able to not only explain this framework, but we should be able to test this out. We should be able to talk about this at conferences. We had trouble communicating our model before because we just didn’t have the language. Now, thanks to the certificate program, we’re able to show just how effective this model is to advocate for this in other states: ‘You can grow childcare access in your state by using a peer-based model. We’ve opened 700 new businesses in Washington state using this model. You can try this, too.’”

Contrasting this with the organization’s early challenges of advocating for a peer-based model, Amelia adds, “We’re not an old organization. We’re about eight years old at this point. When we were first formed, there was a lot of scoffing at a strictly peer-based model, because everything is higher education-based when it comes to childcare providers. People believe that childcare providers should be dictated to and monitored, as opposed to actually having a say in how their field operates. So, our model was like, ‘Stop that.’ Now, people are actually seeing that if childcare providers are delivering the PD, you increase language access because they speak the languages of the field. Many of them have been doing this for decades. They have the experience, and they know how this stuff is going to be delivered in the environment in a way that an academic doesn’t.”

Making logic models accessible – and fun

Not only has Amelia increased her own knowledge and capacity for program evaluation, but she has made it her mission to share her insights and understanding with her colleagues. “One of the things that was hammered into us in the program was that it is really important to increase an organization’s evaluation capacity. And I took it really, really seriously – to the point that I’m pretty sure I went overboard with it. But now everyone at my organization understands both the kind of evaluation that we do, which is developmental, and our framework. They know Bennett’s Hierarchy, which is what we use. They know how to do a logic model. They may not understand all the minutiae of it, but they understand what a logic model represents. I meet with each department at least twice a year, and we review their program logic. So, they have increased their understanding of evaluation. Everybody understands evaluation to some extent now.”

Amelia says sharing this knowledge with her colleagues has added a dimension of fun to their program evaluation process. “At first, even my boss did not like logic models. She was like, ‘That makes zero sense to me.’ But now she understands them: She understands Bennett’s Hierarchy, and she’s like, ‘Okay, our goal is to move each program deeper along Bennett’s Hierarchy. We need to ask deeper and deeper questions to see if we can get all the way up to Level 7.’ It becomes almost like a game: Which program can we move up higher this year? So, it’s fun. Nerdy fun, but it’s fun in a way where you can make it not so awful and so academic.”

Dynamic adaptability to changing landscape

Amelia describes how the choice of using a developmental framework is aligned with the organization’s goals of continuously adapting to diverse communities and changing landscapes. “The reason we picked a developmental framework is it allows us to adapt to changes. We do semi-annual surveys that go out to all our providers. In those cases, we meet with the programs to understand what it is they want to know about their programs so that can be reflected in the survey, and then to discuss what we’re hearing from the survey to make sure we’re understanding the feedback correctly. We also present that to staff: ‘This is how this feedback stacked up against last year. We’re seeing a drop. People don’t really like that change so much. It seems that this didn’t land so well.’ Or ‘This really landed well. People seem to really like this.’ We also serve so many different languages, where different communities will respond differently. The Spanish providers may really like something, but the Somali providers may really hate it. For that community, you might need to do something different because that community operates differently. So, understanding that you can’t have a one-size-fits-all approach, we’re able to show how things are landing. We tried to pick a framework that allows us to adapt to changes because the landscape is always changing. The funding is always changing. Everything is always changing. Since everything is always dynamic, we needed to apply a dynamic framework.”

Using data to make programs stronger

Through enthusiastically sharing her knowledge and insights, Amelia has empowered her colleagues to engage with the evaluation process more meaningfully and to use the data to make their programs stronger. “I think it’s so important for them to understand how to use the data. With previous evaluations, program leaders were like, ‘Okay, she needs this data for some reason, so let’s just get it to her.’ It was one of those things where they felt like they were doing it for me, as opposed to something that was useful for them. Once you start to make it useful for them, they are like, ‘Oh, this is useful for me: I can understand more about how my program works and what’s not working through this.’ It then becomes, ‘What else can I know?’ They can now engage with me in a different way, where they use me as a tool, like ‘Well, can you help me understand this better?’ So, we have a more useful relationship with each other as opposed to just me extracting data from them so that I can satisfy a funder. I’m still doing that, but now I’m also being useful to them in a way where they’re using what I know to make their program more effective.”

Dedicated to her colleagues and the communities she serves, Amelia loves her work at The Imagine Institute. While she has no plan of leaving, Amelia says the certificate program has given her more options for the future: “Prior to this, I had never considered advancing as an evaluator. That was never a career path for me. So having the certificate gives me a lot more options now. If I were to ever leave The Imagine Institute, I could apply for another evaluator position as opposed to what I was doing before, which was more focused on antiracism and equity work. I just have more options available to me by having this certificate. I think it just gives me a lot more flexibility for myself. It was a great program: It was perfect for me!”


"Having the metrics in place before we even invest in the fund requester – helping them anticipate what we need to continue funding them – results in stronger partnerships, with all parties working together toward a common goal. This is the new norm, and now I feel like I'm totally in the groove!" — Chelsea Johnson, Program Evaluation Certificate Graduate

Chelsea Johnson headshot
Chelsea Johnson is putting her new program evaluation skills to good use at the BlueCross BlueShield of Tennessee Health Foundation.

Getting in the Groove

Chelsea had been working for the BlueCross BlueShield of Tennessee Health Foundation for over five years. As the demands placed on her to evaluate funding effectiveness grew, she felt bogged down. She knew she needed formal training to assess the impact of the company's grants on all parties involved. Thanks to having earned a Program Evaluation Online Graduate Certificate from the University of Connecticut (UConn), she now feels confident in her ability to take on her new responsibilities and be a valuable resource to her company and the nonprofits it supports.

When Chelsea Johnson began with the foundation arm of BlueCross BlueShield of Tennessee, the department was tiny – just her and one other employee. But as it grew to three full-timers, Chelsea found herself increasingly asked to assess and improve the impact the foundation has on the nonprofits it supports.

"I was being tasked more often to identify metrics for the initiatives we support," says Chelsea. "If you don't know what to look for, there are so many things that can bog you down. Plus, our nonprofits typically operate on a skeletal crew and don't have the resources to track program effectiveness. If, for example, I were to ask a nonprofit organization to see data on all 2,000 children it serves by the end of the year, there would be understandable panic."

In fact, Chelsea says, expanding her knowledge and skills so that she could be a better resource to her clients was a key reason she decided to take the online graduate certificate program in the first place. "The program helped me identify key priorities early on. I'm in a much better position to know the right questions to ask and to make sure the grant requester has the right mindset so that when they implement a program, they know our expectations upfront."

Online collaboration, made easy with VoiceThread and Blackboard

Chelsea admits she felt a bit apprehensive about the online part of the program – after all, she had no previous experience with taking courses online. But as she quickly discovered, the online delivery platform turned out to be a great way to connect with her classmates, even collaborate with them on small group projects.

One tool that made it easier was VoiceThread, an innovative collaboration and sharing platform. Students create short videos in which they discuss their experiences or present assignments, then upload those videos to VoiceThread. Other students can then add their voice, text, audio file, or video comments.

Says Chelsea: "VoiceThread was extremely helpful for working on group projects. Plus, the program offered all kinds of other communication methods, from Blackboard discussion groups to email blasts, so I never felt disconnected."

Outstanding support from instructors and classmates

And thanks to all the support she received from her instructors and classmates, she never felt lost. "Dr. Bianca Montrosse-Moorhead, who teaches EPSY 5195 – Practicum, is terrific," notes Chelsea. "Her excitement, responsiveness, and enthusiasm for the material made it very interesting. She was also instrumental in getting everyone to work closely together. For someone like me who had never done real program evaluations, it was great to get feedback from other students. I think it was helpful for them as well. One of my classmates told me that she appreciated all my questions about the process, because she had to stop and think about the various steps involved."

Chelsea also notes that Dr. Holli Bayonas, who teaches EPSY 6194 – Advanced Program Evaluation, offered invaluable insights, helping her identify areas that might be too much to evaluate within the specified time frame. "Her experience was so helpful in driving our projects. We didn't have to spin our wheels, guessing and trying to jam too much into an evaluation."

Practice makes perfect

Best of all, says Chelsea, is the practical experience she gained. During the Advanced Program Evaluation course and final Practicum (Capstone Project), she developed and implemented a program evaluation for Northside Neighborhood House (NNH). This Chattanooga, Tennessee, nonprofit serves clients in financial crisis by providing assistance with utility bills, food, and education.

So what did Chelsea set out to determine? "My purpose was to assess the efficiency and effectiveness of the Direct Assistance program at NNH. My ultimate goal is to use the findings to support strategic planning efforts for the next five years, helping NNH leaders streamline the focus of the program and determine how best to allocate resources going forward."

One of the tools Chelsea especially appreciated learning to use is the logic model, which provides a systematic and visual way to identify problems, clarify desired results, and develop a strategy to achieve those results. "You don't know what you don't know until you can see what's missing," she explains. "But using the logic model, I could see where there were gaps in information, which in turn helped me develop desired outputs and identify missing resources that would enable us to improve the Direct Assistance Program for NNH. Then I used the model to visually share my ideas and strategies with the client."

Chelsea is already putting her new skills to good use, especially for those smaller grant requesters that don't have the capabilities in place to conduct formal program evaluations. In conclusion, she says: "Having the metrics in place before we even invest in the fund requester – helping them anticipate what we need to continue funding them – results in stronger partnerships, with all parties working together toward a common goal. This is the new norm, and now I feel like I'm totally in the groove!"


"This experience helped me to apply what I have learned in class and gave me the opportunity to successfully navigate the real-life challenges of conducting an evaluation."  — Yordanos Tiruneh, Program Evaluation Certificate Graduate

Practical Meets Academic
Yordanos Tiruneh has a PhD. Yet even with her extensive educational background, she wanted to gain the additional practical skills she would need to formally join the ranks of evaluation professionals. So she decided it was time to get some specific training. She did a Google search, and the Program Evaluation Online Graduate Certificate from the University of Connecticut (UConn) popped up. She called Dr. Bianca Montrosse-Moorhead to get the scoop. A year later, she had earned this prestigious credential and is already putting her new skills to work.

Many students in UConn's online graduate certificate programs need formalized training in a discipline-specific area – people just like Yordanos Tiruneh, PhD, who goes by Yordi for short. Yordi already had rigorous and extensive training in mixed-methods research. Her goal in pursuing this program was to be formally introduced to evaluation as a profession.

"I had been doing public health research for several years," says Yordi, who has a PhD in Sociology with a focus in Medical Sociology from Northwestern University and a Postdoctoral fellowship in Health Services Research from the School of Public Health at Brown University. "I felt I needed to learn discipline-specific theories, principles, and practices that guide professional evaluators to determine what works and what doesn't."

The right fit

After she discovered UConn’s Program Evaluation Online Certificate offering, Yordi called Dr. Montrosse-Moorhead to talk specifics. "Bianca walked me through all of the details about the courses and the program. She helped me figure out whether it was the right fit for me. As we went through the program structure and its focus on practical application, I found it very appealing. She told me exactly what to expect, and after earning my certificate, I can say that the expectations were right on."

There's another key reason the program fit her needs so well. As a full-time employee at an Ivy League university, Yordi needed a program that was entirely online, so that she could fit the work into her schedule. She found the online platform to be interactive.

"In the online platform, you have to go through all of the assigned material, think analytically, and synthesize your thoughts in a comprehensive way that communicates your ideas and opinions. Then you have to post your reflections to the class. It takes a little longer than you would need to interact in a traditional classroom, but it was well worth it. I learned so much from the faculty, as well as from my classmates, who had a lot of practical experience in evaluation."

Yordi credits VoiceThread – an innovative sharing platform – with providing the tools she and her classmates needed to work collaboratively. "Using VoiceThread, we could post videos instead of written comments. When my classmates posted their video comments, I was able to attach voices and faces to specific students. It is like a virtual classroom, but without having to be with all my classmates at a specific time and place."

Yordi greatly appreciated the practical experience she gained during the program. All of the instructors provided real-life examples, talking about the specific challenges they faced and how they tackled them. "You can learn the theory, but getting insights into real experiences from seasoned evaluators was very enriching. They also know so many people in the field of program evaluation. They know where to send you for specific resources or to make connections in the evaluation community." And she adds, "Throughout the program, the faculty was very accessible by phone and by email, and they were always supportive and helpful."

During the final course, EPSY 5195 – Practicum, with its focus on practical skill building, Yordi and her classmates had the opportunity to conduct an actual program evaluation. "I evaluated a diabetes education program in a clinic that serves working poor families," explains Yordi. "I evaluated whether this specific program was being implemented as planned and had resulted in the desired outcomes. This experience helped me to apply what I have learned in class and allowed me to successfully navigate the real-life challenges of conducting an evaluation."