
Resource Sheet: Evaluating The Hub Model

September 11, 2018

NOTE: The focus of this resource guide is on evaluation of the overall hub model, rather than on individual organizations and their programs and services (each of which may have its own evaluation processes, required by funders or generated internally).

However, the impact of partnering on hub programs and on each organization will likely be part of this evaluation and therefore relevant to the overall hub. For example, asking hub users about the location of a program, and whether they have used other programs offered by hub partners, will provide useful learning for each individual agency as well as for the hub as a collective entity.

The main purpose of this resource is to contribute to the development of tools and approaches that will further our understanding of hubs as collective entities.

Why It’s Important

Each hub has its own unique origins, structure and approach.  Each has its own vision and related ideas or assumptions about why a given approach is likely to be effective.  Evaluation is an opportunity to explore and test these assumptions, to test “proof of concept”. By sharing this learning, our knowledge and ability to plan, build and sustain effective hubs can only improve.

Evaluations also provide an opportunity, in a formal, thoughtful way, to explore different approaches and degrees of collaboration and their collective impact on populations, communities and systems. A hub evaluation might include these questions:

  • How do co-location and varying degrees of collaboration and integration impact service access and usage by clients?
  • Do people access and benefit from more services because of enhanced connections between those services?
  • Which factors increase use of services by community members (single intake vs. warm referrals, for example)? Do such factors lead to improved outcomes for a given sector within a population and, if so, what are those outcomes?
  • How has collaboration (via service network or sharing hub space) impacted each organization? For example, ease of cross-referral, greater client volumes, new partnerships, programs and services, increased community engagement or opportunities for community development?
  • How can hub partners work together to find ways to reach and engage with underserved populations? For example, a collaborative effort to identify, engage, learn from and respond to the needs of isolated seniors will be labour-intensive and costly, and will require changes to service models. However, the impact on quality of life, aging well at home, preventive intervention and overall system savings can, and arguably should, justify the investment. This applies to any population group that is isolated, marginalized or faces barriers to service access.

Understanding and developing the role of hubs in community development: Hubs can provide space, resources and opportunities for community members to meet, organize their own activities and build new connections. By building these relationships and connections, and by sharing space and other resources for community use, the hub is likely to be seen as a safe, accessible space. This can only benefit the organizations working out of the hub in terms of increased use by target populations. It can also offer opportunities to tailor services and service models to those communities.

The hub’s governance model provides opportunities to build inclusion and leadership from within communities. Your hub can do this by maximizing opportunities and support for community members to participate on the board, on committees and on advisory groups. Community members can, and arguably should, be included in the development of hub policies, procedures and evaluation processes. By extending this approach externally, hubs can support civic engagement and provide information and evidence to support social, economic and physical infrastructure planning and development. They can facilitate community presence at the centre of these processes.

What You Need To Know

Approaches to Evaluation: It is important to explore different approaches to evaluation before deciding what is best suited to your specific goals and resources. As the Ontario Nonprofit Network describes it:

The term ‘evaluation’ gets used in a variety of ways in the nonprofit world and this can lead to confusion. Sometimes, evaluation can mean very simple satisfaction surveys or basic tracking of attendance. At other times, people may use the term to describe a very complex and long-term piece of applied research. Evaluations tend to lead to action when there is a good fit between approach and expectations and when everyone involved has the same understanding of this fit.

 

The following are brief descriptions of some of the most frequently used approaches to evaluation.  See the Resources section for more detailed explanations of different approaches.  You may choose to use a combination of methods tailored to your own setting and available resources.

Formative Evaluations are used in the early stages of implementation of a program or project to explore how the approach is working and to make adjustments.

 

Process Evaluations focus on how the initiative is being implemented, whether it is being implemented the way it was planned, and examine any unforeseen changes that have affected the ability to do so.

Summative or Outcome Evaluations and Impact Evaluations focus on whether the initiative has achieved what it set out to do. For hubs this could include exploring whether co-location or collaboration between organizations has “worked”. For example, have people who would not previously or normally have accessed services done so? Have people used multiple services as a result of the hub model? Have shared intake or “warm referral” systems been effective in the ways that were anticipated? Have community space and related supports enabled community leadership to grow and built capacity and opportunity in other ways?

Outcomes are easier to evaluate when what you are measuring is “closest” to the deliverables of the program or initiative. For example, for a program it is possible to ask and to measure whether people participated, whether their lives were changed in the way the program intended, or whether co-location, as described above, has “worked” in the way that it was planned.

While impact on population health or neighbourhood vitality may be worthy goals and driving principles for the hub, they are far more difficult to measure because of all of the social, economic and political factors beyond the hub’s direct control which impact the community.

One way to handle this dilemma is to explore the hub’s contribution[i] to broader change. For example, taken together, the hub’s physical space, programs, resources, and partners’ efforts supporting leadership and fostering community connections may have led to local people becoming more involved with municipal or regional consultation and decision-making. This civic engagement could, in turn, lead to improved transit access to affordable groceries or to a nearby hospital.

Alternatively, the hub’s impact on, say, community health and in reaching and delivering services to target populations thereby increasing “wraparound” access, could be undermined by the closing of a local factory, or changes in public policy that are beyond the hub’s control and which negatively impact people and neighbourhoods in other ways.

Langs, in its Hub Evaluation Report, describes its approach to separating out these different factors using an Outcome Mapping methodology, organizing its findings into short-term, mid-term and long-term outcomes:

  • Short-term outcomes that one would expect to see as a result of the hub model
  • Mid-term outcomes that one would like to see emerge when clients access a service or several services; and
  • Long-term outcomes, or the results that respond to the organization’s vision.

Source: http://www.langs.org/, p. 3
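As a purely illustrative sketch (the findings and horizon labels below are invented for the example, not taken from the Langs report), grouping evaluation findings by time horizon in the spirit of Outcome Mapping might look like:

```python
# Hypothetical evaluation findings, each tagged with a time horizon.
# In practice these would come from your own surveys and interviews.
findings = [
    ("clients report easier access to services", "short"),
    ("partners cross-refer clients routinely", "short"),
    ("clients use several services per visit", "mid"),
    ("community members take on leadership roles", "long"),
]

# Group findings into short-, mid- and long-term buckets.
outcome_map = {"short": [], "mid": [], "long": []}
for finding, horizon in findings:
    outcome_map[horizon].append(finding)

# Summarize how the evidence is distributed across horizons.
for horizon in ("short", "mid", "long"):
    print(f"{horizon}-term: {len(outcome_map[horizon])} finding(s)")
```

A grouping like this makes it easier to discuss which outcomes the hub can plausibly claim now and which will only emerge over years.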

 

Developmental Evaluation or Action Evaluation: The Calgary United Way describes Developmental Evaluation as “well suited to social innovation, or any process that is in the exploratory phase, where specific outcomes are not yet known or are vague at best.” Community hubs are well suited to this kind of approach[ii] because they require commitment from many parties and key decision-makers in order to work towards a shared vision.

What is the hub vision you are evaluating? Does it involve basic co-location so as to provide a shared base for several organizations and better access to more services? Does your vision include community development and capacity-building as pathways to a strong, inclusive, healthy community? To understand and evaluate the progress and contribution(s) of your hub, you must have a clearly articulated vision. When evaluating your progress, you will need to define the various “streams” of activity in support of your goals, as outlined in your vision.

How do you make sure that people see the value of your evaluation? In other words, what is the commitment to evaluation? The ONN notes, “Research suggests that the factors that are most likely to cause an evaluation to lead [to] action have less to do with how good you are at designing a survey or developing a logic model (though both can be helpful) and more to do with how stakeholders view and participate in the process.” See the six factors and accompanying chart in the ONN resource.

It is important to explore who needs/wants/should be part of designing, implementing and reflecting on results and applying the learnings moving forward. You might include partners, clients, community members and funders. Are there external organizations – colleges, universities, for example – that could participate in the work?

 

Also ensure your evaluation process is as inclusive as possible, so that the communities you serve are not merely “engaged” but at the centre of deciding whether the hub is actually working for them. This is especially true for those who are often excluded or marginalized within urban contexts, such as Indigenous people. You may hold incorrect assumptions about what is important to people, or be basing your approach on directions or constraints imposed by policies, funding streams or service models unrelated to your specific context or community, so tailor your evaluation to the people or groups your hub is meant to serve. Finally, remember that inclusion and leadership development are common aims for most hubs.

What resources do you have or need for an evaluation?  You may have existing resources built into your lead organization’s budget, or be able to secure contributions from each of the partners.  Perhaps you can approach existing funders or foundations and donors with an interest in the hub model to apply for additional funding. Again, being able to articulate what and why you are evaluating and how the findings will be used, helps make the case for funding and eligibility for in-kind resources (say, the support of a college, university or social planning body).

If resources are limited, scale your evaluation objectives accordingly. Perhaps select one aspect of your hub that will gain the most from evaluation, where findings can be realistically applied, and which points the way towards the next area for evaluation. For example, you could explore whether or why community members have been using more than one program or agency in your hub, whether they would have done so if the programs weren’t at the same location, how this has impacted their lives, whether sharing space has led to increased attendance for each of the partners, and so on. Asking what you could have done (or do) differently is also extremely important. Evaluations can help improve the direction of hub activities related to outreach, engagement, cross-referral mechanisms, timing of programs, adding languages or culturally relevant staff and volunteers, or many other components of your services.
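Where hub partners can pool de-identified intake or attendance records, the cross-usage question above can be explored with very simple tooling. The sketch below is purely illustrative; the client IDs and program names are invented, and real records would come from each partner’s own intake system:

```python
from collections import defaultdict

# Hypothetical de-identified visit records: (client_id, program) pairs.
visits = [
    ("c1", "food bank"), ("c1", "settlement services"),
    ("c2", "food bank"),
    ("c3", "youth drop-in"), ("c3", "food bank"), ("c3", "health clinic"),
]

# Collect the distinct programs each client has used.
programs_by_client = defaultdict(set)
for client, program in visits:
    programs_by_client[client].add(program)

# Clients who used more than one hub program.
multi_users = [c for c, progs in programs_by_client.items() if len(progs) > 1]
share = len(multi_users) / len(programs_by_client)
print(f"{len(multi_users)} of {len(programs_by_client)} clients "
      f"({share:.0%}) used more than one hub program")
```

A count like this answers only the narrow access question; pairing it with interviews or surveys would be needed to learn whether co-location, rather than some other factor, drove the cross-usage.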

How do you plan for the evaluation work itself? How do you design a process that is most likely to be carried out as thoroughly as it needs to be – where the various evaluation tools and measures are applied consistently and applied well? There is a lot of work involved in evaluation – surveys, interviews, data collection and analysis, focus groups, etc. How do we make sure that the people who will be part of doing the evaluation work recognize its importance, have a sense of shared ownership, and have the necessary time and support? You may need a committee with representation from each part of your hub (including community members) to design and oversee the process.

The more involved people are in evaluation design, the more invested they are likely to be. The more invested they are, the more thoroughly and thoughtfully the evaluation is likely to be carried out!

 

How do we ensure the various organizations, clients, broader community members and other participants are aware of the scope of work and the degree of transparency, disclosure and honest discussion that is involved?

If you are evaluating collaboration, for example, then the effectiveness of joint decision-making processes will come under scrutiny, as will participation in shared activities. These complicated issues touch on power dynamics, organizational culture, and the commitment or ability to contribute the resources required for true collaboration. As noted in the Langs report, “Managing relationships for collaboration is an art, and the process is impossible to blueprint, as it evolves organically.” Working to explore and understand how your collaborative approach is working, and learning from the experience, requires a commitment to participate from all the relevant stakeholders.

How do you design a process that is most likely to be used? What are the most central concepts, questions, processes, outcomes and impacts that you want to explore? Have you worked collectively with everyone involved in the hub to correctly define them? How do we build the results of our evaluation into the ongoing development of our hub? How do we communicate[iii] what we’ve learned – share with all relevant groups – partners, community members, decision-makers, donors – in a way that is accessible? And how do we create space for discussion and further analysis and learning?  Each of these questions must be explored and answered at the beginning of the evaluation process, in order for the final evaluation to have real impact.

What To Look Out For

Ensure Relevancy: Evaluation may seem irrelevant or intimidating to some partners, staff, volunteers and clients. It is important to match evaluation design to the particular culture and characteristics of your setting: make the questions meaningful and the methods accessible to ensure everyone takes part in the discussion around results. Demonstrating a focus on utilization, as Langs did, and including as broad a range of people as possible when designing the process, reassures people that the evaluation will be relevant, useful and used. This also increases the potential for shared ownership.

Paying Appropriate Attention to the Collaboration Process: Given that collaboration is at the heart of the hub model, and given how complicated it is to build partnerships, shared systems and the cultures which support collaboration, it’s important to focus not just on outcomes but also on the processes used to get there. Peter Clutterbuck of the Social Planning Network of Ontario suggests that “successful achievement of outcomes/impacts often depends on respecting key elements of good collaborative process.” Therefore, when designing a hub evaluation, be sure to include thoughtful exploration of how the collaboration process is working. Ask, for example, whether certain structures and processes have been used as intended. Were there unanticipated changes or outcomes? For assessing collaboration, consult the Wilder Collaboration Factors Inventory.

Carefully Estimate the Resources Needed for Your Evaluation and Beyond: It’s easy to underestimate the work involved in evaluation, particularly when exploring a complex model such as a hub. It is important to consider the time, costs and training required across the wide range of people and processes involved, including planning, developing and administering tools (surveys, interviews, data collection), data entry and analysis, managing the process, presenting findings for analysis and reflection, and communicating the findings to different audiences. Given that the evaluation is likely to lead to change, it’s important to consider how you will apply the learnings and plan for the changes which will improve the hub’s effectiveness.

Sources

[i] Rule #5: “Seek out contribution – not attribution – to community changes: Acknowledge that multiple factors are likely behind an observed change or changes and seek instead to understand the contribution of the Collective Impact effort activities to the change.” Mark Cabaj, The Philanthropist, 2014, vol. 26, no. 1. https://thephilanthropist.ca/original-pdfs/Philanthropist-26-1-17.pdf

[ii] http://www.calgaryunitedway.org/images/uwca/our-work/social-innovation/leading-boldly/Developmental%20Evaluation%207.pdf

See also Mark Cabaj, https://thephilanthropist.ca/original-pdfs/Philanthropist-26-1-17.pdf

This resource includes a table comparing traditional and complexity-based developmental evaluation.

[iii] The Langs Hub Evaluation Report noted that “The presentation of the findings needs to make sense to different audiences. It is a lot to expect for a reader to understand this process; some folks will be more interested in process while others will focus on the findings. The evaluation report is a foundation, but different products could be developed to suit each audience’s needs. In future, it would be timely to include communication planning as part of evaluation design.”


