Introduction

One of the tasks included in the Panel’s terms of reference was the identification of “key indicators that would make it possible to evaluate the effectiveness of programs and services across service providers and sectors”. Many of our recommendations hinge on the Ministry’s ability to gather such information. The importance of accessing and analysing the right indicators is not only a concern for the Ministry, but has been raised in all previous reviews. The framework and indicators presented in this chapter were developed on the basis of a review of indicators currently reported by the Ministry or by some residential service providers, or reported in selected jurisdictions across Canada and internationally.

The challenge of collecting, measuring, and understanding indicators

There are many potential pitfalls in moving from a situation where there has been very little province-wide information available – and virtually no data comparing residential service providers with each other – to one where key indicators are used to “evaluate the effectiveness of programs and services across service providers and sectors”. These include (1) finding an appropriate measure or indicator, (2) ensuring that the information collected is accurate, (3) minimizing the response burden and data collection costs, (4) interpreting the results in their appropriate context, and (5) ensuring that services are not inappropriately incentivised to maximize good rankings on indicators at the cost of other important unmeasured dimensions. The challenge of dealing with these issues is particularly complex in a sector where the use of psychometrically validated measures varies significantly, where there has been very limited public reporting of data, and where there isn’t enough information available to establish contextualized baselines for setting performance targets.

Principles for selecting indicators

Given the challenges inherent in developing meaningful and useful indicators, the Panel has approached its review and recommendations on the basis of several principles:

Reporting on what can be easily measured runs the risk that less important – or possibly even misleading – easily measured indicators end up incentivising service priorities in directions that do not reflect the objectives and values that should be driving services. Indicators should be selected on the basis of a framework, or logic model, that clearly articulates the link between the indicator and the desired outcomes.

  1. Indicators clearly linked to objectives: Indicators should be developed and selected on the basis of an outcomes framework that clearly articulates the short and long-term objectives of residential care.
  2. Incrementalism: Developing a top-down set of indicators that also requires the creation of new information systems runs the risk of escalating costs and implementation resistance.
    1. Where possible, integrate data across existing databases.
    2. Develop indicators that can be generated using the range of different clinical tools used by local service providers. Imposing a single tool risks undermining its effective clinical use.
    3. Make full use of available data before requiring the collection of new data.
    4. Use sample surveys and research studies to address complex questions rather than attempting to collect extensive information for every young person in residential care. With methodologically sound sampling procedures, small-sample studies can yield more meaningful and accurate information than whole-population studies, which are often fraught with corrupted data, poorly managed data extraction processes and ethical issues related to research practices.
  3. Understanding before benchmarking: Extensive analysis and reporting should be completed to ensure that indicators are robust and truly reflect the objectives they are intended to measure. Most indicators are indirect measures of the intended outcomes and objectives. Before using indicators as benchmarks or performance targets:
    1. conduct extensive trend and contextualized comparative analyses at the provincial level
    2. report publicly
    3. support use of indicators in service providers’ local planning processes

Existing outcomes frameworks and reported indicators

The Ministry, along with most jurisdictions across Canada and internationally, has been putting increasing emphasis on developing methods to track and report on outcomes for young people in residential care. The Ministry’s Strategic Plan for 2013-2018, Growing Together, articulates four overall goals: 1) Children and youth are resilient; 2) Children and youth have the skills and opportunities needed to shape their own lives; 3) Children and youth have a voice; 4) Children and youth experience high-quality, responsive services. While these goals map well to the overall goals that should guide a residential services delivery system, they need to be translated into a more specific set of objectives that reflect residential care processes.

Building on the Ministry’s Strategic Plan, the Youth Justice Outcomes Framework identifies four specific outcomes: 1) improved functioning and positive social behaviours, 2) increased skills and abilities, 3) increased youth engagement with supports and 4) decreased re-offending. Nine indicators have been selected to measure these outcomes, although other than the recidivism indicators, most are still under development and will require the introduction of new data collection instruments.

For the child welfare sector, the Ministry is currently publicly reporting on five “performance indicators” in three key areas: 1) safety, 2) permanency and 3) well-being. Safety is measured using two indicators of recurrence of investigation. Permanency is tracked using two additional indicators: days of care by placement type, and the time it takes for a young person to be reunified, placed in a permanent alternative home or discharged from care. Well-being is measured for young people in long-term care, who report on the quality of their relationship with their caregiver. Additional performance indicators are being developed in collaboration with the Ontario Association of Children’s Aid Societies (OACAS) and with the Association of Native Child and Family Services Agencies of Ontario (ANCFSAO).

The three domains of safety, permanency and well-being are similar to domains reported on in a number of jurisdictions across Canada and internationally (e.g., Quebec, British Columbia, Alberta, California). Jurisdictions vary considerably with respect to public reporting on these indicators. The most longstanding source of reports is generated by the U.S. Department of Health and Human Services through its Child Welfare Outcomes Report to Congress. The 2010-2013 document reports on seven indicators, including comparative state-level data, focusing primarily on safety and permanency. A number of jurisdictions also report on well-being indicators. These focus most often on educational outcomes, such as grade-level attainment (e.g., British Columbia) or math and reading scores (England), and in some instances health (e.g., immunization and dental exams, and substance misuse in England).

Surprisingly, we were unable to find many examples of system-wide publicly reported data on well-being indicators in Ontario. Well-being indicators are more frequently reported in sub-populations followed through specific assessment initiatives. A growing number of residential care providers are tracking outcomes using a range of self-report measures. For instance, Ontario’s Looking After Children (ONLAC) assessment tool, which is completed with many young people in long-term care, includes a number of well-being indicators.

Most of the outcomes frameworks we reviewed focused on sector-specific, system-level outcomes, specifically ones related to child welfare and youth justice systems. While many of these, especially permanency and educational outcomes, relate well to the types of indicators that could be tracked for young people in residential care, we found fewer examples of reported indicators that monitor quality of care and the everyday experience of young people. The American Association of Children’s Residential Centers has developed a promising framework based on four types of indicators: practice/process indicators, functional outcomes, perception of care and organizational indicators (American Association of Children’s Residential Centers, 2009). While the AACRC framework is a helpful conceptual model, it has not yet been implemented as a data reporting framework.

Quality of care, continuity and outcomes

Three key dimensions need to be monitored in order to capture the experiences of young people in residential care:

(1) the quality of care provided and experienced in the homes young people are living in, (2) the extent to which residential services are leading to stable long-term caring living arrangements, and (3) the extent to which young people are reaching their educational, vocational and relationship aspirations. While quality care is fundamental, tracking service trajectories is equally important: a disconnected series of high-quality placements is unlikely to serve any young person well. Conversely, while many outcomes tracking systems focus on permanency and stability, a long-term stable placement in an unsupportive home is likely to do more harm than good. Finally, it is important to evaluate the extent to which high-quality care leading to stable long-term caring living arrangements actually leads to positive outcomes. Many young people who enter residential care have needs and gifts that may require more than good care.

Building on the recommendations developed in the Panel report, we have identified a set of indicators designed to monitor the quality of care provided within every placement, track service trajectories across placements, and assess short- and long-term outcomes for young people. We have tried as much as possible to identify indicators that are already being measured or could be feasibly measured using existing data systems; however, some indicators will require the introduction of new data collection systems. For each indicator we identify potential data sources, suggest a timeframe for implementation, and, where available, provide examples of jurisdictions that currently report on similar indicators.

Indicators should initially be reported and treated as descriptive rather than evaluative. It is important to report as quickly as possible on a range of indicators, to ensure transparency and establish confidence in the residential care system through public reporting, without imposing an arbitrary set of performance indicators that oversimplify and potentially distort service and program priorities. Through public reporting, the quality of the indicators will improve over time, while trend and comparative analyses and multi-method studies will help determine their appropriate contextualized interpretation as potential performance indicators.

The recommended list is not intended to be exhaustive. One of the functions of the Quality of Residential Care Branch/Division’s Advisory Council will be to recommend gathering additional information as required. For instance, the suggested indicators related to education do not include more detailed information about the educational supports being provided to young people, such as homework support, peer-to-peer support or tutoring. Similarly, the proposed family support indicators track only the amount of contact; a survey of family perspectives on the support they receive may eventually be important to develop. A growing number of programs are tracking outcomes from youth using various self-report instruments; consideration will need to be given to how best to use these data to assess program success.

Quality of care

Quality of care measures are designed to monitor the quality of care for each residential service provider. Indicators to track quality of care can be generated by the Quality Inspections, Serious Occurrence reports and exit surveys of young people leaving a facility.

Safety

Safety is a core priority for all residential services. Serious Occurrence reports track many indicators of safety, including rates of injury, physical or sexual abuse by peers or caregivers, and running away.

Data source: Serious occurrence reports
Timeframe for reporting: Within one year
Examples from other jurisdictions:
- Proportion of children in care who are abused or neglected (U.S. Department of Health and Human Services, 2014: See Measure 2.1)
- Children in care who were the subject of a substantiation (Australian Government, 2015: See Box 15.11)
- The number of children reported missing for more than 24 hours (UK Government, 2015: See Paragraph 65)
- Fatalities of children in care (British Columbia, 2015a)

Program Coherence: Does the program match its stated objectives?

As part of the Concept Statements that will be required to accompany all licenses, service providers will be asked to describe their program objectives at the program and the client level. Service providers will also be asked to provide evidence on measurable indicators for each program- and client-level output and outcome. Quality of care inspectors will assess the extent to which the program elements needed to meet the stated objectives are in place, on the basis of their review of the residence’s program schedule, staff background and training, and interviews with residents and staff. These assessments can be summarized as simple Likert-scale ratings for each element, producing a composite overall score.

Table: Program coherence
Data source: Quality Inspection reports
Timeframe for reporting: Within two years
Examples from other jurisdictions:
- OFSTED Inspection Reports (UK Government, 2016 & UK Government, 2015: See Paragraphs 151-161)
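To illustrate how per-element Likert ratings could roll up into a composite coherence score, the sketch below averages hypothetical ratings and rescales the mean to 0-100; the element names and the rescaling are illustrative assumptions, not a prescribed method.

```python
def coherence_score(ratings):
    """Composite program-coherence score from per-element Likert ratings.

    `ratings` maps each inspected program element to a 1-5 rating.
    The composite is the simple mean, rescaled to 0-100 for reporting.
    (Element names and the rescaling are illustrative assumptions.)
    """
    if not ratings:
        raise ValueError("no rated elements")
    mean = sum(ratings.values()) / len(ratings)
    return round((mean - 1) / 4 * 100, 1)

score = coherence_score({
    "program schedule matches objectives": 4,
    "staff training matches objectives": 3,
    "resident interviews confirm program": 5,
})
# mean rating of 4.0 rescales to 75.0
```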

Staff qualifications, experience and stability

Indicators of the quality of staff that could be easily reported during inspection visits include (1) the proportion of full-time, part-time and relief staff with above minimum required human services credentials, (2) the median years of staff experience working with young people, (3) the median years working in the specific residential setting (turnover rate), (4) the proportion of staffing hours covered by full-time staff; and (5) staff satisfaction with their work environment.

Table: Staff qualifications, experience and stability
Data source: Quality inspection reports
Timeframe for reporting: Within two years
Examples from other jurisdictions:
- See AACRC Framework (2009)

Staff Development

Quality inspections should include information about (1) the amount of on-going training provided, documenting separately in-house and external training, (2) the frequency of supervision, and (3) the qualification of supervisors.

Table: Staff development
Data source: Quality inspection reports
Timeframe for reporting: Within two years
Examples from other jurisdictions:
- Evidence of local arrangements for all carers of looked-after children and young people to receive ongoing high-quality core training and support packages that equip them to provide warm, nurturing care (NICE UK, 2013: See Quality Statement 1)

School attendance, vocational training and employment

For school-aged young people, supporting daily attendance at school, vocational training or employment is a minimum expectation for quality care. The proportion of young people in Section 23 classrooms is also important contextual information to track. This indicator does not assess the quality of education or training, nor educational outcomes.

Table: School attendance, vocational training and employment
Data source: Reviewer reports
Timeframe for reporting: Within two years
Examples from other jurisdictions:
- Age-appropriate grade of children and youth in care (British Columbia, 2015b: See Performance Indicators 5.16, 5.21, 5.26)
- Percentage of looked after children achieving level 2 or above (Math, Reading & Writing, and Attainment Gap) (UK Government, 2014: See Chart 1 and Chart 6)

Restrictiveness

The restrictiveness of different settings will vary with the quality of staff, supervision and programming, and with the types of young people in the setting. A range of indicators can be tracked to reflect the restrictiveness of a specific setting; these include the use of restraints, psychotropic medications, isolation, one-on-one shadowing and police interventions.

Table: Restrictiveness
Data source: Serious occurrence reports and reviewer reports
Timeframe for reporting: Within two years
Examples from other jurisdictions:
- Use of psychotropic medication among youth in foster care (California Child Welfare Indicators Project, 2016: See Measure 5a.1)

Family Support

For young people for whom family contact is appropriate, the extent to which a facility supports family contact can be tracked by documenting the number of days of contact, differentiating between home visits, face-to-face visits and other contact (phone, Skype, etc.).

Table: Family support
Data source: Reviewer reports
Timeframe for reporting: Within three years
Examples from other jurisdictions: N/A

Youth perception of quality of care

The Quality Inspectorate should have a simple web-based or app-based survey that all young people are asked to complete when they leave a residential setting. This should include their perceptions of the following:

  1. Youth feel safe and respected
  2. Youth feel staff / foster-kinship parents care about them as individuals and are interested in their future
  3. Youth develop or maintain healthy friendships with youth in the community
  4. Youth’s unique educational needs are being met
  5. Staff / foster parent(s) actively support and encourage connection to family, community, culture and sexual identity, spiritual needs
  6. There is a consistent adult in the youth’s life who cares about them
  7. A range of athletic, cultural, and social activities are organized and youth’s individual hobbies, sports or artistic interests are supported
  8. Someone spends regular time with youth to help them understand and cope with sad or bad things that have happened to them
  9. Young people are asked to participate in decisions about their care and about the daily activities in the residential setting
Table: Youth perception of quality of care
Data source: New exit survey (build on new YJ survey)
Timeframe for reporting: Implement survey within one year, report publicly within two
Examples from other jurisdictions:
- Quality of the caregiver and youth relationship (Ontario Looking After Children study)
- Client satisfaction (Australian Government, 2015: Under development, see Box 15.7 & 15.8)
- Looked-after children and young people experience warm, nurturing care (NICE UK, 2013: See Quality Statement 1)

Continuity of care

Tracking trajectories of care across residential services provides critical information about the residential service delivery system. These indicators could be tracked either through a dedicated residential care CPIN module that would be used for all young people in residential care, or by combining data from the Youth Offender Tracking and Information System (OTIS) and CPIN.

Stability

The number of placement changes should be tracked for all young people in care, excluding family visits, summer camps and respite placements. Although this is often measured on an annual basis (number of moves in a year), we recommend that it also be tracked over a three-year period.

Data source: CPIN
Timeframe for reporting: Within one year
Examples from other jurisdictions:
- Proportion of children on an order exiting care after less than 12 months, who had one or two placements (Australian Government, 2015: See Figure 15.7)
- In first year of current episode of care, CYIC that did not move (British Columbia, 2015b: See Performance Indicator 5.11)
- Average moves in care within 36 months of placement (Quebec, Trocmé et al., 2013: See Figure 8)
- The percentage of children in care for 24 months or longer who experienced two or fewer placement settings (U.S. Department of Health and Human Services, 2014: See Measure 6.1c)
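A minimal sketch of how the moves count could be derived from placement records, assuming an illustrative record layout rather than the actual CPIN schema:

```python
from datetime import date

def count_moves(placements, window_start, window_end):
    """Count placement changes within a reporting window.

    `placements` is a chronological list of dicts with keys
    `start` (date) and `type` (str). Family visits, summer camps
    and respite placements are excluded, per the definition above.
    (Illustrative record layout, not the CPIN schema.)
    """
    excluded = {"family visit", "summer camp", "respite"}
    counted = [p for p in placements
               if p["type"] not in excluded
               and window_start <= p["start"] <= window_end]
    # A "move" is a transition between counted placements,
    # so n placements in the window imply n - 1 moves.
    return max(len(counted) - 1, 0)

history = [
    {"start": date(2014, 1, 10), "type": "foster home"},
    {"start": date(2014, 6, 2), "type": "respite"},
    {"start": date(2014, 7, 15), "type": "group home"},
    {"start": date(2015, 3, 1), "type": "foster home"},
]
# Three counted placements in the window -> 2 moves
moves_3yr = count_moves(history, date(2014, 1, 1), date(2016, 12, 31))
```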

Permanence

Tracking permanence includes tracking where young people go when they leave residential care, how much time they spend in temporary care, and the stability of reunification or alternate “permanent” placements. Rates of breakdown in permanent placements, adoptions, guardianships and family reunifications should also be tracked.

Table: Permanence
Data source: CPIN
Timeframe for reporting: Within one year
Examples from other jurisdictions:
- Of all children reunified with their parents or caretakers at the time of discharge from foster care during the year, what percentage were reunified in less than 12 months from the time of entry into foster care? (U.S. Department of Health and Human Services, 2014: See Measure 4.1)
- Proportion of children reunified within 36 months of initial placement (Quebec, Trocmé et al., 2013: See Figure 10)

Home-based care

For young children there is growing evidence that group care should be an option of last resort. Several jurisdictions report on the proportion of young people under 12 in home-based care, and this should be tracked in Ontario as well. Where a young person has been in multiple placements, use the placement in which they have spent the most time.

Table: Home-based care
Data source: CPIN
Timeframe for reporting: Within one year
Examples from other jurisdictions:
- Proportion of children aged under 12 years in out-of-home care who were in a home-based placement (Australian Government, 2015: See Figure 15.9)

Proximity to home

Young people placed near their home communities are better able to maintain important ties with family, friends and their communities. Distance to home is a simple indicator of the residential system’s ability to support these ties. When a young person has been in multiple placements, use the average distance weighted by the time spent in each placement.

Table: Proximity to home
Data source: CPIN
Timeframe for reporting: Within one year
Examples from other jurisdictions:
- Percent of all children looked after living more than 20 miles from their Local Authority boundary (UK Government, 2015: See Paragraph 33 and Chart 5)
- Distance in miles between a child’s removal address and placement address at 12 months (California Child Welfare Indicators Project, 2016)
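The time-weighted averaging rule for youth with multiple placements can be sketched as follows; the tuple layout is an illustrative assumption.

```python
def weighted_distance(placements):
    """Time-weighted average distance from home across placements.

    `placements`: list of (distance_km, days_in_placement) tuples.
    Weighting by days in placement means long placements dominate
    the indicator, as intended when a young person has moved often.
    (Illustrative data layout, not a CPIN schema.)
    """
    total_days = sum(days for _, days in placements)
    if total_days == 0:
        return 0.0
    return sum(dist * days for dist, days in placements) / total_days

# e.g. 300 days at 10 km and 100 days at 50 km
avg_km = weighted_distance([(10, 300), (50, 100)])
# (10*300 + 50*100) / 400 = 20.0 km
```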

Keeping siblings together

For sibling groups who are placed in out-of-home care, it is important to track the extent to which siblings are placed together. This indicator would record the proportion of sibling groups in care who are kept together. For very large sibling groups, use the proportion of young people who are placed with at least one other sibling.

Table: Keeping siblings together
Data source: CPIN
Timeframe for reporting: Within one year
Examples from other jurisdictions:
- Proportion of children who are on orders and in out-of-home care at 30 June who have siblings also on orders and in out-of-home care, who are placed with at least one of their siblings (Australian Government, 2015: See Box 15.17)
- Count of sibling groups in foster care who are placed with all or some of their siblings (California Child Welfare Indicators Project, 2016)
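The rule for large sibling groups, counting the proportion of children placed with at least one sibling, can be sketched like this; the mapping of child identifiers to placement identifiers is an illustrative assumption.

```python
def sibling_match_rate(groups):
    """Proportion of children placed with at least one sibling.

    `groups`: list of sibling groups, each a dict mapping a child
    identifier to a placement identifier. A child counts as matched
    when another child in the same group shares their placement.
    (Illustrative data layout, not a CPIN schema.)
    """
    together = total = 0
    for group in groups:
        placements = list(group.values())
        for placement in group.values():
            total += 1
            if placements.count(placement) > 1:
                together += 1
    return together / total if total else 0.0

rate = sibling_match_rate([
    {"a": "home1", "b": "home1", "c": "home2"},  # a and b together
    {"d": "home3", "e": "home4"},                # d and e split
])
# 2 of 5 children placed with a sibling -> 0.4
```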

School changes

In addition to maximizing placement stability, every effort needs to be made to minimize school changes for young people in residential care. For all school-aged children, this indicator should measure the average and median number of school changes during their spell in care, including suspensions and expulsions.

Table: School changes
Data source: CPIN and OCANDS
Timeframe for reporting: Within one year
Examples from other jurisdictions:
- Percentage of looked after children with a permanent exclusion compared to all children (UK Government, 2014: See Chart 10)

Aboriginal care

In addition to tracking placement rates for First Nations, Métis and Inuit children and youth, the proportion of young Aboriginal people placed in Aboriginal care should be tracked. This will require asking family-based placements to self-identify their Aboriginal identity. Group homes operated by Aboriginal organizations would also count as an Aboriginal match. Operationalization of this indicator should be developed in partnership with Aboriginal service providers.

Table: Aboriginal care
Data source: CPIN / OTIS
Timeframe for reporting: Within one year
Examples from other jurisdictions:
- Aboriginal children cared for by Aboriginal communities and service providers (British Columbia, 2015b: See Performance Indicator 5.61)
- Children in out-of-home care placed with relatives/kin by Indigenous status (Australian Government, 2015: See Table 15A.23)

Ethno-Cultural/religious matching

For ethno-cultural or religious communities that have raised concern about the placement of their young people, placement rates and placement matching rates should be tracked. The Panel heard from several members of the Black community that this was a priority concern. Operationalization of this indicator should be based on consultation with the concerned communities.

Table: Ethno-cultural/religious matching
Data source: CPIN / OTIS
Timeframe for reporting: Within one year
Examples from other jurisdictions: N/A

Cross-Over youth

In addition to tracking placement changes and charging rates (see “restrictiveness”), it is important to report separately on the proportion of youth moving from child welfare (CW) or children and youth mental health (CMH) placements to youth justice facilities, to ensure that youth justice placements are not being over-used for young people whose behaviour is seen as problematic in CW or CMH settings.

Table: Cross-over youth
Data source: CPIN / OTIS
Timeframe for reporting: Within one year
Examples from other jurisdictions: N/A

Outcomes

Key outcomes should be tracked for young people who spend more than 18 months in out-of-home care. Some of this information could be tracked by the Reviewers. Some long-term outcomes could be tracked through data matching with other information systems or through follow-up surveys with a sample of young people who have left care.

Educational achievements

These indicators are measured through school records and include high-school graduation, credit and grade-level attainment, grade level relative to age, EQAO scores, and successful transition from Section 23 classes to mainstream classrooms.

Table: Educational achievements
Data source: CPIN, reviewers and Ministry of Education
Timeframe for reporting: Within one year
Examples from other jurisdictions: N/A

Employment

Employment is a key long-term outcome that could be tracked through regular follow-up surveys of random samples of young people who have left care. Data matching could also be used to track social assistance use, a proxy indicator of employment.

Table: Employment
Data source: Post-care survey and data matching
Timeframe for reporting: Three years
Examples from other jurisdictions:
- Youth discharged from care and subsequently claiming Income Assistance (IA) (British Columbia, 2015b: See Performance Indicator 5.36)

Youth crime

Recidivism rates are already being tracked for secure custody youth serving dispositions of six months or greater in the Youth Justice system. YJ convictions (while in care or post-care) should also be tracked for all other youth who spend at least 18 months in child welfare or children and youth mental health residential care.

Table: Youth crime
Data source: Youth Justice
Timeframe for reporting: Within one year
Examples from other jurisdictions:
- Proportion of youth in care involved with YJ (Quebec, Trocmé et al., 2013)
- Looked after children convicted or subject to final warning (UK Government, 2014: See Chart 14)

Life success follow-up survey

On a cyclical basis, a random sample of young people who spent at least 18 months in out-of-home care should be surveyed to assess their educational and vocational outcomes, employment, housing, and connections with family and friends.

Table: Life success follow-up survey
Data source: Follow-up survey
Timeframe for reporting: Every three years
Examples from other jurisdictions:
- Follow up with former foster youth at age 26 on outcomes such as homelessness; perceived social support; current school enrolment and postsecondary drop-out; progress paying back student loans; employment, income and benefits; physical and mental health; pregnancy and parenthood; criminal justice system involvement; life satisfaction (Courtney et al., 2011)