
QRIS Resource Guide

Approaches to Implementation

Most States have found that full funding for a comprehensive quality rating and improvement system (QRIS) initiative is difficult to achieve initially, even with the redirection of existing resources. A pilot or phased-in approach can be an affordable way to demonstrate the value of the program and may lead to increased support among stakeholders. States are also increasingly using pilots to test QRIS elements, with positive results. This section discusses issues to consider when conducting a pilot and describes how some States have used a phased-in approach as an alternative to full implementation.


Conducting Pilot Programs

Some States (Indiana, Kentucky, Ohio, Pennsylvania, and Mississippi) conducted pilots before implementing a statewide QRIS, in which all eligible providers can participate regardless of their geographic location within the State. States may implement a pilot to measure the efficacy, sustainability, and applicability of a QRIS across the diverse age ranges and care settings within their State. Whether a State decides to pilot a QRIS or go statewide from the beginning depends on resources and stakeholder support. Some possible reasons to engage in a small-scale pilot or field test include the ability to:

  • Target available funding in order to build support. Stakeholders may feel it is more appropriate to start slowly and produce positive results on a smaller scale as a way to garner support for statewide implementation.
  • Allow time for implementation approaches to be tested and refined before large numbers of programs are involved in the process. By investing the time and effort to conduct a pilot, a State can use customer and community feedback to inform and revise both the program and the process.
  • Evaluate aspects of the system, such as rating scales or professional development supports. For example, a State considering different rating scales may want to compare them in a controlled environment rather than launch something on a larger scale that later needs to be changed.
  • Assess potential program participation and capacity for implementation once the QRIS goes statewide. This will allow for better budget estimates and planning processes.


Alabama Pilots Quality STARS

Alabama worked with a State-level stakeholder group for a little over three years to develop a five-star rating system for child care centers. In October 2013, Alabama launched an eight-month pilot, conducted by the University of Alabama, with 50 participating centers that meet the licensing requirements. Ratings will be shared as feedback with the participating centers, and focus groups will assess the effectiveness of the new system. The ratings will be based on information submitted by the center and two onsite assessments: the first is a three- to four-hour assessment of the center’s leadership and management practices using the Program Administration Scale (PAS), and the second is an Environment Rating Scale (ERS) assessment of randomly selected classrooms. After the pilot, when Quality STARS goes live, programs that volunteer to participate will receive ratings that are valid for three years.

Oregon Uses Research Partners in Implementation of QRIS

Oregon contracted with Western Oregon University to conduct the field test of its QRIS, working with the child care resource and referral (CCR&R) system, which provides improvement specialists to child care programs. Western Oregon University will conduct a process evaluation of the QRIS. Oregon has also contracted with Oregon State University to conduct the validation study of the QRIS field test. One critical goal of Oregon’s QRIS is to increase the quality of care for children with high needs. Focusing on this goal has opened a dialogue about “children with high needs,” including how to identify care and education programs that serve these populations and target them for recruitment into the QRIS.

Puerto Rico Undertakes Four-Phase Process in Development of QRIS, Pasitos

The development of the QRIS standards, known in Puerto Rico as the Pasitos instrument, was a rigorous two-year process divided into four phases. The first project year (October 2009 through September 2010) covered Phase 1 (development of the instrument) and Phase 2 (expert evaluation). The second year (October 2010 through September 2011) covered Phase 3 (pilot study) and Phase 4 (development of the electronic portal).

Puerto Rico staff worked with professors and graduate students at the University of Puerto Rico, as well as experts in early childhood education, to develop and evaluate the tool. The pilot study involved 100 early childhood education centers drawn from child care, Head Start, Early Head Start, licensed private schools, and kindergarten. The fourth and final phase was the creation of an electronic portal that provides information to various audiences (e.g., schools, families, Council on Children, Administration for the Care and Development of Children, Department of Family) and will serve as the access point to the database that will store all information collected and provide program reports.

The Pasitos instrument is divided into ten standards aligned with the NAEYC Program Performance Standards, the Head Start Content Standards, the Department of Education’s grade-level expectations, and the regulations for the licensing and supervision of institutions for children, among others. The instrument enables each center or classroom to highlight areas of strength and need. The data analysis is intended to support decisions about establishing an action plan aimed at strengthening the quality of services offered.

Many factors influence how and where to launch a QRIS pilot, including the availability of funding for a particular geographic area, priority population, or type of program. Some States, such as Pennsylvania, involved both centers and family child care providers in a statewide pilot but limited the number of participants to 400. Other States, such as Kentucky and Ohio, started in selected communities or counties. Virginia limited its pilot both to selected communities and to one program type, center-based care.


Indiana Implements First QRIS at the Local Level

The following timeline highlights Indiana’s approach to launching its Paths to QUALITY QRIS.

The Paths to QUALITY initiative was launched in 2000 by the Early Childhood Alliance in Allen County, a family support organization that offers CCR&R services.

One year later, the initiative was expanded to four surrounding counties served by the Alliance, with incentives secured through local community foundations.

In 2005, 4Cs of Southern Indiana implemented Paths to QUALITY in 11 counties with the support of a local community foundation.

In May 2006, the Bureau of Child Care, Indiana Family and Social Services Administration (FSSA) convened a State Child Care Quality Rating System Advisory Group and began considering the feasibility of implementing a statewide QRIS.

In March 2007, a license agreement was signed between the Bureau of Child Care and the Early Childhood Alliance to adopt Paths to QUALITY as the State’s QRIS.

According to the FSSA Web site, 2,364 providers were enrolled in Paths to QUALITY as of September 2013. Providers include child care centers, family child care homes, and certified ministries.

Additional information is available at http://www.in.gov/fssa/2554.htm.

Targeting Participation in the Rollout of a QRIS in Arizona

Six hundred programs throughout the State were selected to participate in the first phase of Arizona’s Quality First. Four hundred of these programs were center-based and 200 were family child care homes, representing roughly 10 percent of the State’s centers and 5 percent of its homes. The first step in the selection process was to use the percentage of regulated settings (licensed and certified centers and homes) by region to equitably divide the available slots among regions, thus reducing geographic and rural/urban competition. Then the following selection criteria were applied, each of which carried a point value tied to the priorities of First Things First and State agencies:

  • Percentage of children enrolled in child care subsidy (in three tiers with the higher percentage earning higher priority points)
  • Percentage of children enrolled who qualify for free/reduced lunch
  • Whether the program was a full-year program
  • Whether the program was a full-day program
  • Whether the program served children on weekends or evenings
  • Whether the program had never been accredited (or had not been accredited in the last 3 years)
  • Whether the program had never participated (or had not participated in the last 3 years) in any of the State's quality improvement initiatives (such as a Self-Study program through Child Care and Development Fund monies or a United Way Hands on Quality initiative)
  • Whether the program served infants or toddlers

These criteria were used to rank applicants within a region from highest to lowest point value.
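
The guide does not publish Quality First’s actual point values, but the ranking mechanics can be illustrated in code. The sketch below is a hypothetical illustration only: every criterion weight, tier cutoff, and field name is an invented placeholder, not a value used by First Things First or State agencies.

def subsidy_points(pct_subsidy):
    # Three hypothetical tiers: a higher share of children on subsidy earns more points.
    if pct_subsidy >= 0.50:
        return 30
    if pct_subsidy >= 0.25:
        return 20
    return 10

def score(program):
    # Sum placeholder points across the selection criteria listed above.
    points = subsidy_points(program["pct_subsidy"])
    points += 15 if program["pct_free_reduced_lunch"] >= 0.40 else 0
    points += 5 if program["full_year"] else 0
    points += 5 if program["full_day"] else 0
    points += 5 if program["weekend_or_evening"] else 0
    # Priority went to programs without recent accreditation or quality-initiative history.
    points += 10 if not program["accredited_in_last_3_years"] else 0
    points += 10 if not program["quality_initiative_in_last_3_years"] else 0
    points += 10 if program["serves_infants_or_toddlers"] else 0
    return points

def select_within_region(applicants, slots):
    # Rank a region's applicants from highest to lowest total points and fill its slots.
    return sorted(applicants, key=score, reverse=True)[:slots]

In this sketch, select_within_region would be run once per region with that region’s equitably divided slot count, mirroring the two-step process described above.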

Additional information is available at http://www.azftf.gov/WhatWeDo/Programs/QualityFirst/Pages/default.aspx.

The length of time a State will maintain its QRIS in a pilot phase is often determined by the amount of financial resources; stakeholder, participant, and community support; and whether the goals for the pilot have been met. Pilots can grow slowly into larger systems by adding new communities, additional provider types, or new or expanded quality levels. For example, some States began their QRIS with a focus on Star 1 and then opened participation for other star levels as the system grew. Pilots can last from a few months (Pennsylvania) to 1 or 2 years (Ohio, Missouri, Kentucky, Delaware) to multiple years (Indiana and Virginia).


Multiyear Rollout of Mississippi System

The Mississippi Child Care Quality Step System used a 4-year pilot phase and implemented the program in cohorts as it expanded across the State. During the pilot phase, 80 centers volunteered to be rated. Of those, 78 percent earned 1-Star ratings and 15 percent earned 2-Star ratings; three centers earned a 3-Star rating and one center earned a 5-Star rating. Enrollment more than doubled each year during the pilot phase, with 31 centers in the first cohort, 66 centers in the second, and 153 centers in the third. Thirty-six percent of the State’s child care centers are participating. The Mississippi State University Early Childhood Institute provided technical assistance during the development and implementation of the system and continues to enroll and rate centers. Additional information is available at http://earlychildhood.msstate.edu/programs/qualitystars/.

The goals set for the pilot by the State and its partners will influence what data will be collected and by whom, how the data will be recorded, and how they will be analyzed and used for adjustments and refinements. QRIS standards are generally informed by and aligned with existing standards, such as licensing, national accreditation, Head Start, prekindergarten, or State early learning guidelines. The pilot is often used as a way to test how best to align and layer all of these standards into a common QRIS. The procedures for applying for the program, conducting ratings, reviewing documentation, assigning levels, and communicating outcomes can be tested in the pilot as well.

Types of data to be collected in a pilot might include:

  • Participation rates—overall rates, as well as by facility type, size, level, and geographic location
  • Percentage of children served in the QRIS programs
  • Percentage of providers that are able to meet various criteria (such as degree requirements)
  • Utilization rates for incentives and support services, such as professional development or training opportunities, technical assistance supports, or financial incentives
  • Subsidy participation rates for participating providers
  • Participation rates at varying levels of quality
  • Baseline data from assessment tools
  • Parent/consumer awareness of QRIS
  • Feedback from providers on clarity and ease of process and forms/documents


Data can be collected in a variety of ways and from a variety of sources. The centers and homes involved in the pilot can provide critical feedback through self-assessments, self-reporting, and documentation. The staff involved in managing the pilot can collect feedback through interviews, observations, and document reviews in such areas as the clarity of explanatory documents, standards, and the application process; sources of evidence or documents to include or accept; the amount and complexity of paperwork; time required to complete various requirements; and availability/accessibility of appropriate training opportunities.

It is important to consider a State’s capacity to gather appropriate and sufficient data to assign accurate ratings, redesign standards, implement procedures, or develop or change providers’ supports. Gathering data that seems “interesting” is only a worthwhile exercise if it is used at some point to inform the system. Otherwise, the process can become costly and frustrating, and can be perceived as unresponsive. Many States have engaged researchers to evaluate their QRIS pilots. These individuals can be helpful in selecting the most appropriate data elements for monitoring and implementation as well as for process and formative evaluation.


New York Field Test Informs Revisions

A field test of QUALITYstarsNY, coordinated by the New York Early Childhood Professional Development Institute, City University of New York, was completed in 2010. The goals of the field test were to:

  • Evaluate the ease and efficiency of the process of QUALITYstarsNY’s application, documentation, and assessment system under a variety of community conditions (high/low presence of quality improvement supports, geography, program setting types, demographics of children).
  • Validate the standards and the rating scale, i.e., determine whether the point weighting is accurate and whether the star ratings distinguish levels of quality.
  • Demonstrate the value/use of community supports for quality improvement.
  • Gather information about what kinds of improvements programs plan to make to move up in the system. This was done to inform content and the nature of later support efforts.

An independent evaluation was conducted as part of the field test to assess the validity and reliability of the draft program standards. The evaluation data informed decisions necessary for the statewide implementation of QUALITYstarsNY. Based on the field test, the standards for center-based and family-based programs were revised to better reflect feedback from programs and providers. New York also has standards for public schools and has tested the draft standards for school-age child care programs in some programs across the State. Additional information is available at http://qualitystarsny.org.

Redesigning Montana's QRIS for Expansion

Montana’s Star Quality Child Care Rating System has been operating since 2002; an inclusive and broad-based participatory review began in late 2007. The Stars redesign process has become the State’s strategic plan for all early care and education, not just subsidized child care. The goal is to have the professional development and infrastructure support to help providers increase quality, whether or not they are formally enrolled in Stars. The field test of the new system began in June 2010. In May 2012, at a STARS event for directors, there was a unanimous vote to extend the field test. The Early Childhood Services Bureau (ECSB) received additional, unexpected funding in early 2013, which allowed for the planning and implementation of Phase II of the field test. Both the center matrix and the family/group matrix were updated for Phase II. All changes and updates came directly from provider and coach feedback, as well as from information and data gathered by the ECSB.

The original Star Quality system had three levels: licensing, one level above licensing, and national accreditation (National Association for the Education of Young Children, National Association for Family Child Care, Council on Accreditation). The redesign focused on adding gradual steps and increasing supports to encourage participation. The new Best Beginnings STARS to Quality system has five levels and includes the following:

  • Research-based criteria
  • Workforce support through the Montana Early Care and Education Career Path, encouraging professional development along a continuum of training
  • Maintenance of quality over time, with renewal based on validation of level and a program improvement plan
  • Monetary incentives for continual program improvement based on the level achieved
  • Resources and support to move through the levels, provided by CCR&R agencies, the Early Childhood Project, and other state-determined resources
  • Program assessment tools, including the Environment Rating Scales (ERS) and the Program and Business Administration Scales (PAS and BAS); the Center on the Social and Emotional Foundations for Early Learning (CSEFEL) Teaching Pyramid Observation Tool (TPOT) and the Pyramid Infant Toddler Observation Scale (TPITOS) are part of the coaching experience for programs using the Pyramid Model

Information about the new STARS to Quality system is available at http://www.dphhs.mt.gov/hcsd/childcare/bestbeginnings/bestbeginningsstarstoquality.shtml.

Evaluation and Piloting to Revise QRIS Standards in Rhode Island

Over several years, a broadly representative community-based group drafted standards and quality criteria for BrightStars. A pilot and random-sample evaluation was conducted by researchers from the Frank Porter Graham (FPG) Child Development Institute at the University of North Carolina. Additionally, FPG helped train BrightStars staff to collect data in a valid and reliable manner. The draft center framework included 62 criteria across 28 standards. The pilot evaluation revealed that using all 62 criteria resulted in only small quality distinctions between levels. The criteria were reviewed to ensure that each was (1) not already in State licensing, (2) actually feasible to measure, (3) supported by research related to program quality and child outcomes, and (4) able to adequately measure differences in quality. This pared the number of criteria down to 22, grouped into nine standards. Differences between the levels are now meaningful but achievable. The evaluation not only improved the BrightStars standards and measurement tool, it also provided a baseline measure of program quality in a random sample of centers, homes, and school-age programs. Additional information is available at http://www.brightstars.org/.

Once a State and its partners determine they are ready to move from pilot to statewide implementation, a detailed plan and timeline should be developed. An analysis of available funding, along with each agency’s capacity to implement and manage the system, will also be critical factors in this process.

Most States subcontract the management of some QRIS components. States may have an existing system in place that can be leveraged to support the QRIS. Some States have utilized CCR&R networks and postsecondary institutions to support professional development activities. Virginia provides an example of how some States use a request for proposals process to select and engage local coalitions to manage the QRIS pilot.

One of the strengths of a QRIS is the ability to consistently engage parents through strategic messaging. Consistency can be difficult to achieve if the pilot phase is limited to a particular jurisdiction or type of care, or if multiple, but different, pilots occur at the same time. To this end, a critical consideration in the parent education component is timing. Some contend that a consumer education campaign should be launched early in the process to help build demand for the system. Others have limited marketing of the QRIS to the general public until they felt the system was fairly well established, with enough participating programs and accessibility to parents. Arizona opted to make ratings available to the public after the QRIS was well established. Ratings are currently available on Arizona's Quality First Web site, accompanied by an explanation of the rating, at http://qualityfirstaz.com/parents-and-families/.

Additional information on communicating with families is available in the "Consumer Education" section.

If a State forgoes a pilot phase and chooses instead to benefit from lessons learned in other States' pilots, it can be especially critical to engage providers and other partners and stakeholders in a strategic implementation process. Although much information can be gleaned from research and lessons learned in pioneer States, it is important to remember that each State is unique. A State must consider its landscape, history, infrastructure, and overall early and school-age care and education environment, and adapt the information to its particular set of circumstances. A State can test its QRIS standards prior to implementation by distributing them widely and seeking feedback in various ways; Web surveys can be developed for this purpose. Some States have conducted focus group discussions with parents and programs to review and revise standards as well as to discuss application and rating processes. In still other States, the QRIS plan may be implemented with a periodic review included in the plan. Additional information is available in the “Initial Design Process” section.


Oklahoma Makes Adjustments in Response to Feedback

The first QRIS was launched in Oklahoma in 1998. Reaching for the Stars included two star levels. One year later, the State funded a three-star level for programs that met two-star standards and were also nationally accredited. After 2 years of lagging participation, program designers determined that the gap between one-star licensing and two-star standards was greater than most providers could bridge. They created a midpoint, time-limited One Star Plus level that provided financial incentives and recognition for providers that needed more support to progress to higher star levels. Additional information is available at http://www.okdhs.org/programsandservices/cc/stars/.

Phasing In Programs

Although a phased-in approach may be necessary due to limited funding and staff resources or a lack of broad support, policymakers should be reminded that anticipated changes in program quality may not occur with incremental implementation. A phased-in strategy requires careful consideration of which approaches to administration, monitoring, provider supports, and incentives are most likely to be cost-effective in terms of improving quality, ensuring accountability, and increasing participation.

It is also important to realize that a limited implementation strategy is only the first step toward a comprehensive, statewide QRIS. The value of expansion to a statewide QRIS is that it allows all parents and providers to benefit, provides a consistent standard of measurement, and improves opportunities for realignment of resources.  Planning for full, statewide implementation and the projection of total costs should be part of the process, even when a phased-in approach is necessary.

Making decisions about how and when to phase in implementation of a QRIS can be guided by the cost projection process. The Cost Estimation Model (CEM) described in the “Cost Projections and Financing” section can help with projecting costs at scale and can guide decisions regarding where and when to reduce costs, if necessary. It is possible to develop multiple cost projections for a statewide program using the CEM. Projections can be made for strategies such as the following (a simple illustration of comparing such projections appears after the list):

  • A comprehensive plan that anticipates full funding for the next 5 years for each component of a fully implemented QRIS.
  • A midrange or scaled back plan to get started and build support for future expansion, e.g., limited participation, reduced provider incentives.
  • A basic program with fewer provider supports and incentives and fewer accountability measures.
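
The CEM itself does the detailed work; as a rough, hypothetical illustration of how such scenario projections can be compared, the sketch below computes 5-year totals for three placeholder strategies. All participation counts, incentive amounts, administrative costs, and the growth assumption are invented for illustration and are not CEM outputs.

# Not the actual Cost Estimation Model: a minimal sketch comparing 5-year cost
# projections for three hypothetical implementation strategies. All figures
# (provider counts, incentives, administrative costs, growth rate) are invented.
SCENARIOS = {
    "comprehensive": {"providers": 2000, "incentive": 5000, "admin": 1_500_000},
    "midrange": {"providers": 1000, "incentive": 3000, "admin": 1_000_000},
    "basic": {"providers": 1000, "incentive": 1500, "admin": 600_000},
}

def five_year_cost(scenario, growth=0.10):
    # Total cost over 5 years, assuming participation grows 10 percent per year.
    total = 0.0
    providers = scenario["providers"]
    for _ in range(5):
        total += providers * scenario["incentive"] + scenario["admin"]
        providers *= 1 + growth
    return total

for name, scenario in SCENARIOS.items():
    print(f"{name}: ${five_year_cost(scenario):,.0f} over 5 years")

Comparing the totals side by side makes the tradeoffs explicit, e.g., how much of a comprehensive plan's cost comes from provider incentives versus administration.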

In addition to projecting the cost of various implementation strategies, several other factors may influence decision-making about when to fully implement a QRIS. These include:

  • Rate at which changes are made to QRIS standards or criteria. Changing them too quickly after implementation may be difficult for providers and could erode their trust in the system and their sense of success and confidence. Generally, States revise a QRIS about every 3 to 5 years. Small changes can be made annually, especially ones that are responsive to participant feedback.
  • Financial incentives and supports. Making a range of financial incentives and provider supports available early on is likely to increase participation among providers. Limiting or targeting incentives and supports is likely to slow participation growth.
  • Level of participation. Early and strong participation will affect how people view the success and value of the program and is likely to help build support for increased funding.

A phased-in approach can take several forms.

  • Limiting initial participation, e.g., implementing with child care centers and not family child care homes. Pennsylvania and Virginia took this approach.
  • Implementing fewer than the anticipated number of levels, e.g., levels 1–3 of a 5-level system.
  • Beginning with a limited number of provider resources and incentives. North Carolina, Pennsylvania, and Oklahoma initially took this approach. For example, Oklahoma had quality improvement grants and scholarships available when it launched its Reaching for the Stars initiative in 1998. Over the next 10 years, in response to demonstrated need, the State added a wage supplement program, onsite technical assistance, specialized consultation, a director’s leadership academy, and training on the environment rating scales.
  • Targeting provider outreach, incentives, and supports to particular communities or providers, such as those serving large numbers of low-income children. Colorado's Qualistar QRIS took this approach.
  • Relying only on administrative data (e.g., links to data from licensing, a professional development registry, or another third-party source) and self-assessments rather than requiring the collection of new data, or limiting time spent onsite (e.g., conducting environment rating scale classroom assessments only when providers apply for higher quality levels). When Pennsylvania’s process evaluation revealed that making technical assistance "responsive" rather than automatic was a much wiser use of resources, taking the pilot statewide became more feasible.


Field Test and Phase-in of Oregon Program of Quality (OPQ)

Oregon developed a State designation of quality that serves as a stepping stone between program licensure and national accreditation.  

OPQ is designed to:

  • improve program quality,
  • recognize higher quality programs, and
  • increase the number of programs eligible to partner with Head Start and Early Intervention.

OPQ standards resulted from a crosswalk of six common areas across:

  • Head Start Performance Standards,
  • NAEYC accreditation standards,
  • Oregon’s early learning guidelines, and
  • State licensure requirements. 

The field test began in January 2011 with a cohort of 25 diverse programs from across the State. Programs are required to complete orientation, develop a quality improvement plan, and submit a portfolio demonstrating how they meet the standards at the end of a 7-month period. Participating programs receive customized technical assistance and up to $5,000 in quality improvement awards. The second cohort was expected to begin the process in late 2011.

Washington State Development and Phase-In of Early Achievers QRIS: Lessons Learned

Washington State followed a careful development process for its QRIS: a field test in five communities and counties from 2008 to 2010, infrastructure development in 2011, the start of regional rollout in 2012, and statewide rollout in 2013. The Department of Early Learning built the QRIS on licensing as the foundation, working with licensors to get them on board. The Department partnered with Child Care Aware of Washington on regional implementation, reaching out to recruit programs and then offering help in the form of technical assistance, training, and coaching. Another key partner, the University of Washington, developed the Early Achievers Coach Model and trains coaches, conducted an evaluation of the QRIS process, and is collecting data on the system. One key piece of advice Washington gives other States is to start building a data system from day one to protect the integrity of the QRIS. Another lesson learned: the State started with more standards than it has now, but found that it was measuring some elements twice and that other elements could not be measured. Washington advises other States to define their goals first and then develop standards whose measurement will assess achievement of those goals. Additional information on Early Achievers is available at http://wa.childcareaware.org/providers/EA.