
QRIS Resource Guide

Quality Assurance and Monitoring

When a State decides to pursue a quality rating and improvement system (QRIS), it is important to engage providers, partners, and other stakeholders in a strategic process to determine appropriate policies and procedures for accountability and monitoring.

This section addresses the areas of documenting compliance with the standards and criteria, determining the rating levels, deciding how frequently they will be determined, choosing which assessment tools to use, monitoring the rating, and facing a possible loss or reduction of a rating level.


Documenting Compliance

The compliance criteria for each standard define what a program must do to achieve a particular level, to move to the next level, or to earn points in a specific category. Documentation for meeting QRIS standards can take the form of a checklist, a self-report or self-assessment, presentation of materials, or an observation or assessment. It is very important that each criterion and the forms of acceptable documentation be clearly defined. Interviews and conversations with providers and interested stakeholders during the design phase will help identify requirements and processes that are unclear or insufficiently defined.

Idaho, Kentucky, New Hampshire, New York, and Pennsylvania are among the States that include checklists, listing sources of evidence, on their Web sites.

  • IdahoSTARS QRIS STAR Rating Verification for Family and Group Homes includes an example of the checklist an IdahoSTARS Assessor will use to verify a STAR rating: http://idahostars.org/sites/default/files/documents/qris/qris_star_rating_verification_home_group.pdf
  • Kentucky's procedures manual includes the evidence the Quality Enhancement Initiative (QEI) staff looks at to verify compliance. The STARS for KIDS NOW Standards Checklists are included in the appendix. http://chfs.ky.gov/NR/rdonlyres/A1AE772D-AE26-455A-A5AA-7A4605A256DE/0/STARSOperationsManual.pdf
  • New Hampshire Licensed Plus Quality Rating System Option 1 Standards includes a column that specifies the type of documentation that is required to verify compliance with the standard: http://www.dhhs.state.nh.us/dcyf/licensedplus/documents/option1standards.pdf
  • New York's QRIS standards include a documentation checklist next to each standard. New York has developed an on-line Resource Guide that provides details about the documentation requirement and includes samples of acceptable documents. Upon clicking on the standard, additional information on that standard becomes available, including clarifications, key documents and links to additional reading. The Resource Guide is available at http://qualitystarsny.org/standardsguide.php.
  • Pennsylvania has provider worksheets for each STAR Level. The worksheets are organized in three columns: the standard, pre-designation notes (completed by the program) and designation notes (completed by STARS staff). They serve not only as a way to identify the sources of evidence necessary to comply with the standards, but also as a roadmap to prepare for STAR Level designation. The STAR 1, 2, 3 and 4 Worksheets are available on the PAKeys website, at http://www.pakeys.org/pages/starsDocs.aspx. Separate worksheets are available for school-age care programs.

Many States have glossaries, or definition pages, to more fully define and explain the criteria. They also have companion pieces, such as an application manual (Maine), or a program guide (Delaware) that help the providers and other interested individuals better understand the QRIS. See the “Standards and Criteria” section of this guide for additional information.

As QRIS evolve in a State, documentation requirements may change or need clarification. Any changes need to be communicated to all stakeholders. As participation in the QRIS increases, the capacity of the documentation and assessment system must increase accordingly. The goal remains to make accurate verification and timely rating decisions.

 

Oklahoma Responds to Unintended Consequences in Its QRIS

Oklahoma’s Reaching for the Stars QRIS policy and procedures are specific and detailed so that staff and providers understand the process. This is essential because of the significant financial consequences of star status on tiered reimbursement rates. Because it is difficult to evaluate a program when it first opens, the QRIS policy initially stated that a program could not apply for a higher star level until it had a full license, generally after 6 months of operation. This imposed a hardship on new programs as well as on existing child care centers, particularly if there was a change of ownership. Under new ownership, the tiered rates dropped dramatically, jeopardizing the continued quality of the center. As a result, the policy was changed to allow new programs with an initial permit to participate. Additional information is available at http://www.okdhs.org/programsandservices/cc/stars/

Some States permit multiple methods to demonstrate compliance with QRIS standards. One area where States frequently accept equivalencies is educational qualifications and attainment.

 

National accreditation is another standard that is often used as an equivalent measure in a QRIS. States that incorporate national accreditation systems into their QRIS generally treat accreditation as equivalent to, or required for, higher levels of quality. Most States accept more than one national accreditation and typically base this decision on a comparison of the accreditation standards with their QRIS requirements. Several States have developed online crosswalks that compare several sets of standards (e.g., State standards, accreditation, and Head Start Performance Standards). The National Center on Child Care Quality Improvement provides an online National Program Standards Crosswalk Tool, designed to help States that are developing and aligning program standards for licensing, quality rating and improvement systems, and/or prekindergarten programs to search and compare the content of several sets of national standards. The Crosswalk Tool is accessible at https://occqrisguide.icfwebservices.com/.
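The crosswalk idea described above can be illustrated with a small sketch: each set of standards is modeled as a mapping from topic to requirement text, and the crosswalk lines the sets up side by side so gaps are visible. All topic names and requirement texts below are hypothetical examples, not taken from the actual Crosswalk Tool or any State's standards.

```python
# Hypothetical standards sets: topic -> requirement text.
qris = {
    "staff_qualifications": "Lead teacher holds a CDA or higher",
    "learning_environment": "ERS score of 4.0 or above",
    "family_engagement": "Annual family conference offered",
}
accreditation = {
    "staff_qualifications": "75% of lead teachers hold a degree",
    "learning_environment": "Accreditation site visit every 5 years",
    "health_and_safety": "Written emergency preparedness plan",
}

def crosswalk(*standard_sets):
    """Return topic -> list of requirement texts across all sets,
    with None where a set does not address that topic."""
    topics = sorted(set().union(*standard_sets))
    return {t: [s.get(t) for s in standard_sets] for t in topics}

for topic, rows in crosswalk(qris, accreditation).items():
    print(topic, rows)
```

A `None` entry in a row flags a topic one set covers and another does not, which is the kind of gap a State would examine when deciding whether accreditation can stand in for a QRIS standard.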

States may decide to include standards in addition to national accreditation if they determine that certain standards are not sufficiently incorporated in the accreditation system or are not monitored frequently enough. An example is the requirement for program assessments, such as the Environment Rating Scales. Information about the use of accreditation and the accreditation organizations accepted by each statewide QRIS is available in Accreditation Accepted for QRIS at https://occqrisguide.icfwebservices.com/files/QRIS_Accred_Accepted.pdf.

Most QRIS include a range of choices to demonstrate compliance in the areas of staffing standards, such as accepting different types of professional development activities, as well as various ways to meet the compensation and benefits standards. The family involvement standards component is another area where choice is the norm; most States permit QRIS participants to identify a range of acceptable parent activities and supports. When States are considering multiple ways to demonstrate compliance, they can consider such questions as:

  • If there is another way to document compliance, is it equivalent to the competencies required in the QRIS?
  • Do providers have access to programs and supports that will help them demonstrate compliance? If not, does the State have the capacity to make them available?
  • If providers can seek validation from an outside group, association, or system to document compliance, does the outside entity have the capacity to meet the provider requests in a timely manner?
  • Are there financial implications for the State or the provider associated with alternate pathways?

Alternative Approaches to Meeting Requirements in Louisiana

In the Louisiana QRIS, Quality Start, there are four alternatives for Directors to meet the requirement for 3 semester hours in administration training at the 2 Star level: (1) Louisiana Administrator Certificate, (2) National Administrator Credential, (3) 3 years of experience in administration, or (4) combination of 1 year of experience in administration and 4 years of teaching young children in an early childhood program. Louisiana’s Quality Start standards are available at http://www.qrslouisiana.org/child-care-staff/pathways-child-care-career-development-system.

Alternative Approaches to Meeting Requirements in Iowa

In the Iowa Quality Rating System, a center director can meet a professional development standard by having a current National Administrator Credential, an Aim4Excellence credential, or a license as a Pre-Kindergarten principal issued by the Board of Educational Examiners. Iowa also accepts different criteria at the Level 1–Licensing Level: full license OR a provisional license with no action to revoke or deny OR operates under the authority of an accredited school district or nonpublic school. Additional information is available at  http://www.dhs.state.ia.us/iqrs/.

Ideally, if a State has an integrated, comprehensive early and school-age care and education system, documentation from various components of the system could be shared to verify compliance with QRIS standards. Sources could include licensing data, a professional development registry, the Head Start Program Review Instrument for Systems Monitoring data system, accreditation monitoring data, or prekindergarten program monitoring data. The Maine QRIS was designed to maximize data from licensing and Maine Roads to Quality, the State’s professional development system.

 

As States develop new or expanded data systems, a challenge and an opportunity exist in cross-sector and cross-system utilities. The Pennsylvania Enterprise to Link Information for Children Across Networks (PELICAN) is one example of a sophisticated, integrated child and early learning system. This system draws data from subsidy, licensing, Pennsylvania Pre-K Counts, the Pennsylvania Keys to Quality QRIS, and other Pennsylvania early learning programs into one integrated data system. Automation offers exciting opportunities to create user-friendly systems that not only draw data from multiple sources, but also use these data to help inform consumers.

Online Application Simplifies Process in Maine

For its QRIS, Quality for ME, Maine created an online application process, which is linked to the State's automated professional development registry. Providers begin the application process by entering their six-digit, unique child care license number. This number enables access to the system and automatically triggers the Maine Roads to Quality (MRTQ) Professional Development Registry records for the site. Providers verify or update these records, and go on to respond to queries that request the additional information required to determine their quality level. The MRTQ Registry provides accreditation and Head Start data, which are also used to help determine a quality level. Upon completion of the application, the system generates a report that includes a brief overview of the quality level the provider is likely to receive based on the information entered. It also tells the applicant what is missing, as well as what the program would need to do to move to the next step in the QRIS. This report is then sent to Department of Health and Human Services staff to verify licensing compliance data and provide any other necessary approvals. Filling out the application is meant to be an educational experience for providers. The online system allows the user to hold the cursor over words to bring up pop-up boxes with definitions and other helpful information, including examples of policies and practices that meet the QRIS standards (e.g., a model parent handbook or classroom planning tool). Additional information is available at http://www.maine.gov/dhhs/ocfs/ec/occhs/qualityforme.htm.
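The workflow Maine describes, a license number keying into registry records and producing a draft report with a likely level plus gaps, can be sketched in a few lines. Everything here is illustrative: the registry fields, level rules, and thresholds are invented for the example and do not reflect Maine's actual data or rating logic.

```python
# Hypothetical registry records, keyed by license number.
REGISTRY = {
    "123456": {"director_credential": True, "annual_training_hours": 12},
}

# Hypothetical level rules: booleans must be True; numbers are minimums.
LEVEL_RULES = {
    2: {"director_credential": True},
    3: {"director_credential": True, "annual_training_hours": 20},
}

def draft_report(license_number):
    """Estimate the likely level and list what is missing for the next one."""
    record = REGISTRY[license_number]  # auto-loaded registry data
    likely_level, missing = 1, []
    for level in sorted(LEVEL_RULES):
        gaps = []
        for key, required in LEVEL_RULES[level].items():
            value = record.get(key)
            if isinstance(required, bool):
                if value is not True:
                    gaps.append(key)
            elif value is None or value < required:
                gaps.append(key)
        if gaps:
            missing = gaps
            break
        likely_level = level
    return {"likely_level": likely_level, "next_step_gaps": missing}

print(draft_report("123456"))
```

With the sample data, the provider meets the (hypothetical) Level 2 rules but falls short of the training-hours minimum for Level 3, so the report names that gap, mirroring how Maine's system tells applicants what they would need to do to move to the next step.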

Delaware Alternative Pathway Avoids Duplication of Documentation

National Association for the Education of Young Children (NAEYC) accredited programs, once accepted into Delaware Stars, must provide a copy of their accreditation and complete the Delaware Stars Orientation specific to NAEYC programs.  Once complete, NAEYC programs are designated Star Level 5 and may keep their Star Level by maintaining their accreditation and submitting copies of their annual NAEYC reports to Delaware Stars.  NAEYC programs are assigned a Technical Assistant (TA) and are required to submit an annual report to Stars.  Head Start and Early Childhood Assistance Programs (ECAP), once accepted into Delaware Stars, must provide a copy of their most recent monitoring report and complete the Delaware Stars Orientation specific to Head Start/ECAP programs.  Once complete, programs are designated Star Level 4 and may keep their Star Level by maintaining good standing with Head Start and ECAP monitoring standards and by submitting copies of their annual monitoring reports to Delaware Stars.  Programs may choose to maintain at Star Level 4 or move up to a Star Level 5.  To achieve Star Level 5, programs must request an ERS assessment and meet the minimum required classroom scores for that level.  Programs are eligible for technical assistance when actively working on achieving Star Level 5.  The 2012 Program Guide is available at http://www.delawarestars.udel.edu/wp-content/uploads/2012/08/DEStarsProgramGuideSeptember2012R.pdf.

Several States have found it helpful to have programs prepare for their QRIS evaluation by completing rating readiness tools. Washington describes this process as follows (http://www.del.wa.gov/publications/elac-qris/docs/EA_operating_guidelines.pdf):

As part of evaluation preparation, facilities will work with their local lead agency staff to complete a Rating Readiness Tool. The tool is a checklist created by the [University of Washington] that helps facilities and the evaluation team plan for a successful, efficient on-site evaluation visit. The tool collects facility information for the evaluation team including:

  • Site map, classroom schedules, and other relevant facility details
  • Confirmation that facility has collected consent from all families
  • Location of documentation and files for Data Collector review
  • Which Quality Standard components the facility plans to demonstrate during the on-site evaluation visits

Following are some additional examples of ways in which States help programs determine their readiness to be rated.

Some States use Level 2 as an assessment of rating readiness, especially when Level 1 is licensing.  Level 2 standards can serve as an assessment of whether a program is meeting slightly higher standards than licensing.  Most of these standards do not require onsite verification (this might include, for example, a personnel policy manual, parent manual, or staff qualifications). These States typically do not use a formal observation tool for rating purposes before Level 3.  Programs receive more intensive coaching at levels 3 and higher, once they have demonstrated the ability to meet the basics.

In Pennsylvania, monitoring compliance for the lower levels, Start with STARS and STAR 1, is a paperwork process, whereas STARS 2 through 4 require an annual onsite review of standards compliance by Pennsylvania Keys to Quality regional program staff.

Frequency of monitoring is often determined through examination of several factors:

  • Available financial resources
  • Availability of staff with appropriate skills, knowledge, and time to perform functions
  • Determinations related to validity and integrity of data collection
  • Connections to other systems and their monitoring and compliance processes

Most States monitor compliance with their QRIS standards annually. Others monitor more frequently. Oklahoma monitors programs for licensing three times per year, and any QRIS violations observed during those visits are documented. Overall QRIS compliance is reviewed annually, whereas environment rating scale (ERS) assessments are conducted only once every 3 years.

The method and frequency of monitoring may vary by standard. Some standards, such as current staff qualifications, may need to be verified only once, as long as the staff and their qualifications remain unchanged. Other standards, such as professional development requirements for ongoing training, need to be checked annually. This can be accomplished through a paperwork process, verification through training organizations, or data imported from registries. Other standards, such as implementing curriculum or posting lesson plans, may require onsite observation. Monitoring or verification may also be triggered under certain circumstances, such as staff changes, particularly a change of director, or serious licensing violations.

Tennessee's Use of Unannounced Monitoring Visits (UAV)

Tennessee conducts unannounced visits because it recognizes that a program can change quickly. All licensed agencies must receive one announced visit and a minimum of four unannounced visits each year. The number of UAVs each agency receives is determined by its “star rating.” The UAV schedule for agencies operating year round is as follows:

  • New agencies and agencies eligible for 0 stars = six UAVs/licensing year
  • Agencies eligible for one star = five UAVs/licensing year
  • Agencies eligible for two or three stars = four UAVs/licensing year

Additional information is available at  http://www.tennessee.gov/humanserv/adfam/ccrcsq.html.

Determining the Rating Level

Identifying the entity(ies) with capacity to effectively administer a QRIS over time is a central issue to consider in the design phase. Most statewide QRIS are administered by a State agency in partnership with private sector entities. The QRIS administrator typically performs several basic functions, including:

  1. Initially assessing program quality and assigning a QRIS level;
  2. Monitoring compliance to ensure system integrity;
  3. Conducting classroom assessments (using the ERS, the Classroom Assessment Scoring System [CLASS], or another instrument);
  4. Providing training and technical assistance; and
  5. Managing system planning, engagement, and outreach (e.g., data collection and analysis, Web design and upkeep, marketing development and public information dissemination, etc.).

In most cases, each of these functions is the responsibility of different staff members, many of whom may be with contracted agencies or privately funded partners. Several States use State agency employees for functions 1 and 2 (assigning the initial rating and monitoring compliance) and contract with outside entities for functions 3 and 4 (conducting classroom assessments and providing training and technical assistance). However, these staffing patterns vary and are often influenced by available funding and current staffing needs and resources. For validity of the system, it is important to separate the functions of conducting assessments and providing technical assistance. In other words, technical assistance providers should not also be responsible for assessing programs.

Licensing staff in North Carolina, Oklahoma, and Tennessee monitor QRIS criteria, and a separate team of assessors conducts the ERS assessments. In Oklahoma, the ERS assessors are housed in the University of Oklahoma’s Center for Early Childhood Professional Development.

Arizona's Quality First QRIS is administered by First Things First (a governmental agency funded with tobacco tax). Participation in Quality First begins with an initial assessment, during which a Quality First assessor visits the program to observe classrooms and interview teachers.

All programs enrolled in Quality First receive a coach, who visits the program on a regular basis and supports programs with technical assistance. The coach reviews scores from program assessments and helps the program create a plan for improvement. Additional information is available at http://qualityfirstaz.com/providers/.

Washington’s Department of Early Learning partners with the University of Washington (UW) to administer Early Achievers QRIS (http://www.del.wa.gov/care/qris/). The University is the lead agency for evaluation, assessment and rating assignment. Data Collectors from UW conduct facility on-site evaluation visits. The University is also responsible for the development of the Early Achievers Coach Framework. Washington designates several roles responsible for monitoring and supporting providers in their QRIS:

  • Regional Coordinator: Approves or denies a program's request to be rated;
  • Community Liaison: A member of the University of Washington (UW) evaluation team, the Community Liaison supports the facility and the data collectors to have a successful visit;
  • Coach: Also with UW, the Coach participates in ongoing professional development and consultation with the program;
  • TA Specialist: With the local lead agency, the TA Specialist works with the program to develop a work plan and timeline; and
  • Data Collector: With UW, the Data Collector collects data through observations, interviews and reviews of records and documentation. The Data Collector also administers the CLASS and ERS.

The Office of Planning, Research and Evaluation’s publication titled Defining and Measuring Quality: An In-Depth Study of Five Child Care Quality Rating and Improvement Systems (2011) details the number, caseload and qualifications of QRIS raters and assessors in five Quality Rating and Improvement Systems (Miami-Dade County, Illinois, Indiana, Pennsylvania, and Tennessee). The publication is available at http://www.acf.hhs.gov/programs/opre/cc/childcare_quality/five_childcare/five_childcare.pdf.

In many States, CCR&R agencies play a key role in QRIS administration and often coordinate QRIS training and technical assistance. Institutions of higher education are also important partners and frequently assume responsibility for classroom assessment as well as help with data collection. Public-private partnerships, such as early and school-age care and education advisory committees, are often charged with planning, engagement, and outreach functions. In short, QRIS implementation is often a team effort.

State experience suggests that the decision to use State licensing or other staff to assign ratings, and outside entities, such as CCR&R agencies, institutions of higher education, Cooperative Extension, and others, to assist with training and technical assistance, is often a strategic way to build on and expand current investments and maximize all available early and school-age care and education dollars.

 

Shared Management Approach in Virginia's QRIS Pilot

Virginia’s Star Quality Initiative (VSQI) is a public-private partnership between the Virginia Early Childhood Foundation (VECF) and the Office of Early Childhood Development (OECD), a division of the Virginia Department of Social Services. The administrative partnership of these two entities is referred to as “the Hub,” while each partner maintains clear responsibilities. The Office of Early Childhood Development’s responsibilities include:

  1. Facilitate selection and training of raters
  2. Set criteria and training standards for mentors
  3. Review applications for each participating program
  4. Conduct documentation reviews for Standards 1, 3, and Transition Practices
  5. Maintain the QRIS online database
  6. Provide technical assistance to regional and local coordinators, raters, mentors, and participating providers
  7. Conduct strategic decision-making and planning for the initiative
  8. Manage proposals and monitor contract implementation/compliance

The Virginia Early Childhood Foundation’s responsibilities include:

  1. Maintain the quality and integrity of the standard and rating process:
    • Oversee the inter-rater reliability process
    • Review Program Rating Summary Reports
    • Award star ratings
    • Manage the appeal process
  2. Provide resources to help regional and local coalitions implement QRIS
  3. Provide technical assistance to regional and local coordinators, raters, mentors and participating providers
  4. Share responsibility with VDSS/OECD for strategic decision-making for the initiative

Additional information is available at http://www.smartbeginnings.org/Home/StarQualityInitiative/ForEarlyChildhoodProfessionals.aspx.

In most States, licensing is an integral part of the QRIS, serving as the foundation on which other standards build. Frequently, the QRIS is monitored by the licensing agency alone or in partnership with other agency staff or a private entity. Using licensors who are already funded to make periodic visits to programs makes good fiscal sense. Strategically linking QRIS to licensing could provide an opportunity to increase the number of licensing staff, reduce caseloads, and broaden their role. For example, Oklahoma added 27 licensing staff when they became responsible for monitoring QRIS compliance. However, an assessment must be made to determine whether the licensing system can adequately support this new responsibility. If a licensing program is unable to adequately monitor child care, or if its sole focus is on enforcement, it will face greater challenges in monitoring a QRIS. If licensing managers are included early in the QRIS planning process, they can make valuable contributions to this discussion.

If licensing staff are not required to have any background in early or school-age care and education, it will be more difficult for them to provide the encouragement and support providers need to participate in a QRIS. Trends in Child Care Center Licensing Regulations and Policies for 2011 (NCCCQI, 2013) notes 34 States where licensing staff provide technical assistance or consultation to help licensees move beyond minimum licensing standards. Most States (38) require staff to have at least a bachelor’s degree to work as a licensing specialist; however, only 16 States reported that the content or major of the required degree or coursework must be in early childhood education, child development, or a related topic. Twenty-two States require experience working in a setting with children. Although 25 States require licensing line staff to complete training each year, only 13 mandate additional training in early childhood education.

States have policies and procedures for renewing rating levels, and several States also set a time limit on how long a provider can remain at one rating level. During renewal, providers generally can earn higher or lower ratings, based on the standards they meet, or they can keep their current rating levels.

When discussing QRIS ratings, it is important to differentiate between two separate, but interrelated, functions: assigning a rating and conducting a classroom or home assessment. Most States use classroom or home assessments, such as the ERS, as one—but not the only—tool to assess compliance with QRIS criteria in the area of learning environments. These two functions can occur on the same cycle, such as annually, or they can occur at different points in time. Typically, States assign ratings and conduct classroom assessments annually.

  • In Delaware, a program accepted for enrollment has one year to complete the steps required for Star Level 2 or the program will be terminated and must reapply for enrollment in Delaware Stars. After a program achieves Star Level 2, they must re-verify for their current Star Level every two years if maintaining or actively working to achieve a higher Star Level.
  • Illinois requires renewal every 3 years, including reassessment on ERS, BAS, PAS, or CLASS. Rated programs must provide an annual report including licensing compliance, staffing requirements, etc. to maintain the rating.
  • North Carolina assigns ratings every 3 years and monitors annually for maintenance of ratings. A reassessment of the rating may also be conducted before the 3-year time period if the annual monitoring identified certain indicators, e.g., high staff turnover, a new director, or serious licensing violations. A program may also request a rating reassessment once a year if it anticipates its rating will improve.
  • In Ohio, providers must renew their ratings on an annual basis. At that time they can apply for higher ratings if they meet the standards, maintain their current ratings, or reduce their ratings if standards for higher ratings are no longer met. A rating will expire if a provider does not apply to have its rating renewed. Prior to the rating expiration date, providers receive written notification and instructions for renewing their ratings.
  • In Oklahoma, the license and star status are nonexpiring, based on documented compliance at monitoring visits that occur at least three times per year.  Providers can be at the One Star Plus level as long as they are working toward an education component needed to achieve higher standards and move to the Star Two level.
  • Maine assigns ratings every 3 years but only requires classroom assessment in sites that are selected to participate in the QRIS evaluation.
  • Providers in Rhode Island agree to participate in BrightStars for a 3-year period. At the end of that period, providers choose to continue and be reassessed or terminate participation without penalty. If a provider chooses to terminate participation during the 3-year period, the provider will not be allowed to participate again for 24 months from the time of termination. To maintain their rating, providers submit annual reports to BrightStars and may request an adjustment to their rating once a year.
  • Ratings in Vermont are valid for 3 years if the annual report documents maintaining the standards for this rating. If a provider is unable to maintain the standards of its current star level, no action is taken until the provider submits the annual report. Providers request points based on the standards they can demonstrate in the report. Participating providers renew their certificate for the number of points they can verify (may be higher, same, or lower than previously earned points). Reminders to submit the annual report are automatically sent to participating programs and must be returned in order to maintain participation and star level.

Decisions regarding how often QRIS ratings are assigned, as well as how frequently classroom assessments are conducted, will be influenced by available resources. Conducting reliable, valid classroom assessments can have a significant financial impact. In addition to the time it takes to actually conduct an assessment, write up the results, and travel among multiple sites, time and funding must be made available to ensure that raters receive appropriate training and that inter-rater reliability is assessed on a regular basis. Additional information is provided under the question, “What assessment tools will be used? How will they be used?”

As noted earlier, QRIS compliance is typically based on a number of factors, only some of which are determined by a classroom assessment.

Most of the States that require a classroom assessment to evaluate program quality currently use the ERS developed by the Frank Porter Graham Child Development Institute at the University of North Carolina at Chapel Hill. These scales include:

  • Early Childhood Environment Rating Scale-Revised (ECERS-R)
  • Infant/Toddler Environment Rating Scale-Revised (ITERS-R)
  • School-Age Care Environment Rating Scale (SACERS)
  • Family Child Care Rating Scale (FCCRS).[1]

Each of the scales has seven areas of evaluation: physical environment, basic care, curriculum, interaction, schedule, program structure, and parent and staff education. Quality in Early Childhood Care and Education Settings: A Compendium of Measures (Halle, Vick Whittaker, & Anderson, 2010) provides detailed information on program assessment measures, including the purpose of the measure, intended ages and settings, administration of the measure, and reliability and validity. This document is available at http://www.acf.hhs.gov/sites/default/files/opre/complete_compendium_full.pdf.

Some States are also using more focused assessment tools that measure interactions, classroom practice, and administrative practices in addition to or in lieu of measures of global quality:

  • Oklahoma recognizes the Child and Caregiver Interaction Scale, the Arnett Caregiver Interaction Scale, the Early Learning and Literacy Classroom Observation (ELLCO), and the Program Administration Scale (PAS).
  • In Ohio, self-assessments are required, but programs can use an ERS, the Early Learning and Literacy Classroom Observation (ELLCO), or other assessment tool, and scores are not tied to ratings.
  • Mississippi uses scores from ERS to determine rating levels.
  • In Rhode Island, CLASS scores are collected from a random sample of 33 percent of preschool classrooms. Scores were not used in the rating process during the first year of implementation.
  • Massachusetts is also among the States that require assessment with the PAS for administration. Assessment with the Business Administration Scale (BAS) is required for FCC providers.

In some cases, classroom assessments are required and the scores are used to help determine ratings. Other States have made the assessment optional—as one way to accumulate QRIS points—or require it for programs seeking higher star levels only. Some States require programs to be assessed with ERS but do not tie particular scores to the ratings. Information about the program assessment tools used by States that are implementing a statewide QRIS model is available in Use of ERS and Other Program Assessment Tools in QRIS (NCCCQI, 2013) at https://occqrisguide.icfwebservices.com/files/QRIS_Program_Assess.pdf.


Some QRIS evaluators have raised concerns about the range of tools used to measure classroom quality, as well as how these tools are implemented. The Office of Planning, Research and Evaluation’s (OPRE) brief, Issues for the Next Decade of Quality Rating and Improvement Systems (Tout, Zaslow, Halle, & Forry, 2009), cautions that even when States use the same tool, such as the ITERS-R, ECERS-R, FCCRS, or SACERS, significant variations can occur based on how the tool is applied. Indeed, State QRIS standards and compliance practices vary widely. Table 5.4 in the Compendium of Quality Rating Systems and Evaluation (Halle, Vick Whittaker, & Anderson, 2010) lists the percentages of classrooms assessed by State; the most common approach is to assess 33 percent of classrooms, including at least one classroom for each age group. Some States average the scores for all assessed classrooms; others base ratings on the lowest score. In addition, States that use ERS assessments to assign scores do not use the same quality thresholds. As more States adopt a QRIS as a way to improve the quality of early care and education, it will become important to address some of these inconsistencies. OPRE’s issue brief is available at http://www.acf.hhs.gov/programs/opre/resource/issues-for-the-next-decade-of-quality-rating-and-improvement-systems.

In determining what percentage of classrooms to assess using a classroom quality measurement tool, States have had to balance financial resources with the validity of the assessment. The authors of the classroom measurement tools can advise on the minimum number of classrooms to assess so that the resulting average is an accurate measure of the overall quality of the program. The OPRE publication Best Practices for Conducting Program Observations as Part of Quality Rating and Improvement Systems, available at http://www.acf.hhs.gov/programs/opre/resource/best-practices-for-conducting-program-observations-as-part-of-quality, recommends observing at least one classroom in each age range and observing 50 percent of the classrooms in each program (Hamre and Maxwell, 2011). The authors add that “weighing the costs, it is not recommended that QRIS observe every classroom in programs if the purpose is solely to determine the program’s rating. However, it is clear that observing every classroom may be useful for other purposes such as providing technical assistance” (p. 9).


[1] The ERS for family child care homes was revised in 2007. Some States still refer to the older version, i.e., the Family Day Care Rating Scale (FDCRS).

North Carolina Partners with Other States to Develop New Program Level Measure for QRIS

As part of its Race to the Top Early Learning Challenge (RTT-ELC) grant proposal, North Carolina proposed to develop a valid and reliable program-level measure to evaluate the continuum of early childhood quality. Rather than assessing at the classroom level, the measure will focus on children’s experience in the various settings (centers, homes, schools) at the program level and will draw on multiple sources and types of evidence. Conceptually, the tool will look at the child’s experience in relation to environmental factors, relationship factors, and the interaction between the teacher and the child. The goal is to create a program “portrait” that can serve as the foundation of a plan for continuous quality improvement. Other States participating in the development and/or pilot of this tool include Delaware, Kentucky, and Illinois.

A key step in QRIS design is to examine the current early and school-age care and education landscape and infrastructure to determine how to integrate various functions or subsystems. It is important to identify where there are services already in place that might be expanded or included in the QRIS structure. In most States, there are a host of resources that can be accessed.

  • North Carolina, Tennessee, and Oklahoma, among others, use State licensing staff to gather and validate the information needed to assign a rating.
  • Ohio’s Step Up to Quality program includes dedicated staff in each licensing field office whose sole responsibility is QRIS administration.
  • In Colorado, ratings are conducted by CCR&R staff, who are private sector employees supported by both public and private funding.

In Illinois, assessments are conducted by staff at the McCormick Center for Early Childhood Leadership; scores are sent to the Illinois Network of Child Care Resource and Referral Agencies (INCCRRA), the QRIS application contractor, where they are combined with other criteria and the rating is generated. It is also important to ensure that the assessors conducting the assessments have the appropriate background, credentials, and training related to the age group for each assessment scale. For example, an ITERS-R assessor should have knowledge of infants and toddlers. Likewise, a SACERS assessor should be knowledgeable about the care and education of school-age children.

In staffing a QRIS, it is important to avoid conflicts of interest, real or perceived, that may arise if the same staff play multiple roles. In Pennsylvania, different QRIS staff perform three distinct roles.

  • The STARS Manager or Specialist serves in a supportive role and helps the provider understand the QRIS and its requirements.
  • The STARS Designator, an analysis expert, is responsible for reviewing all sources of evidence for each rating component and assigning the rating.
  • The ERS Assessor is part of a separate team that receives extensive training and maintains high inter-rater reliability.

All three of these staff are employed by the Pennsylvania Keys to Quality Offices. The first two positions are part of the Regional Keys located throughout the State, and the ERS Assessors are employees of the PA Key, located in Harrisburg.

It is also possible to use existing databases and automated systems to help support QRIS functions. In most cases, however, significant modifications, or entirely new systems, will be needed. Pennsylvania's information management system, called PELICAN, integrates data from all early learning subsystems in the State. (See box below for details.) Additionally, several independent contractors have developed new, customizable Web-based data management systems that could augment, or replace, existing State automation.


Alaska Leverages Resources to Support QRIS

In many States it is very difficult to add State staff positions when they are needed to administer a QRIS. When Alaska found itself in that situation, it partnered with thread, its CCR&R, which was able to house the position within the R&R structure. Alaska was able to redirect quality funds to support the QRIS.

Integrated Data Systems in Support of a QRIS: Pennsylvania's PELICAN

PELICAN is an integrated child and early learning system that automates administration and data collection for the statewide QRIS, Keystone STARS, as well as the following additional functions: administration of the subsidized child care program; child care provider inspection and certification (licensing); administration and data collection for the State’s prekindergarten program; and data collection and analytics to support a longitudinal database and tracking system for children in Pennsylvania early learning programs. Users of PELICAN include public and private administrators; early childhood program teachers, directors, and staff (including those in prekindergarten and the State-funded Head Start programs); school districts that provide prekindergarten; parents; and others. Additional information is available in a report by the National Conference of State Legislatures at http://www.ncsl.org/portals/1/documents/Educ/PAEarlyChild-Stedron.pdf and on the PA Key website at http://www.pakeys.org/pages/get.aspx?page=PELICAN

Arizona's Robust and User-Friendly QRIS Data System

Arizona opted to build its own data system, which includes comprehensive information on programs, classrooms, children, staff, registration and licensing status, and assessments. Coaches enter data and upload evidence into the system, and directors update data on their programs. Users of the system have the ability to analyze data and quickly pull reports. The system was developed using an agile methodology, which allows developers to make changes quickly in response to users’ needs.

Programs in Fresno County, CA, Upload Evidence into QRIS Database

In California, the Fresno County QRIS (Fresno County Early Stars) uses Mosaic as their QRIS database. Early Stars programs upload required information into the STARS database, such as lead teacher and director transcripts, copies of degrees and/or permits, and professional development certificates. Early Stars Administrators review and verify all the program data in the STARS database. A rating is generated based upon this information as well as information that is verified during an onsite visit.

Existing and New Resources Fill Gaps in Oklahoma's QRIS

When Oklahoma launched its Reaching for the Stars program in 1998, licensing staff were given the responsibility of both promoting the program with providers and providing ongoing monitoring. Twenty-seven new licensing specialists, a 25 percent increase in staff, were added to reduce caseloads and allow time for this new responsibility. Stars Outreach Specialists and Consultation and Technical Support Specialists were later added to supplement the consultation provided by licensing staff. Child Care and Development Fund-funded partners were asked to make supporting QRIS participation a priority within their service delivery. For example, the Center for Early Childhood Professional Development offered workshops on program assessment and the four ERS scales. It initiated a Director’s Leadership Academy, which addressed QRIS criteria such as policies and procedures, staff development, and staff evaluation. The Scholars for Excellence in Child Care program provides advisors and scholarships to providers participating in the Stars program; child care providers must be employed in a One Star Plus or higher child care facility to participate. The REWARD Oklahoma wage supplement program was created because programs were having difficulty recruiting and retaining the master teachers required by the Stars criteria. Additional information is available at http://www.okdhs.org/programsandservices/cc/stars/.


Monitoring the Rating

The policies and procedures for monitoring the ratings should be clearly articulated to all involved. As providers submit documentation, and QRIS staff conduct interviews, observations, and assessments, it is important that all acceptable “sources of evidence” are consistently defined and interpreted. Whether a State implements a building block approach, a point approach, or a combination of the two, it must have a sound monitoring process in place.

Just as it is important for early and school-age care and education programs to be aware of any benefits for achieving a level, they also need to understand what they must do to maintain a designated level and the consequences for noncompliance. The policy should specify when a reduction of status becomes effective, what the process is to restore a level, and if there are any appeal rights. States have developed administrative policies for situations when a program no longer meets one or more of the standards in its current designation level. The process to be followed for noncompliance should be clearly written and communicated to programs.


Many States include a program improvement plan as part of the QRIS process. Typically based on a provider's self-assessment, observation, or rating, this plan identifies strengths and weaknesses and suggests ways to make improvements. Many QRIS use the results of an assessment tool, like the ERS, as a starting point for developing this plan.

Maryland requires a program improvement plan for programs that are seeking a Check Level 3 rating and have any ERS subscale score below 4.0 on the program's self-assessment; for Check Levels 4 and 5, a program improvement plan is required for any outside ERS assessment with a subscale score below 4.5 or 5.0, respectively. Programs may use a variety of additional tools or assessments to create the improvement plan, such as accreditation self-study or validation results, school-readiness goals and objectives for their jurisdiction, and program-specific goals and objectives for continuous quality improvement.

States might also consider how a facility’s licensing status affects their rating. For example, if a facility is not in compliance with licensing standards, the rating might be reduced, suspended or removed. Several States have developed procedures detailing this process.

Washington has extensive procedures on the effects of licensing status on QRIS participation, both at registration and during participation. For example, if a facility is operating under a probationary license, it has six months to regain full licensure. During this time, it may continue to work with a coach or TA Specialist, but it cannot be evaluated for a rating. If the full license is not reinstated within six months, the facility's participation in Early Achievers will be terminated. State prekindergarten (ECEAP) programs will be required to participate in Early Achievers by 2015 per HB 1723. Washington is currently considering how Early Achievers ratings and assessment data can be used to supplement required monitoring of ECEAP sites/contractors.

Providers may wish to challenge both an assessment score as well as the overall rating assigned to their program. As States gain experience in implementation of QRIS, many are developing guidelines to follow if a program disagrees with its quality rating, although not all have a formal appeals process. Clear communication and training to help providers better understand the rating process may help to reduce the number of appeals.

In Stair Steps to Quality: A Guide for States and Communities Developing Quality Rating Systems for Early Care and Education, Anne Mitchell (2005) makes the following statement about implications of accountability policies:

“A key accountability issue in a quality rating system (QRS) [sic] is the accuracy of quality ratings. A well-designed and implemented accountability system, bolstered by clear communication about the structure and operation of the QRS, should minimize disagreements. A concern that has been raised about rating systems, especially those connected with licensing, is whether rating the quality of programs will result in challenges to ratings and an increase in requests for hearings. Anticipating that some programs may not agree with the rating they receive, an appeals process should be designed in advance. Administrators of statewide QRS report that although quality ratings do change, there are relatively few challenges and little or no increase in hearing requests” (p. 36).

Stair Steps to Quality is available at  http://www.earlychildhoodfinance.org/downloads/2005/MitchStairSteps_2005.pdf.


The guidelines developed by each State vary. In Colorado, a program may initiate a Technical Review of its Qualistar Rating within 30 calendar days of receiving its Qualistar Rating Consultation; it may also initiate a Dispute Resolution Process within the same period. In North Carolina, programs can appeal the evaluation of staff qualifications to the Education Unit, and can appeal ERS assessments first to the assessors at UNC-Greensboro and then to the Office of Administrative Hearings. In Oklahoma, if a program's star level is reduced, it can appeal or propose an alternative settlement, but it cannot reapply for six months if the reduction is due to noncompliance. Wisconsin’s YoungStar Policy Guide stipulates that the “local YoungStar office discuss the rating with the provider before it is published on the YoungStar Public Search website.” In an effort to minimize the number of reconsiderations, YoungStar has established “clear documentation and justification of the rationale for a program’s rating.” Additional information can be accessed at http://dcf.wisconsin.gov/youngstar/pdf/policy_guide.pdf. Most of the guides, workbooks, and toolkits referenced in Section 6: Incentives and Support include information on the appeals process.

In Arkansas, upon receipt of the request for appeal, the Better Beginnings coordinator will conduct an internal review to ensure that the appropriate processes were followed and determine the validity of the original decision. The Better Beginnings coordinator will review the findings with the division director and will transmit the findings of the internal review to the facility within 30 days of the receipt of the request to appeal. If the outcome of the internal review is unsatisfactory to the facility, it has 10 days to ask for further review by the Better Beginnings Appeal Review Committee. Additional information is available in the Better Beginnings Rule Book at http://www.arbetterbeginnings.com/downloads/BB-Rule-Book_060110.pdf.


Tennessee's Appeals Process

Tennessee tries to anticipate situations that may lead to an appeal by making post-assessment calls to all providers participating in the Report Card and Star-Quality Program. These calls, which are handled by CCR&R Specialists, help to keep the number of disagreements low. Following each call, a provider receives a copy of the assessor’s notes and a Profile Sheet that summarizes all of its scores. If there is an issue with the assessment piece, the provider has 20 business days to file an appeal. The Level 1 appeal is handled by the local unit, which works with the CCR&R staff. The Level 2 appeal is conducted by contract staff. If a provider completes both levels of the appeals process and still has an issue, it may then request an administrative hearing. Additional information is available at  http://www.tennessee.gov/humanserv/adfam/ccrcsq.html

Maine's Process for Handling Appeals

In Maine’s Quality for ME QRIS, programs that do not agree with the accuracy of the Department of Health and Human Services’ rating may appeal a decision by requesting an informal review by the Early Childhood Division. If a provider is not satisfied with the result of this informal review, it may request an administrative hearing. The following policy is in place for this purpose:

“A provider must request an informal review and obtain a decision before requesting an administrative hearing. If the provider is dissatisfied with the informal review decision, he or she may write the Commissioner of the Department of Health and Human Services to request a hearing provided he/she does so within thirty (30) calendar days of the date of the Director’s report on the Department’s action. Subsequent appeal proceedings will be limited only to those issues raised during the informal review. The Office of Administrative Hearings shall notify the provider in writing of the date, time and place of the hearing, and shall designate a presiding officer. Providers will be given at least twenty (20) calendar days advance notice of the hearing date. The hearing shall be held in conformity with the Maine Administrative Procedures Act, 5 M.R.S.A. §8001 et seq. and the Administrative Hearings Regulations.”

Additional information is available at www.maine.gov/dhhs/ocfs/ec/occhs/qualityforme.htm.

Vermont's Grievance Process

Applicants or program participants have the right to appeal rejection of their application materials or other adverse decision related to the STARS program, such as the suspension or revocation of a STARS certificate in connection with enforcement of licensing regulations, subsidy regulations or these standards.

Appeals must be in writing and received by the DCF Commissioner within 30 days of the date of rejection or other adverse decision. If the appeal is from a school-operated pre-kindergarten program, the Commissioner of the Department of Education shall join the Commissioner of the Department for Children and Families in deciding the appeal.

The applicant or grievant shall have the opportunity to present the appeal to a STARS grievance committee. The committee shall be appointed by the Commissioner(s), consist of at least three members including one from the regulated provider community, and provide the Commissioner(s) with a recommendation. The Commissioner(s) shall make a final decision on the grievance and provide the grievant with a written decision. The grievant may appeal the final decision of the Commissioner(s) to the Human Services Board within thirty days of the date of the final decision.

Financial incentives shall not be paid while an appeal is pending. If a successful final appeal results in a determination that a STARS program participant is due a financial incentive or maintenance payment, DCF will award payment in full within 60 days.

Additional information is available in Step Ahead Recognition System (STARS) Standards (January 2010), at   http://dcf.vermont.gov/sites/dcf/files/pdf/cdd/stars/Adopted_STARS_rule_FINAL.pdf.

As States are integrating services across systems and aligning program standards in the QRIS, the reduction or loss of rating levels can have a significant financial impact on programs. Examples include:

  • Lack of or reduced access to free or low-cost training opportunities (Teacher Education and Compensation Helps (T.E.A.C.H.) Early Childhood® Project scholarships, training vouchers, Child Development Associate courses, credentialing programs, etc.).
  • Reduction or loss of financial rewards or bonuses for attaining and maintaining higher levels within the QRIS. These awards can be directed to the program or to individual staff within the program.
  • Reduced tiered reimbursement payments for subsidized child care.
  • Limited access to supportive services, such as technical assistance, consultation, and ERS assessments.
  • Inability to market the program at a higher level. This may reduce a program’s ability to remain competitive with other programs and may affect parents’ decisions regarding placement of their children in care.

Any partnering agency or service within the State system that advertises rating levels to the public needs to be notified of rating changes so that parents have access to the most current information. This includes both increases and decreases in levels. Local CCR&Rs commonly maintain and distribute rating information to parents, and their listings must be accurate. If the licensing or subsidy agency is not the same agency that administers the QRIS, each of these agencies will need separate notification. When tiered reimbursement payments are involved, the subsidy agency must be notified as well as the Education Department if prekindergarten programs are rated, or if eligibility for funding depends on a specific quality rating.

Early and school-age care and education providers should be advised not to market themselves incorrectly. Some States supply participating programs with materials, such as banners, window clings, and posters, to use to market their QRIS to parents. If these materials advertise a level that is no longer applicable, they should be changed accordingly.