


Taking Off!
Sharing State-Level
Accountability Strategies

Using Academic and Vocational Accountability
Strategies To Improve Student Achievement

MDS-1206




Mikala L. Rahn
Patricia O'Driscoll

Public Works, Inc.

Phyllis Hudecki
NCRVE

National Center for Research in Vocational Education
Graduate School of Education
University of California at Berkeley
2030 Addison Street, Suite 500
Berkeley, CA 94720-1674


Supported by
The Office of Vocational and Adult Education
U.S. Department of Education

April 1999



FUNDING INFORMATION

Project Title: National Center for Research in Vocational Education
Grant Number: V051A30003-98A/V051A30004-98A
Act under which Funds Administered: Carl D. Perkins Vocational Education Act
P.L. 98-524
Source of Grant: Office of Vocational and Adult Education
U.S. Department of Education
Washington, DC 20202
Grantee: The Regents of the University of California
c/o National Center for Research in Vocational Education
2030 Addison Street, Suite 500
Berkeley, CA 94720
Director: David Stern
Percent of Total Grant Financed by Federal Money: 100%
Dollar Amount of Federal Funds for Grant: $4,500,000
Disclaimer: This publication was prepared pursuant to a grant with the Office of Vocational and Adult Education, U.S. Department of Education. Grantees undertaking such projects under government sponsorship are encouraged to express freely their judgement in professional and technical matters. Points of view or opinions do not, therefore, necessarily represent official U.S. Department of Education position or policy.
Discrimination: Title VI of the Civil Rights Act of 1964 states: "No person in the United States shall, on the ground of race, color, or national origin, be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any program or activity receiving federal financial assistance." Title IX of the Education Amendments of 1972 states: "No person in the United States shall, on the basis of sex, be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any education program or activity receiving federal financial assistance." Therefore, the National Center for Research in Vocational Education project, like every program or activity receiving financial assistance from the U.S. Department of Education, must be operated in compliance with these laws.






SHARING PROMISING PRACTICES

Acknowledgements

      Many people have contributed a great deal to the production of this document and to the continued efforts of the National Center for Research in Vocational Education to provide technical assistance that serves states in improving accountability and student achievement.

      This document was prepared under the guidance of Ron Castaldi, Jackie Friederich, Ricardo Hernandez, Laura Messenger, and Pariece Wilkins of the Office of Vocational and Adult Education, U.S. Department of Education, who have provided valuable support to the production and evolution of this document as it has taken shape during the year. In addition, we have received invaluable advice and leadership from Rodney Kelly of Kentucky, Joanna Kister of Ohio, Chris Lyons of Maine, and Russ McCampbell of Missouri, all state directors of vocational education who generously gave of their time by serving on our advisory council.

      Thanks are also due to Kerri Limbrecht and Stephanie Surles of Public Works, whose help was invaluable to the final production of the document. In addition, Michael Butler's insight and valuable comments helped guide us to our final draft. We would also like to thank Sandy Larimer, Phyllis Plank, Pam Mainland, and Christine Hulbert of NCRVE's Materials Distribution Services for their insight and editing and production support. All design and layout was done by Kurt Rahn, who also helped coordinate printing.

      Most importantly, though, thanks are due in great measure to the many members of state agencies with whom we have spoken and corresponded throughout the year. Without their candidness and willingness to share their state's policies, practices, and lessons learned, this document could not have been prepared. We hope that we have communicated the real challenges of those who continue to work tirelessly at the state level to bring meaningful change to students in the classroom.



Introduction

Accountability

      The centerpiece of standards-based reform and results-oriented accountability systems is a set of standards defining what students should know and be able to do in an articulated kindergarten-through-postsecondary system. Throughout the 1990s, the effort to establish standards evolved from an interest of professional subject-matter associations such as the National Council of Teachers of Mathematics to a matter of high priority in nearly every state (Massell, Kirst, & Hoppe, 1997). By the end of 1998, forty states had developed standards at the secondary level in all core subjects (Education Week, January 11, 1999).

       Having adopted standards for what students should know and be able to do, states are now struggling with how to define and measure performance against these standards. Indeed, reaching consensus on a definition of academic excellence has been difficult within each state, much less nationally. Recognizing that the establishment of standards alone will not improve the education system, states are turning their attention to the development of accountability systems that reward results and provide consequences for low performance. Movement in this direction, in turn, has led to more discussion about the necessary components of accountability, such as standards-driven assessment and curriculum, statewide data systems, public reporting forums, incentives and consequences, and professional development aimed at training practitioners to use educational data for program improvement. This document will highlight promising state efforts in these areas.

      While the steady march towards greater accountability is encouraging, it is important to recognize that states are far from completing their accountability systems. Although many states have "pieces and parts" of an overall system, no state has developed the definitive "accountability blueprint." Moreover, states differ in terms of the approach utilized and priority accorded to accountability. While Texas and North Carolina are touted as being the closest to having all the necessary components of a complete accountability system, the authors of Quality Counts appropriately conclude that "states have completed only the first few miles of a marathon when it comes to holding schools accountable for results." (Education Week, 1999, p. 5)


Systemic Accountability Versus Programmatic Accountability

      Statewide reform efforts focus on improving the delivery of public education from a systemic perspective. By concentrating resources on setting standards and more accurately measuring student performance, states hope to set the stage for performance-based improvement that will produce lasting change in the public education system for all students in elementary, middle and high schools, and postsecondary institutions.

      While states are actively involved in building standards-based accountability systems, accountability efforts at the federal level are also underway. These are generally less comprehensive in approach and are more likely to be related to the tracking of specific programs. In particular, most state and federal legislation includes specific accountability and reporting requirements. Because they are more targeted or categorical in nature, programmatic accountability systems often call for data from specific sub-populations such as limited English proficient students or students receiving Title I funds.

       At times, programmatic reporting requirements may conflict with the goals or spirit of comprehensive statewide efforts. For example, one indicator within a state's comprehensive accountability system may relate to the academic performance of all students in the 3rd, 6th, 8th, and 10th grades. However, federal vocational accountability systems in the Carl D. Perkins Vocational and Technical Education Act of 1998 require a measure of the academic performance of vocational students, preferably in the 12th grade, when a student has completed a vocational program. Similarly, federal programs may request information with the intention of creating a set of indicators that can be reported nationally, rather than as a reflection of the unique circumstances of each state. The result is a patchwork of "systems" at the local and state levels, each moving toward accountability with different data and reporting requirements.


Accountability Standards

      With districts and states experimenting with a wide range of accountability strategies to improve performance, the National Association of State Boards of Education (NASBE) organized the Study Group on Education Accountability. The Study Group examined the field to determine what is and is not working, the dilemmas that policymakers face, and how the various components of accountability fit together to become a coherent system. The effort resulted in a framework of ten action-oriented standards to guide the discussion, design, and evaluation of local and state education accountability systems (NASBE, 1998). These standards and indicators can be found in Table 1. A more complete description of the standards can be found in the Resources section of the document.

Table 1: Public Accountability for Student Success: Standards for Education Accountability Systems

Standard 1:
Accountability Goals and Vision
Legal authorities clearly specify accountability goals and strategies that focus on student academic performance.
Standard 2:
Governing Accountability
At each level of the education system, designated authorities are charged with the efficient governance of the accountability system.
Standard 3:
Responsible Agents
Specific responsibilities for student learning and performance are assigned to designated agents.
Standard 4:
Collecting Performance Information
Accountability is based on accurate measures of agent performance as informed by assessments that are administered equitably to all students.
Standard 5:
Analyzing and Reporting Performance Information
Those responsible for governing accountability regularly report student and school performance information in useful terms and on a timely basis to school staff, students and their families, local and state policymakers, and the news media.
Standard 6:
Incentives and Consequences
Incentives are established that effectively motivate agents to improve student learning. Consequences, which could include rewards, interventions or sanctions, are predictably applied in response to performance results.
Standard 7:
Building Agent Capacity
Agents are provided sufficient support and assistance to ensure they have the capacity necessary to help students achieve high-performance standards.
Standard 8:
Policy Alignment
Policymakers work to ensure that education policies, mandated programs, financial resources, and the accountability system are well-aligned so that consistent messages are communicated about educational goals and priorities.
Standard 9:
Public Understanding and Support
The accountability system has widespread support.
Standard 10:
Partnerships
Various established partnerships work together to support teachers, schools and districts in their efforts to improve student achievement.

      Source: NASBE (1998)

      The action orientation of the NASBE standards goes a long way toward identifying the elements that can be useful in the development of both comprehensive state systems and programmatic systems of accountability. As states become more experienced with the components of accountability and the federal government becomes more familiar with how these systems can be used for collection of nationally relevant data, state and programmatic accountability systems will begin to merge. However, because certain programs have particular data requirements, they will never match completely. For example, the accountability system needs for Head Start or state childcare programs will never match Perkins III or state vocational requirements. Nonetheless, it is hoped that the current patchwork of data collection and reporting will move closer toward an articulated and better-coordinated system of accountability.


Accountability in Vocational Education

      In an effort to ensure that states focus on the quality of vocational education programs, Perkins II, the 1990 Carl D. Perkins Vocational and Applied Technology Education Act, required states to set up accountability systems for local vocational programs. As they grappled with new federal mandates, state-level efforts centered on meeting the law's requirements. At a minimum, they had to develop two accountability measures, one of which had to be an indicator of learning and competency gains, including student achievement of basic or more advanced academic skills. The other measure needed to be one of the following: [1] competency attainment; [2] job or work skill attainment or enhancement; [3] retention in school or completion of secondary school or its equivalent; or [4] placement into additional training or education, military service, or employment.

      In spite of some initial trepidation about the accountability requirements in Perkins II, each state developed its own system of performance measures and standards and devised a plan for implementation. At the secondary and postsecondary levels, all but two states went well beyond the requirements of Perkins II and included more than two performance measures and standards. In fact, most states included three to ten measures. (Hoachlander & Rahn, 1992)

      Rather than building a data collection system capable of producing nationally comparable data, Perkins II intended for states to provide educators at the local and state levels with data they could use to improve their own vocational education programs and courses; therefore, states built accountability systems suited to their own history and culture. In more traditionally decentralized states, local education agencies were allowed to use their own assessments to measure academic gains and to develop their own instruments on other measures. Local control resulted in more buy-in at the local level but no comparability and little standardization statewide. In more centralized states, statewide assessments were used to move local education agencies toward standardized systems based on a common set of data. While this led to more comparability across local education agencies, these efforts were often perceived at the local level as less useful to program improvement. (Stecher et al., 1997)

      In states that integrated federal legislative requirements with their own initiatives for improving vocational education, the emerging accountability systems tended to evolve over time. In fact, in a few states, vocational accountability is beginning to be integrated into the overall statewide education accountability system; however, in states with more of a "compliance" mindset, the accountability systems evolved very little over time. Administrators in these states have tended to wait for new federal legislation to mandate change before proceeding further in terms of accountability.

      Despite progress, however, states are far from having complete accountability systems for vocational education in place. Integrating vocational education in an overall reform agenda continues to be a challenge. Most state administrators realize that the demand for accountability from the public, business, politicians, and others is here to stay. Therefore, they are working hard to figure out what data is reasonable to collect and how it can be used to improve reporting about students and programs. The concerns raised are no longer "Why do we have to measure academic skills?" but "How do we best measure academic and occupational skills and assist local educators and administrators in their use of data for program improvement?"

      With the recent passage of Perkins III, learning from the past is more important than ever. Perkins III requires the implementation of a similar accountability system; however, it adds the potential for higher stakes with the introduction of incentive performance funds. Through Perkins III and the Workforce Investment Act, states exceeding the levels of performance for core indicators established for workforce agencies, adult education, and vocational education are eligible for incentive grants.

      Perkins II was in many ways the first step toward accountability--requiring the reporting of performance data. Perkins III intends to complete the accountability cycle with rewards and consequences based on performance. Perkins III accountability requirements are described in Table 2.

Table 2: Perkins III Accountability Requirements

      At a minimum, Perkins III requires measures of each of the following core indicators of performance:
      (1) Student attainment of challenging state-established academic, vocational and technical, and skill proficiencies.
      (2) Student attainment of a secondary school diploma or its recognized equivalent, proficiency credentials in conjunction with a secondary school diploma, or a postsecondary degree or credential.
      (3) Placement in, retention in, and completion of, postsecondary education or advanced training, placement in military service, or placement or retention in employment.
      (4) Student participation in and completion of vocational and technical education programs that lead to nontraditional training and employment.


Vocational Education: Pulled in Two Directions

      In most states, vocational education is trapped between two often-conflicting ends of a spectrum: workforce development and school reform. On the one hand, vocational educators are attempting to be included in (and in a few cases to lead) mainstream school reform through contextual learning, applied methodology, and integrated curriculum. Seen this way, vocational education is a vehicle for thematic instruction that improves the knowledge and skills of all students and better prepares them for both work and postsecondary education.

      On the other hand, vocational educators also work within the framework of job training and welfare-to-work initiatives designed to help students obtain specific skills for entry-level work and future jobs. This training is more specific in focus and often shorter in term. The primary goal under this purpose is to get the learner a job, ideally one that leads to progressively better jobs. Typically, this type of vocational education includes the non-college bound, or high school students who do not wish to go immediately to college but instead enter the workforce. This type of vocational education also serves displaced homemakers, dislocated workers, welfare recipients, and others whose first priority is to earn a living wage or a better wage.

       Table 3 suggests differences that influence vocational education's tension in purpose. It includes legislation connections, vehicles and delivery systems, populations served, and skills taught. The tension between school reform and workforce development should be viewed as a continuum with some states on one end or the other, and others somewhere in the middle.

       The tension between these two purposes is real and quite manifest in some parts of the country. To cite just one example, the purpose of, and the rhetoric surrounding, vocational education differ substantially in California compared to Arkansas. In California, secondary vocational education is one area within the state's education department. There is no longer a unit called "vocational education." Instead, vocational education is included as an area of emphasis within the High School Division. This division receives no JTPA or other workforce development funds. As such, the major purpose of vocational education at the secondary level is school reform that draws on Career Academies and Tech Prep education as primary vehicles for integrating vocational education into mainstream school reform.

       By contrast, vocational education has recently been consolidated into the newly developed Workforce Development Division in Arkansas. This new division houses both secondary and postsecondary vocational education, including all Perkins funds and 8% JTPA funds. Arkansas soon hopes to file a consolidated State Plan for Perkins III and the Workforce Investment Act.

Table 3: Vocational Education Pulled in Two Directions

      School Reform
Examples of Legislation:
  • Goals 2000
  • Improving America's Schools Act (IASA)
  • Eisenhower
  • Perkins III
Vehicles:
  • Elementary, middle, high schools, & community colleges
  • Academies, tech prep
  • Career clusters or magnet schools
  • Interdisciplinary teams
Populations Served:
  • Elementary through Postsecondary students
Skills Taught:
  • Rigorous academic standards
  • Work-readiness skills (such as the SCANS competencies)
  • Cluster or career major skills


      Workforce Development
Examples of Legislation:
  • Temporary Assistance for Needy Families (TANF)
  • Workforce Investment Act (formerly JTPA)
  • Perkins III
Vehicles:
  • Vocational programs within high schools & community colleges
  • JTPA programs
  • One-stop employment centers
  • Vocational centers/adult training
  • Customized training or private sector training
Populations Served:
  • Students in their final two years of high school
  • Community college students
  • JTPA adult and youth participants
  • Welfare recipients
  • Dislocated workers, displaced homemakers, and so on
Skills Taught:
  • Basic academic skills
  • Work-readiness skills (such as the SCANS competencies)
  • Job-specific skills

       While the degree of tension varies by state and by level (secondary and postsecondary), in most states it is concentrated at the secondary level. This tension often complicates data collection and accountability systems, especially when measuring academic and occupational attainment. For example, a secondary-level vocational education system primarily focused on school reform might measure academic achievement, dropout rates, and indicators such as student eligibility for postsecondary education. A postsecondary-level system centered on workforce development may primarily focus on increases in wage rates, placement and retention, and improved economic indicators. Of course, an accountability system could measure the entire continuum. The questions are "What are the most important indicators in the overall system?" and "What should the system be held accountable for?"

       Table 4 provides examples of potential indicators along the school reform and workforce development continuum. The measures (academic tests, placement) embedded in the indicators are aligned with the intended purpose. The table should be viewed as a continuum, with some states on one end or the other and others somewhere in the middle.

       An element that adds more complexity to the measurement challenge discussed above is the governance or delivery system of the state. In cases where the purpose of vocational education has been clearly established and the indicators defined, another challenge remains: at what level will the indicators be implemented and the data collected? For example, Perkins II required that states measure academic gains. In centralized states, the indicator was measured through statewide, standardized tests. This ensured a comparable measure across the state. In decentralized or local control states, local recipients were allowed to select their own measure of academic gains, with results forwarded to the state level. No statewide comparable data was ever developed.

       Moreover, a state may combine centralized and decentralized elements in its accountability system. For example, a state may have a centralized method for collecting academic performance, but only local methods for job placement. Alternatively, a state may have both a centralized and decentralized strategy on one indicator. For example, Oklahoma has a centralized occupational testing program that includes a multiple-choice test for each occupational area. The performance assessment component is locally developed and scored prior to administering the multiple-choice test. (Stecher et al., 1997) Table 5 describes how the same indicator would be operationalized in a centralized versus decentralized system.

Table 4: Purpose Driven Example Indicators

School Reform
  • % of students scoring at X level on X academic test
  • % of students acquiring certificate of initial mastery
  • % of students enrolling in college prep coursework
  • % of students successfully completing university requirements
  • Decrease in high school dropouts
  • % of students completing high school entering postsecondary institutions
  • % of students attaining SCANS competencies

Workforce Development
  • % of students attaining SCANS competencies
  • % of students passing certification/licensure examinations
  • % of students acquiring certificate of advanced mastery
  • % of students mastering 80% of occupational competencies
  • % of students placed in a job related to training
  • % of students retained in job after six months
  • % of students earning a living wage

      Although multiple combinations of centralized and decentralized elements are possible, the trend across states seems to be one of centralization. Without comparable measures, it is very difficult to offer statewide incentives and consequences in the accountability system. Without centralized measures, it is difficult to manage the information at the state level, and impossible nationally. However, with this drive toward centralization, defining the purpose of vocational education becomes more important than ever. What are the appropriate indicators and accountability strategies to be included in education systems, and, in particular, vocational education? How will success be defined? How will practitioners be supported in order to attain success as defined?

Table 5: Example of System Delivery Influences on Accountability Measures

School Reform

Decentralized measures:
  • Academic tests selected or developed at local level
  • Local transcript analysis with results reported to state
  • Dropout defined differently at each local district; reported to state
  • Locally reported from postsecondary institutions to state

Indicators:
  • % of students scoring at X level on X academic test
  • % of students enrolling and completing college prep coursework
  • Decrease in high school dropouts
  • % of students completing high school entering postsecondary institutions

Centralized measures:
  • Statewide standardized academic test
  • Electronic transcript data at state level
  • State definition; with state centralized database
  • State matched database of secondary and postsecondary records

Workforce Development

Decentralized measures:
  • Locally implemented checklists
  • Local student or employer survey

Indicators:
  • % of students mastering 80% of occupational competencies
  • % of students placed in a job related to training
  • % of students retained in job after six months
  • % of students earning a living wage

Centralized measures:
  • Statewide vocational testing system
  • Unemployment Insurance data matched to program completers with social security #
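
      The centralized measures in Table 5 ultimately rest on routine record matching. As a rough illustration only, the sketch below shows how a state might match vocational program completers to Unemployment Insurance wage records by social security number to estimate the placement and retention indicators listed above; the record layouts, quarter coding, and the two-quarter retention window are assumptions made for the example, not any state's actual specification.

```python
# Hypothetical sketch of the "UI wage-record match" in Table 5: program
# completers are matched to Unemployment Insurance wage records by social
# security number to estimate placement and six-month retention rates.
# Field names and the retention window are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Completer:
    ssn: str
    program: str
    completion_quarter: int   # YYYYQ code, e.g., 19992 = 2nd quarter of 1999

@dataclass
class WageRecord:
    ssn: str
    quarter: int              # YYYYQ code
    wages: float

def next_quarter(q: int, steps: int = 1) -> int:
    """Advance a YYYYQ-style quarter code by the given number of quarters."""
    year, qtr = divmod(q, 10)
    qtr += steps
    year += (qtr - 1) // 4
    qtr = (qtr - 1) % 4 + 1
    return year * 10 + qtr

def placement_and_retention(completers, wage_records):
    """Return (% placed in the quarter after completion,
    % of all completers still showing wages two quarters later)."""
    quarters_with_wages = {}
    for w in wage_records:
        quarters_with_wages.setdefault(w.ssn, set()).add(w.quarter)

    placed = retained = 0
    for c in completers:
        worked = quarters_with_wages.get(c.ssn, set())
        first = next_quarter(c.completion_quarter, 1)
        later = next_quarter(c.completion_quarter, 3)   # roughly six months after placement
        if first in worked:
            placed += 1
            if later in worked:
                retained += 1

    n = len(completers)
    if n == 0:
        return 0.0, 0.0
    return 100.0 * placed / n, 100.0 * retained / n
```

      Even in this simplified form, the match illustrates a known limitation of the centralized approach: completers who work out of state, are self-employed, or enter military service never appear in the state's UI wage file and are counted as not placed unless the match is supplemented by surveys or interstate data sharing.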


Sharing Practices To Improve Accountability

       We recognize that no one has all of the answers to the issues raised in this introduction. Most of the measurement issues related to the conflicting purposes of vocational education (school reform versus workforce development) and governance/system delivery (centralized versus decentralized) are quite complex. All states are working on parts of the system or systems. It is hoped that the parts eventually sum to a whole.

       Earlier in this document, we provided NASBE's ten action-oriented standards, which are intended to guide the development of overall state accountability efforts. The case studies provided in this document describe promising practices or particular strategies within an accountability system that have operationalized some of the NASBE standards. The necessary components described in this report include the following areas:




      Within these areas, states have worked hard to develop particular components of their systems. It is hoped that this document provides a vehicle to share some of the lessons learned related to the varying strategies employed from state to state. The sharing of practices across the states may be helpful to those willing to take the time to analyze and reflect on others' efforts. In some cases, a strategy from another state could be adopted. In most cases, the lessons learned would need to be adapted to particular state conditions. At other times, a strategy may not be a viable option to either adopt or adapt. In these cases, learning from the challenges faced by others may help the reader see his or her own situation in a different light.

      This document is an evolving publication aimed primarily at state administrators and state policymakers. We hope to collect additional strategies from states over time and mail out periodic updates (see Resources section). The "answers" to complex accountability questions are years down the road. As states continue to experiment and take test flights, we will need to observe, question, and incorporate new ideas into our own practice. Hopefully, this document will offer one more tool to help state administrators fine-tune their flight plan under ever-changing conditions.

      Buckle up, keep your seatbelts fastened, and enjoy the flight . . .


References

Hoachlander, E. G., & Rahn, M. L. (1992). Performance measures and standards for vocational education: 1991 survey results (MDS-388). Berkeley: National Center for Research in Vocational Education, University of California, Berkeley.

Massell, D., Kirst, M., & Hoppe, M. (1997, March). Persistence and change: Standards-based systemic reform in nine states (CPRE Policy Brief RB-21). Philadelphia: Graduate School of Education, University of Pennsylvania.

National Association of State Boards of Education (NASBE). (1998, October). Public accountability for student success, standards for education accountability (The Report of the NASBE Study Group on Education Accountability). Alexandria, VA: Author.

Quality Counts '99: Rewarding results, punishing failure. (1999, January 11). Education Week (Entire issue), 18(17).

Stecher, B. M., Hanser, L. M., Hallmark, B., Rahn, M. L., Levesque, K., Hoachlander, E. G., Emanuel, D., & Klein, S. G. (1995). Improving Perkins II performance measures and standards: Lessons learned from early implementers in four states (MDS-732). Berkeley: National Center for Research in Vocational Education, University of California, Berkeley.

Stecher, B. M., Rahn, M. L., Ruby, A., Alt, M., & Robyn, A. (1997). Using alternative assessments in vocational education (MDS-947-NCRVE/UCB). Santa Monica, CA: RAND, National Center for Research in Vocational Education.



SETTING STANDARDS

      The standard-setting process is widely acknowledged as the cornerstone of recent education reform efforts to guarantee educational accountability based on student performance at the individual, school, district, and state levels. In an effort to clarify what is expected of students, states across the nation, professional subject matter organizations, and industry-based partnerships have set about defining the standards, or skills and knowledge, required of students. Standards are the first step toward developing an integrated system of accountability that is built on student performance. Standards define an aspect of student performance that is measurable, and build on the skills that are learned as students progress through each stage of the educational system and into the workplace. This section includes descriptions of the implementation of standard-setting efforts in vocational education programs, state systemic accountability efforts, and industry-based partnerships. Standard-setting efforts included in this section describe implementation strategies in one or more of the following categories of learning:




Illinois Occupational Skills Standards


Building a State-Endorsed Occupational Credentialing System

      Policy Rationale and Goals: In 1992, the Illinois legislature passed the Occupational Skill Standards Act (Public Act 87-1210), which established the Illinois Occupational Skill Standards and Credentialing Council (IOSSCC), a nine-member panel composed of representatives from business, industry, and labor. The lead agency designated to organize and staff the IOSSCC was the Illinois State Board of Education, with five members of the IOSSCC appointed by the Governor and four by the State Superintendent of Education.

      According to Kathy Nicholson-Tosh, Skill Standards project manager, from the beginning, Illinois wanted the standard-setting process to be a "business and industry-led venture." It "took time for appointments to be made and for the group to get together to begin to build the policies and infrastructure" that would frame the state's subsequent effort. The IOSSCC was first convened by the Illinois State Board of Education on January 31, 1994; much of the time in the initial meetings was spent establishing a common vision and mission for the IOSSCC's role in workforce development:

The vision of the IOSSCC is to have a statewide system of industry-defined and recognized skill standards and credentials for all major skill occupations that provide strong employment and earnings opportunities in Illinois. To achieve this vision, IOSSCC will play a major role in establishing and marketing these systems in the private sector for use in hiring, training, and promotion. IOSSCC will establish industry subcouncils to identify skilled occupations and recognize or develop skill standards and credentialing systems based on a common set of policies and procedures. The subcouncils will be responsible for developing marketing and promotion plans and developing or identifying information to be used in career information systems. (IOSSCC, 1997, p.4)

      Essentially, the IOSSCC concluded that its role would be to serve three primary purposes:

  1. Recognition and development of skill standards and credentialing systems
  2. Marketing and promotion of their use in the private sector
  3. Working with state councils and agencies to promote the application of standards and credentials in all approved and funded workforce development programs

      Implementation Strategy: According to Nicholson-Tosh, the IOSSCC spent many months after the initial appointment of members defining an implementation strategy in keeping with the vision and mission of the IOSSCC. The IOSSCC "deliberated extensively as new territory was being covered--when the first policies were passed, everyone cheered." The IOSSCC decided that a subcouncil structure would best serve the interests of all economic sectors in the state. The state's economy was grouped into fourteen categories, which largely align with the categories developed by the National Skill Standards Board, though there is some variation. The categories include the following:

    1) Agriculture and Natural Resources
    2) Construction
    3) Energy and Utilities
    4) Manufacturing
    5) Communications
    6) Transportation, Distribution, and Logistics
    7) Health and Social Services
    8) Educational Services
    9) Hospitality
    10) Marketing and Retail Trade
    11) Business and Administrative/Information Services
    12) Applied Science and Engineering
    13) Financial Services
    14) Legal and Protective Services

      Of the fourteen subcouncils planned by the IOSSCC, nine have been appointed. According to Nicholson-Tosh, the IOSSCC, in consultation with the Illinois Occupational Information and Coordinating Committee, "spent a good deal of time grouping industries and occupations to determine appropriate representation on the subcouncils." The IOSSCC's 1997 Progress Report states that "Each subcouncil consists of 15 to 25 employer and worker representatives who are nominated from major industry, trade, and professional associations and labor unions from all segments of the industry and regions of the state. In addition, a representative from the secondary and postsecondary workforce preparation system serves on the subcouncil." (p.6) The primary responsibilities of the subcouncils are identifying the occupations and occupational clusters that will be included in the system, marketing and promoting the state-endorsed skill standards, recommending skill standards and credentialing systems to the IOSSCC for final approval, and establishing standards development committees for an identified occupation.

      Standards development committees, appointed by the subcouncils, are responsible for the actual production of the skill standards and credentialing and assessment systems. The committees include 10 to 12 employer and worker representatives who are experts in the occupation or occupational cluster. These committees also identify related academic skills and recommend endorsement of the standards and credentialing system to the industry subcouncil.

      Evolution of Strategy: According to Nicholson-Tosh, from the beginning, the IOSSCC has "been concerned that standards be specific enough for curriculum and assessment." Therefore, the IOSSCC established criteria in its review process for applications from subcouncils for skill standards and credentialing systems endorsement:

      Outcomes/Lessons Learned: To date, the IOSSCC has endorsed 15 industry skill standards products. While the IOSSCC believes that it has "the skill standards development process down," assessment and credentialing "are the current focus," according to Nicholson-Tosh. The IOSSCC has learned that it takes "time for good reform to happen. The nine members appointed to the IOSSCC are involved on a voluntary basis and are very committed to the effort. They are not a rubber stamp group." It is this kind of commitment by the IOSSCC that encourages efforts by state agencies to continue to coordinate activities. Every other month, representatives from education and workforce development agencies "all sit at the table and plan together, providing input to what the IOSSCC will work on."

      Illinois is observing how industry skill standards and assessment systems are evolving in other states and is participating in the National Skill Standards Board "Building Linkages Project" in the manufacturing sector. Illinois is working closely with Indiana, the lead state in the NSSB manufacturing effort, and is considering modeling some aspects of its credentialing system after Indiana's scenario-based performance assessments.


Reference

Illinois Occupational Skill Standards and Credentialing Council (IOSSCC). (1997). Progress report, January 1994-July 1997, Illinois skills standards. Springfield, IL: Author.




New Jersey's Cross-Content Workplace Readiness Standards

Integrating Subject-Specific and Employability Skills in State Standards and Assessment Reform

      Policy Rationale and Goals: In May 1996, the New Jersey State Board of Education adopted Core Curriculum Content Standards to form the basis for changing the delivery of curriculum, instruction, and assessment in New Jersey schools. According to the New Jersey Department of Education, "These standards and indicators define the knowledge and skills that all students are expected to acquire by the completion of their thirteen year public school education." (1998, p.1) Core Curriculum Content Standards have been adopted by the state Board in the following seven academic areas: [1] visual and performing arts, [2] comprehensive health and physical education, [3] language arts literacy, [4] mathematics, [5] science, [6] social studies, and [7] world languages.

      In addition to the subject-specific standards, New Jersey adopted standards modeled in part after the employability skills identified by the 1992 Secretary's Commission on Achieving Necessary Skills (SCANS) report. While the subject-specific standards committees were in the process of reviewing proposed content standards, "certain themes reoccurred. These common themes reinforce[d] the notion that each content area draws on key elements of other content areas. For example, the need for students to learn problem-solving and critical thinking skills was reflected in all of the sets of standards." (1996, p.1)

      In order to include these kinds of skills and knowledge in the changes to the state's instructional and assessment system, the New Jersey Board of Education also adopted Cross-Content Workplace Readiness Standards. These standards reflect the goals of both educating students in subject-specific knowledge and, at the same time, emphasizing the kinds of skills that more broadly reflect the competencies all students will need in order to succeed in society and the workplace. According to the Cross-Content Workplace Readiness Standards, all students in New Jersey will be able to accomplish the following:

      Unlike the subject-specific standards, the Cross-Content Workplace Readiness Standards are not broken out by grade level. Rather, New Jersey intends for schools to begin to incorporate these concepts in their programs at all grade levels with age-appropriate activities such as "positive work habits" at the K-4 level or "preparing a résumé" in high school.

      Implementation Strategy: The New Jersey Department of Education is currently in the process of proposing the administrative code that, when adopted, will introduce a wide range of changes to the state's educational system. The goal outlined in the administrative code proposed in May 1998 is to phase in a new assessment system that will be fully functional by the time students graduate in 2006.

      The administrative code under discussion and public review proposes a series of changes in the statewide assessment system and provides guidelines for the integration of subject-specific and Cross-Content Workplace Readiness Standards into what New Jersey terms a student's "career education."

      Proposed Changes to the Assessment System: Students will be assessed with a new set of tests aligned to the state standards at grades 4, 8, and 11-12. New Jersey will no longer require boards of education and charter schools to administer commercial achievement tests "annually or at any particular grade level." The system will "be the assessment of record" and will combine state-administered paper-and-pencil assessments that include multiple-choice, open-ended response, and short-answer items. In addition, the New Jersey Department of Education will "develop and administer performance assessments collaboratively with districts and charter schools. These portions of the tests will be scored locally.... The department will provide... guidance and scoring rubrics to ensure measurement consistency across the state." (1998, p.3)

      According to the proposed code, the revised assessment system will have multiple purposes. Results will be given to parents and students to provide individual information about student progress. Assessment results will also be used by districts and charter schools to gauge "the alignment of their curriculum and instructional strategies with the Core Curriculum Content Standards and cumulative progress indicators." In turn, the New Jersey Department of Education will use the assessment system to certify that schools and districts are "providing a thorough and efficient education for all students." (1998, p.3) Results will also be published in an annual report card providing information about educational trends to the state's taxpayers.

      Proposed Career Education: According to the proposed administrative code, students will receive career awareness in kindergarten through grade 4, career exploration in grades 5 through 8, and career preparation in grades 9 through 12. Students will begin to develop individual career plans during the career exploration phase and will identify a career major before the 11th grade: "The career majors are broad clusters of programs of study that will show the relevance of the Core Curriculum Content Standards to the student's career plans." (1998, p.3) Students will also be required to complete a "structured learning experience," which includes volunteer activities, community service, paid or unpaid employment, or participation in a school-based enterprise. The structured learning experience will be documented through a contract, signed by the student, the parents, and the employer, that describes the learning expectations, terms, and conditions of the experience.

      Evolution of Strategy: The New Jersey Department of Education's current proposal for the education system grew out of recent efforts to build consensus among the governor's office, the legislature, education and workforce training agencies, business and industry, parents, and local education systems. In addition, New Jersey was one of the first eight states to receive federal funding under the School-to-Work Opportunities Act (STWOA) and had already begun to approach education reform from a systemic perspective in order to meet the state's critical need for a well-prepared workforce. According to Martha Huleatte of the Office of School-to-Career and College Initiatives, when New Jersey first submitted its request for STWOA funds, the state had already "gone to a state employment and training commission," developing a common plan for workforce development involving education, labor, human services, and commerce departments. Through the adoption of both content-specific and Cross-Content Workplace Readiness Standards, New Jersey aims to continue to "embody school-to-work elements" in state reforms. According to Huleatte, to further the integration of school-to-work elements and improve the preparation of all students, the Department of Education recognized that "education is not just for the college-bound" and is spreading that message throughout the proposed changes to the education code.

      Evaluation Strategies: Over time and through the development of a state assessment system built around and aligned to content-specific and Cross-Content Workplace Readiness Standards, New Jersey intends to respond to the concerns of local educators regarding the implementation of the significant changes that have been proposed to the educational delivery system. According to Huleatte, in fact, support for these changes continues to be heard throughout the state. The "business community is greatly supportive [of the changes] and [is] becoming more vocal." Through a state Chamber of Commerce initiative called "School Counts," New Jersey employers are encouraged to ask for high school transcripts, thus supporting the state's academic and employability efforts. In addition, New Jersey has launched a public information campaign to encourage parents to become aware of the changes taking place.


References

New Jersey Department of Education. (1998, May 6). Proposed code for standards and assessment for student achievement. Available on-line: <http://www.state.nj.us/njded/proposed/standards.toc.htm>.

New Jersey Department of Education. (1996). Cross-content workplace readiness standards. Available on-line: <http://www.state.nj.us/njded/ccs/05cwrready.html>.




Oregon's Certification of Advanced Mastery

Using Employability Skill Standards and Career-Related Areas of Study

      Policy Rationale and Goals: Since the early 1990s, state-level education reform efforts in Oregon have centered on raising standards for student performance through a state-driven system of student skill certification that embraces the notion of the "new basics," including higher academic standards and employability skills such as communication and problem-solving. Oregon first introduced major educational change along these lines through legislation in 1991, and then again through revisions to the law passed in 1995:

Oregon expects more of its students. It expects students to master not only the traditional basics--reading, writing, and arithmetic--but also the new basics necessary for success in the next century--advanced mathematics, advanced science and the ability to apply academic knowledge in practical ways. The Oregon Educational Act for the 21st Century, passed by the state legislature in 1991, calls on schools to hold students accountable for higher academic standards. Students will demonstrate what they know and can do through complex assignments and periodic tests. Students who achieve the high standards will receive certificates certifying their abilities. (ODE, 1998b, p.1)

      Implementation Strategy: In 1991, the Legislature passed the Oregon Educational Act for the 21st Century, a sweeping initiative to reform the state's public education system. As part of this legislation, Oregon adopted certificates of "initial" and "advanced" mastery: the CIM and the CAM. Many of the provisions in the law were modeled after recommendations in the 1990 report of the National Center on Education and the Economy (NCEE), America's Choice: High Skills or Low Wages!, warning that schools must raise academic standards for all students for the United States to prosper in an increasingly competitive global economy. The CIM and the CAM were introduced as the state's framework for a standards-based, performance-driven student assessment and certification system. The CIM, awarded at approximately age 16, would be followed by the CAM at around age 18. All standards, including CIM, CAM, and PASS -- the state's new performance-based admission standards for the Oregon University System -- are aligned.[1]

      In 1995, "mounting fiscal pressures and opposition from parents" prompted Oregon legislators "to take another look at the state's comprehensive school-reform act--including its pioneering adoption of certificates of mastery." (Sommerfeld, 1995a) In response to criticism from the public that the standards for the certificates were unclear and fears that local diplomas would be replaced by the state CIM and CAM, the Oregon legislature passed another piece of legislation clarifying many of the requirements of the original act, including the option that students seeking a CAM would have access to both college-preparatory classes and vocational and technical training, "something the critics had said the act called into question." (Sommerfeld, 1995b)

      According to Holly Miles, coordinator of Data and Grants Management for the Oregon State Department of Education, the changes in 1995 attempted to address the concerns that the state reforms were driven by business needs, not by parents. The "original legislation did not use 'educational' terminology and in 1995, the standards were recast in traditional academic terms (i.e., communication vs. English, computation vs. math) and a new timeline was put in place. Now we are dealing with learning and unlearning the new vs. the old CAM (and CIM)." The Oregon Department of Education (ODE) has written the administrative rules for the 1995 legislation, for release in December 1998. Both the CIM and the CAM will be awarded to students based on a system of combined local and state assessments tied to state-adopted standards.

      The CAM builds on the CIM and is focused on the 12th grade benchmarks and career-related learning standards which were adopted by the State Board of Education on December 19, 1996. (Oregon Department of Education, 1998a) The CAM "program components for schools" were approved by the State Board of Education on June 17, 1998. For students to earn a CAM, they must achieve the following:

      Schools are required to provide students with contextual learning experiences within an endorsement area. The state has adopted six endorsement areas -- arts & communication, business & management, human resources, health services, industrial & engineering systems, natural resource systems -- but districts can also develop their own.

      The 1995 revisions to the state's reform agenda addressed parental concerns by clarifying that districts could continue to offer a traditional diploma. While districts are mandated to offer the CIM and the CAM, the certificates themselves will be optional. It is Oregon's hope that these policies continue to drive statewide change while continuing to address the concerns of the public. According to Miles, the state intends the CIM to drive higher academic standards and the CAM to drive the adoption of instructional strategies that emphasize contextual learning through the required "endorsement areas," "career-related standards," and "career-related learning experiences."

      Evolution of Strategy: The state's implementation strategy has evolved over time. Oregon is finding that it is critical for its administrative rules to be written so that the needs of all students are met and so that students have the flexibility to switch easily between endorsement areas. According to Miles, the state has spent a great deal of time working on adaptive assessments for special populations and has embraced a collaborative process, which is typical of Oregon: members of the public, the business community, administrators, and teachers have all been involved in developing the standards and assessments. The state is also proposing that some schools be considered "incentive schools" to try different models for implementing all aspects of the CAM.

      In 1997, the legislature responded to ODE research that indicated that schools would need more time to implement the CAM and extended the time frame for full implementation to 2004-2005. In addition, the legislation now requires districts to show progress toward CAM implementation each year.

      Most local districts are currently focused on implementing the CIM; however, the state is continuing with its work so that the CAM can be up and running by the time districts are ready to develop the "program components" that must be in place for students to be certified with a CAM.

      Outcomes/Lessons Learned: For Oregon, research conducted in cooperation with local districts has been essential to implementation of the CIM and the CAM. In the summer of 1996, ODE selected 15 small districts to assess the difficulties they would face in implementing CAM endorsement areas. Teams from ODE facilitated a planning process with representative groups from each area, including businesses, parents, teachers, and administrators. These groups were asked what they would need to implement the CAM and what they thought they could offer given the timeline for implementation. The research indicated that districts would need five years to prepare, which was one factor prompting the legislature to change the time frame. In addition, the technical assistance provided by the state teams encouraged several of the local school districts to continue with the planning process without state intervention.

      Though Oregon has been through substantial revisions to the original legislation envisioned in 1991, the changes are integral to the political process and are in direct response to the concerns of the public. ODE is moving forward with implementation with this long-term view in mind. As the first draft of the administrative rules for the CAM goes out for public comment, the tradition of openness and close public scrutiny will continue. In preparation for the public hearings, ODE conducted fifteen public information meetings. Following those public hearings, the final rules will be adopted.


References

Commission on the Skills of the American Workforce. (1990) America's choice: High skills or low wages! Rochester, NY: National Center on Education and the Economy.

Oregon Department of Education (ODE). (1998a). Questions and answers about the Certificate of Advanced Mastery. Available on-line: <http://www.ode.state.or.us/>.

ODE. (1998b). Raising expectations. Available on-line: <http://www.ode.state.or.us/cifs/cimcam.htm>.

Sommerfeld, M. (1995a, March 22). Pioneering reform act under attack in Oregon. Education Week on the WEB. Available on-line: <http://www.edweek.org/>.

Sommerfeld, M. (1995b, June 14). Oregon lawmakers retain master certificates. Education Week on the WEB. Available on-line: <http://www.edweek.org/>.




Ohio's Curriculum Framework System

Using Integrated Technical and Academic Competency Standards

      Policy Rationale and Goals: Beginning in 1990, Ohio developed Occupational Competency Analysis Profiles (OCAPs) for use as the basis of occupational training curriculum in both secondary and adult education programs. Business and industry panels developed OCAPs for approximately 60 occupational areas. Each OCAP includes separate lists of occupational, employability, and academic competencies. Assessments for each OCAP have also been developed. The OCAP system has been in place for several years and is implemented on a statewide basis. In 1998, however, Ohio began the process of shifting from OCAPs, which separate occupational, employability, and academic competencies, to a model integrating technical and academic student competencies. In the future, Integrated Technical and Academic Competencies (ITACs) will serve as the curricular framework for career-focused education in the state.

      Changes in the economy that demanded increased levels of skill and knowledge, together with specific changes in Ohio educational law and policy, were the major forces driving the change from OCAPs to ITACs. First, Senate Bill 55 increased the number of credits required for graduation and moved graduation test requirements from competencies at the end of the 8th grade to competencies at the end of the 10th grade. Second, "Ohio's Future at Work: Beyond 2000" redirected Ohio's vocational education program from a goal of preparing for entry-level jobs to preparation for work and continuing education. The rationale for ITACs is further supported by the requirement of the Carl D. Perkins Vocational and Technical Education Act of 1998 (Perkins III) that "all students are taught the same challenging academic proficiencies as all other students."

      ITACs are to be used by local districts as a resource for program planning and as the basis for curriculum, instruction, and assessment; they will serve primarily secondary and adult programs, with articulation to postsecondary programs. Competencies from Tech Prep curriculum at both the secondary and postsecondary instructional levels will also be used and will describe the performance levels required for articulation from one level to the next. ITACs are intended to help instructors [1] expand student options for achieving career and educational goals; [2] integrate instruction and include active, project-based learning; [3] prepare students with a broad base of transferable career skills; [4] build partnerships between education and business/industry; and [5] support proficiency test scores.

      Implementation Strategy: The Division of Vocational and Adult Education is providing leadership in the development of ITACs. Ohio has embraced a four-step process in the development of ITACs: [1] national academic, employability, and occupational standards are reviewed and synthesized; [2] the standards are reviewed by both vocational and academic teachers; [3] the standards are updated and validated by business and industry representatives; and [4] the standards are reviewed for direct links to Ohio Competency-Based Education Models and Ohio Proficiency Test Learning Outcomes.

      Core, Career Cluster, and Specialization ITACs form the model in Ohio.

      Evolution of Strategy: The development and implementation processes are complex and labor intensive. For example, Ohio has found that in developing some of the Career Cluster competencies, national skill standards may not provide a broad enough base for an entire cluster. Furthermore, in developing the Core ITACs, Ohio found it was important that the standards be developed jointly rather than in isolation by subject matter or concept. In other words, when academic and employability components were co-developed, buy-in and ownership of the standards was created for both academic and technical educators. In the development of the Specialization ITACs, the integration of academics has been a difficult process, particularly when it comes to defining the academic skills that are equivalent to appropriate grade-level proficiencies.

      Outcomes/Lessons Learned: As a result of using academic, technical, and national and state curriculum models in the development of ITACs, there has been a wide range of support from all stakeholder groups, including academic and technical educators from both secondary and postsecondary education. The OCAPs have been effective in setting standards for vocational education, in providing the framework for accountability in the programs, and for identifying student competencies on career passports for employers and postsecondary institutions. Ohio is convinced that the new ITACs will reflect the higher order academic and technical competencies now required for students' academic and career success.


Reference

Information submitted by Joanna Kister, Director, Ohio Department of Education, Division of Vocational and Adult Education, Columbus, OH (1998, October 23).




Oklahoma's Occupational Skills Testing Program

Using Occupational Skill Standards to Measure Student Competency

      Policy Rationale and Goals: Since discussions began in the mid-1970s, Oklahoma has been committed to developing a system of skill certification for its vocational students. Over 16 years ago, the Oklahoma Department of Vocational and Technical Education (Oklahoma Vo-Tech), a separate agency from the Oklahoma Department of Education, began to develop a competency-based testing program to measure the skills of vocational students in its comprehensive high school vocational programs and area vocational and technical centers. The Testing Division began as a segment of Evaluation and Testing Services in 1982; in 1990, it became a separate unit within Oklahoma Vo-Tech, growing from a staff of three in 1982 to 15 in 1998. The Testing Division's mission is to accomplish the following:

Implement a valid system to measure occupational competencies of students and industry workers, to meet the evolving certification needs of Oklahoma and select national groups:

The Testing Division works closely with vocational programs to establish occupational standards and measures for their students. To meet the certification needs of our vocational programs and industry, the Testing Division develops and continually maintains a variety of occupational duty/task lists and tests.... Our duty/task lists are an important part of Oklahoma's system of standards and measures. They ensure that our curriculum addresses industry's needs and that vocational students have attained those skills necessary for job success. (Oklahoma Department of Vocational-Technical Education, 1998)

      Implementation Strategy: The testing program as originally envisioned by business leaders and the Oklahoma Board for Vocational and Technical Education was closely linked to the framework under development by the National Occupational Information Coordinating Committee (NOICC) and the state counterpart, the Oklahoma State Occupational Information Coordinating Committee (SOICC). NOICC is a federal interagency committee, established by Congress in 1976:

NOICC has two basic missions. One is to improve communication and coordination among developers and users of occupational and career information. The other is to help states meet the occupational information needs of two major constituencies: 1) planners and managers of vocational education and job training programs and 2) individuals making career decisions. (Oklahoma Department of Vocational-Technical Education, 1992)

      The NOICC and SOICC information systems are based on specific occupations. In turn, as Oklahoma built its system, it chose a testing strategy based not on broad occupational categories, industry areas, or general workplace competencies, but on specific occupations. Oklahoma's testing program builds curriculum and assessment from specific occupational duty/task lists for over 60 program areas. The validity of the duty/task lists, curriculum, and assessments is verified and endorsed by industry advisory groups for each program area.

      Since its inception, the Testing Division has developed over 60 duty/task lists, addressing approximately 300 vocational occupations. According to Chuck Hopkins, an Oklahoma Vo-Tech official, starting at the occupational level is most useful for their purposes. Creating a separate duty/task list based on an occupation gives instructors information about "all the skills that are required and a foundation for the mix of tests and curriculum." In addition, "employers can identify with the system -- people are hired for specific jobs" as opposed to broad industry clusters. The duty/task lists contain employability skills that often cut across occupations, and instructors can combine information from lists to meet the needs of students in their courses.

      Evolution of Strategy: Oklahoma Vo-Tech administers the program with the help of testing liaisons/assessment coordinators at the school sites. The analysis of all test results is done at the state level. According to Hopkins, the biggest changes occurring now involve "going to the Internet system" and keeping up with technology. The Testing Division envisions using the Internet for ordering curriculum and testing materials, managing the test results process, and providing information to school sites and Vo-Tech centers. Oklahoma Vo-Tech sees great promise in the use of technology for the testing program and hopes to use video-conferencing to provide inservice programs statewide. As Hopkins asserts, "Technology is the next step--as the whole system changes from teaching to learning."

      Over the years, Oklahoma Vo-Tech has also developed extensive curriculum materials and information that tie closely to the duty/task lists and assessments. The Curriculum and Instructional Materials Center distributes the materials and is in the process of developing an on-line catalog for teachers and school site coordinators. The Curriculum and Testing Divisions market the materials at shows and conferences and distribute catalogs widely throughout the state and nation.

      Lessons Learned: Oklahoma Vo-Tech is confident in its occupational testing program and counts among its successes the continued endorsement and validation received from industry. In addition, the state makes substantial investments in developing curriculum materials to support instructors. According to Hopkins, the Testing Division conducts regular customer surveys and focus groups and "is constantly looking at ways to improve--we go through all the hoops." They have found that while instructors are not required to use their materials, the majority consistently do.

      As states implement testing systems, it is important to remember that "the perfect system is almost impossible to administer--testing great numbers of students accurately is difficult and resources do limit the scope of our dream systems," explains Hopkins. Much research is conducted, and the assessment system is tested on incumbent workers in the construction, office, and sales industries, measuring both "hard" and "soft" skills. The close connections maintained by Oklahoma Vo-Tech with both industry and vocational instructors have resulted in "no political opposition to the system," and the program is consistently endorsed by the State Board of Vocational and Technical Education. The testing program has also maintained political support by avoiding "penalties for individuals" and "staying competency based."


References

Oklahoma Department of Vocational-Technical Education. (1997). NOICC information. Available on-line: <http://www.okvotech.org/soicc/national.htm>.

Oklahoma Department of Vocational-Technical Education. (1998). Occupational testing. Available on-line: <http://www.okvotech.org/testing/occptest.htm>.




Virginia's Core Academic Standards

Building Accountability Through Clear and Focused Standards

      Policy Rationale and Goals: Like nearly all states, Virginia has, since the early 1990s, been drafting and adopting a set of state standards that describe the skills and knowledge students should achieve at various stages of their educational careers. In 1995, Virginia adopted "Standards of Learning" in the core academic subjects of English/language arts, math, science, and history/social science. According to the American Federation of Teachers' Making Standards Matter 1997: An Annual Fifty-State Report on Efforts to Raise Academic Standards,

Virginia's standards are extraordinarily clear, focused, and well grounded in content. Their grade-by-grade and course-by-course structure ensures that they will be useful to teachers and other school staff regardless of the grade or subject they are involved in. ...They reflect some tough choices about what is most important for students to learn, rather than trying to cover everything. It is because of this combination of clarity, detail, content, and precision that we consider Virginia's standards 'exemplary' and worthy of a close look by other states. (Gandal, 1997, p. 95)

      In addition to adopting a set of standards in the four core academic content areas, Virginia is in the process of building other components of a standards-driven state accountability system that will support the integration of these standards statewide. According to a response from Virginia that was contained in the appendix of the AFT Making Standards Matter 1997 report, the state's Superintendent of Public Instruction at that time, Richard T. La Pointe, described four components of the state's "educational reform package: [1] rigorous standards in the four content areas; [2] assessments designed to assess those standards; [3] accreditation standards for schools which consider student performance on these standards in the accreditation decisions; and [4] a school performance report card to the public." (p. 160) The state's Standards of Learning were adopted after an extensive review process throughout the state.

      Implementation Strategy: Since adoption of the state's Standards of Learning, Virginia has been establishing a new assessment system to measure student competencies aligned with the state standards. In the spring of 1998, Virginia administered new Standards of Learning tests to public school students in grades three, five, and eight in math, English, history, and social sciences. End-of-course exams were also administered to high school students in eight subject areas. Committees responsible for setting the Standards of Learning reviewed the results of the high school exams to determine passing scores and recommended that students answer between 60% and 70% of questions correctly in order to pass and, eventually, to graduate (Portner, 1999, p. 181). "Not only will students, starting with the Class of 2004, need to have passed six of the eleven high school tests to graduate, but schools where less than 70 percent of students have passed the tests also will face the loss of their accreditation, starting in 2007." (Benning, 1998) Public accountability will be further enhanced by a state requirement that schools provide detailed information to the public on student achievement, results of Standards of Learning tests, crime statistics, and attendance. Districts are also given the option of including information deemed pertinent for explaining the results and putting the statistics in context. Teachers will be supported in their efforts to improve student learning by a $25 million staff development program, with $14.3 million for extra remedial courses.

      Lessons Learned: Like other states involved in developing standards-based accountability systems, Virginia has taken time to build support for the reforms and is giving schools and students the time needed to implement the wide range of changes. According to the Virginia Board of Education which adopted passing scores for 27 Standards of Learning tests in October 1998, the "implementation period is intended to give educators time to use test results to strengthen school programs where needed." (Virginia Department of Education, 1998) Academic standards and an emphasis on student learning results present both challenges and opportunities for vocational programs and schools with a focus on career preparation and technical training. For schools participating in the Southern Regional Education Board's High Schools That Work (HSTW) network, Virginia's emphasis on academic standards and assessment provides the possibility of deepening many of the changes in curriculum and instruction that are already beginning to occur. With its emphasis on using data for program improvement, HSTW schools have taken many of the first steps that all schools in the state will eventually need to take in order to incorporate a data-driven approach to school improvement. In January 1999, the Virginia Department of Education presented a one and one-half day workshop for teams from 64 HSTW sites on using data in a systemic approach to improving teaching and learning. While the workshops focused, in part, on using the 1998 HSTW assessment results, participants were also encouraged to use other school-based data, including results on the state's new set of assessments.


References

Benning, V. (1998, November 16). Will new Virginia school standards pass muster with the public? Washington Post web site. Available on-line: <http://www.washingtonpost.com/wp-srv/WPlate/1998-11/16/1641-111698-idx.html>.

Gandal, M. (1997). Making standards matter 1997: An annual fifty-state report on efforts to raise academic standards. Washington, DC: American Federation of Teachers.

Portner, J. (1999, January 11). Quality counts '99: Rewarding results, punishing failure. Education Week, 18(17), 181.

Virginia Department of Education. (1998, October 30). Virginia Board of Education sets passing scores (Press release). Available on-line: <http://www.pen.k12.va.us/VDOE/NewHome/pressreleases/oct3098.html>.




Washington's Industry Skill Standards Projects

Meeting Employer and Employee Training Needs Through the Community and Technical College System

      Policy Rationale and Goals: A wide range of efforts to develop standards-based systems that define and assess the skills and competencies of employees in various industries and occupations have been underway at the national and state levels for many years. Secondary and postsecondary education systems continue to grapple with improving the academic performance of all students and smoothing the transition from high school to postsecondary education and training or directly to work. In addition, employers have called upon educational institutions to more closely align curriculum and training programs to meet the needs of a rapidly changing economy.

      The National Skill Standards Board, established under the Goals 2000: Educate America Act, has supported the national effort to establish a voluntary skill standards system by providing financial support to 22 national industry skill standards pilot projects. While efforts to develop a voluntary, national skill standards system continue with the support of educators, private sector partners, and state workforce training systems, many states have chosen to forge ahead with developing industry skill standards that meet their own particular workforce development needs.

      One example is the state of Washington. Through funds from the School-to-Work Opportunities Act, the Washington State Board for Community and Technical Colleges has taken the lead in establishing the state's industry skill standards development process. Local community and technical colleges apply to the State Board for funding to support local partnerships as they develop industry-defined skill standards. These partnerships involve industry groups and leaders from community and technical colleges, K-12 schools, and four-year universities.

      In 1994, the state adopted a three-step process for local colleges to follow so that the products of local partnerships can form the basis for standards and assessments that will eventually be used statewide. Local partnerships (1) compile "the existing skill requirements developed by industry associations, employers, and labor unions"; (2) convene "focus groups of employers and workers to determine which skills are relevant to the state labor market"; and (3) validate "the work of these groups through representative samples of the industry statewide." (Hardcastle, 1998b) Currently, 19 skill standards projects are in various stages of development:

Completed Standards:
      Information Technology
      Allied Oral Health
      Cosmetology
      Secondary Wood Products
      Telecommunications
      Chiropractic Technicians

Standards Nearing Completion:
      Manufacturing
      Retail/Wholesale Trade
      Natural Resources Technology
      Law Enforcement
      Food Processing

More Recent Standards Projects:
      Audiology/Hearing Aid Technology
      Early Childhood Education
      Vocational Instructors
      Para-Educators
      Agriculture
      Travel and Tourism
      Chemical Dependency Counseling
      Optician Technology

      Implementation Strategy: The State Board for Community and Technical Colleges has taken the lead in monitoring the overall development of a state industry skill standards system. Individual community and technical colleges with particular industry specializations are selected to develop skill standards that will eventually be adopted statewide. (Klein et al., 1996, p. 23) Colleges, in partnership with business and industry, conduct industry meetings and apply to the state for funding to help with personnel, printing, and validation of materials, following criteria set by the State Board.

      The Chiropractic Technician Skill Standards project is one example of the general approach taken by development teams in Washington. Skill standards are defined as the "set of specific foundation skills, workplace competencies, technical skills, and performance criteria that are developed with cooperation between educators and representatives from business." (Highline Community College, 1998) After conducting research into prior projects and securing project funding, the development team conducted a facilitated Developing a Curriculum (DACUM) workshop, which clarified the Chiropractic Technician job by listing the major job functions and related tasks. This "task analysis" was sent to 500 chiropractic offices in Washington for review and validation. The results of the survey of professionals further defined the tasks most valued in the field. Finally, "Follow-up work with small focus groups, and a foundation skills survey provided the specific details of tasks that educators needed for curriculum development." (Highline Community College, 1998)

      Having completed and published the skill standards document, the development team is currently using the skill standards in the classroom and measuring any changes that result. A "flexible modularized curriculum" based on the standards is also planned: "Students will be able to start the program any time at their current level of competencies and finish the program when they master all the tasks." (Highline Community College, 1998) The next step in the process will be to develop assessments that measure the required levels of performance for meeting the skill standards which, in turn, can form the basis for certifying program completion and/or licensing, when appropriate.

      Evolution of Strategy: While the strategy outlined above remains largely unchanged, local projects have received technical support through a technical consultant and through the Washington State Skill Standards Resource Center. The Washington State Work-Based Learning Resource Center Guidebook is available on the Resource Center's Web site and responds to the concerns of development teams, which were frequently hampered by the lack of standardization of "both the content and process of skill standard development; often delaying progress for several months." The guidebook gives development teams "a 'standardized' yet flexible approach with a recommended skill standard development process and several state and national models. Additionally, a computer template of the process is included to allow for convenient and consistent formatting of [the] data."

      Outcomes: In answer to the question "Why do skill standards matter?" the Washington State Skill Standards Resource Center states in its guidebook that "updating skills and knowledge is now a lifelong endeavor, causing many employers and employees to spend more effort, time and money on education and training. Skill standards provide benchmarks for making education and training decisions, shaping curricula, and directing funds toward high value education and training investments."

      The commitment to developing skill standards continues at the state level, and the projects that are being developed will help educators and trainers understand both technical and more general work-readiness skills that workers will need for employment in high-skill, high-wage jobs. Assessment and program certification are some of the next steps for the state's industry skill standards system. The State Board intends for the process to lead institutions to develop appropriate curriculum and update programs that will be endorsed by industry.

       According to Hardcastle, "Getting to scale will require expanding the number of skill standards in high-demand and emerging industries and occupations and developing standards-based applications for education and state workforce agencies, business, and labor organizations." The State Board plans to continue building the system along these lines.


References

Hardcastle, A. (1998a, November 18). Skill standards as a key state strategy for workforce development. Olympia, WA: State Board for Community and Technical Colleges.

Hardcastle, A. (1998b, November 20). Education and training that meets industry requirements: Skill standards reflect an industry-education partnership. Olympia, WA: State Board for Community and Technical Colleges.

Highline Community College. (1998). Skill standards for the chiropractic technician. Midway, WA: Author.

Klein, S. G., Cuccaro-Alamin, S. C., Giambattista, J., Hoachlander, E. G., and Ward, B. (1996, November 12). Applying the standard: Using industry skill standards to improve curriculum and instruction. Berkeley, CA: MPR Associates.

Washington State Work-Based Learning Resource Center. (1998). Skill standards guidebook. Available on-line: <http://www.wa-wbl.com/resources_educators/skill_standards/sectioni_2.htm>.




Test Flights


Emerging Strategies for Setting Standards


Idaho: Academic Exiting Standards for Grade 9 - 12

      In January 1999, Idaho presented a third draft of academic "Exiting Standards" for grades 9 - 12 in five core subject areas: [1] math, [2] science, [3] social studies, [4] language arts and communications, and [5] health. As the 43rd state to develop standards, Idaho has learned from other states and strives to be inclusive of all stakeholders in the process. One hundred and thirty people participating in subject area subcommittees developed the first draft of "Exiting Standards" in each of the five subject areas. Each of the subcommittees is regionally representative and includes vocational and academic teachers, members of the business community, and parents. In October 1998, the second draft of the Exiting Standards was presented at public hearings. Based on the public comments, the third draft was developed and will be presented for an external review. The next steps for the third draft are approval by the State Board of Education and presentation to the legislature, which is expected to occur in February 1999. Standards for grades K-8, as well as curriculum guides and assessments for the grades 9-12 Exiting Standards, are also planned.


Maine: Articulating Secondary and Postsecondary Skill Standards

      Approximately ten years ago, Maine began the process of developing skill standards for its secondary and postsecondary technical education programs. In that process, Maine developed skill standards in approximately two dozen secondary occupational skill program areas. Because of cuts in state and federal funding in the early 1990s, skill standards development had to be put on hold. However, in 1994, under School-to-Work implementation funding, Maine resumed the standard setting process and is currently developing a new set of advanced academic and occupational skill standards in nearly twenty program areas. The goal of the Workforce Education and the Maine Learning Results Project, or WELEARN as it is known in Maine, is to complete skill standards in secondary workforce education; school-to-work programs; and, potentially, postsecondary education. Standards will be developed in close interaction with business and employer representatives, and, where possible, these standards will incorporate the frameworks completed under the National Skill Standards Board (NSSB). To that end, Maine has reorganized its current list of courses under the 15 industry sectors identified by the NSSB.


Wyoming: Setting Academic Standards From the Bottom-Up

      Using a grassroots, bottom-up approach to setting state standards, Wyoming is in the second year of the process. Building on the work of districts to develop standards with the participation of staff, parents, and community members, the Wyoming Department of Education is convening six regional standards development groups. These groups include representatives from each school district, the community college in that region, and a business and community representative. The groups are charged with examining the region's district standards and identifying commonalities across the districts. Each group then develops a set of regional goals and benchmark standards for grades 4, 8, and 11. After this first phase of the process, state groups are convened that also include representatives from the Department of Employment and an employer. The state groups use the work of the six regions to develop state standards documents. These documents are then compared to national standards as well as documents from other states. The third and final phase is the development of performance standards that describe proficient, advanced, and partially proficient performance at each benchmark level. These drafts are then submitted to focus groups for input and to public libraries and schools and placed on the Wyoming Department of Education web site for written comments. Standards are further adjusted and submitted to the Wyoming State Board of Education for approval. In 1997 - 1998, language arts and mathematics standards were developed, followed by science and social studies in 1998 - 1999. Foreign language and health and physical science are planned for 1999 - 2000, and career/vocational education and fine/performing arts in 2000 - 2001.




ASSESSMENT SYSTEMS

      After adopting a set of standards that define the essential knowledge and skills that are desired for all students, the next question often asked is, "What assessment strategies will most appropriately measure whether the standards and benchmarks have been attained?" As vocational programs, states, and industry-based partnerships begin to move beyond setting standards, policymakers must first establish the purpose of the system. In general, assessment systems fall under one or more of the following three broad purposes:

  1. To measure individual student learning.

  2. To certify the mastery of skills by students.

  3. To collect information on program performance.

      Defining the purpose of an assessment system is a critical first step in deciding the types of assessment strategies that will provide appropriate measures of student progress or levels of accomplishment. In addition, policymakers must decide the degree of technical quality that will fulfill the system's purpose or purposes. Questions of feasibility based on cost, time, and administrative complexity must also be addressed.

      As assessment systems are more closely tied to systemic education accountability initiatives, policymakers must ensure that the assessments are perceived by stakeholders as reasonable measures of student skills. In turn, educators, parents, students, and employers become vital participants in the validation process of developing a reliable and trusted assessment system. As assessment results are more widely used as communication tools and to gain public support, the credibility of assessment results becomes an increasingly critical aspect of a standards-based accountability system.




Kentucky's Assessment Program


Implementing Standards-Based Assessment to Better Track Students' Progress

      Policy Rationale and Goals: Under the far-reaching Kentucky Education Reform Act (KERA) of 1990, extensive procedures for measuring the achievement of Kentucky students were created under a system called Kentucky Instructional Results Information System (KIRIS). Kentucky policymakers viewed KIRIS as a strategy to track the progress of individual students, even if they moved from one county to another. In addition, KIRIS was intended to form the basis for incentive rewards to schools meeting goals for progress and for assistance to schools in need of support. Kentucky created goals that explicitly stated graduation expectations for students. The assessment system was built specifically around these goals and addressed the state-adopted expectations for students.

      First used in 1992, KIRIS tested all students in the state, and the results were used to determine whether schools and students had made sufficient progress in meeting the state's goals. Individual teachers at schools rewarded with incentive funds were allowed to vote on how the funds would be spent, and KIRIS provided a list of schools in need of assistance and state intervention. By tying incentives and state intervention to a new system of assessment, KIRIS proved quite controversial. In addition, KIRIS assessments were based primarily on portfolios, performance-based exams, and writing samples completed during class. While these are arguably more authentic measures of student achievement, Kentucky found that such assessments were harder to implement because of problems related to scoring reliability and comparability across schools. The confusion around the types of assessment and the significance of the results created further contention around the assessment system.

      In April of 1998, the Kentucky legislature replaced KIRIS with a new assessment program. This new system, the Commonwealth Accountability Testing System (CATS), is planned for rollout in 1999. The CATS test includes a national norm-referenced section that is tied to the state's core curriculum and allows national comparisons to be made for students in Kentucky. This addresses a weakness of the KIRIS test, which compared students only to Kentucky's student expectations and not to national norms. The CATS test will also use a shorter version of the writing exam that was part of the KIRIS test. The CATS test will provide information on each individual student, allowing for comparisons of individual student progress over time.

      It is likely that results from the CATS test will be included on student transcripts. High school transcripts will include information on students who complete vocational programs, certification information, community service, service learning, and any work-based learning completed. Although all students are required to take a college prep curriculum based on the reform efforts of the state (4 English, 3 science, 3 math, 3 social science, 1 fine/performing arts, 1 health/physical education, 7 electives), students must also have an individual graduation plan with a focus on career development.

      The Department of Education plans to offer an occupational skill standards certification system based on career majors for students that signals to employers a graduate's skill level. Kentucky's goal is to have a standards-driven assessment system that will drive curriculum and workforce training needs in the state.

      Implementation Strategy: The Department of Education began its ambitious effort by establishing a standards development and assessment implementation plan that identified all of the potential career major areas needing assessment development and prioritized the areas to be worked on first, second, and third. Many of these decisions were based on existing standards and on employer support in those areas. Both labor market data and course enrollment data will be used to fine-tune the phase-in plan as the work progresses.

Excerpt from Kentucky Assessment Implementation Plan
(activities are phased across the 1998-1999, 1999-2000, and 2000-2001 school years)

Career Cluster: Agriculture
      Horticulture: Develop Standards & Assessments; Pilot Standards & Assessments; Implement Standards & Assessments
      Production: Develop Standards & Assessments; Pilot Standards & Assessments

Career Cluster: Business & Marketing
      Administration Support Services: Develop Standards & Assessments; Pilot Standards & Assessments; Implement Standards & Assessments
      Financial Services: Develop Standards & Assessments; Pilot Standards & Assessments
      Retail Services: Pilot Standards & Assessments; Implement Standards & Assessments
      Travel & Tourism: Develop Standards & Assessments; Pilot Standards & Assessments

      Within each program area, a committee of employers and educators has been established to adopt, adapt, or develop standards. When national standards are available, the committee relies on those standards first. For example, in the area of manufacturing, the committee utilized resources from the National Skill Standards Board's Building Linkages project and the National Coalition for Advanced Manufacturing (NACFAM) national skill standards to develop Kentucky's own Manufacturing Skill Standards.

      Standards are being developed for use at the secondary, postsecondary, and remedial education levels. Standards are developed in Levels 1 and 2, with Level 2 targeted to postsecondary education. Students will have the option of a more general career major certificate and a certification in a specific industry (e.g., within manufacturing). Standards-setting efforts in Kentucky are following the lead of the Manufacturing standards by grouping standards in the following categories: Application of Academic Expectations, Employability Skills, and Occupational Skills.

      The state will develop a multiple-choice test and a problem-based scenario in each career major, both to be scored at the state level. The problem-based scenario will be scored on a four-point analytic rubric that uses the categories of knowledge and skills found in the standards. The scenario assessment will include the prompt, instructions, and evaluation criteria, to which students will respond in a 45-minute period. In some career majors, there may be locally scored assessment components, such as portfolios, that are used and kept at the local classroom level.
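
      To make the scenario-scoring mechanics concrete, the short sketch below (written in Python purely for illustration; it is not Kentucky's scoring software, and the function and variable names are invented for this example) records a scorer's four-point rating in each of the three standards categories named above and reports a simple total and average for one student response.

# Illustrative sketch only (not Kentucky's actual scoring software): a minimal
# way to record and summarize four-point analytic rubric ratings for one
# student's problem-based scenario response. Category names follow the
# groupings described in the text; all function and variable names are invented.

RUBRIC_CATEGORIES = [
    "Application of Academic Expectations",
    "Employability Skills",
    "Occupational Skills",
]
SCALE = (1, 2, 3, 4)  # four-point analytic rubric


def summarize_scenario_scores(ratings):
    """Validate one student's category ratings and return a simple summary."""
    for category in RUBRIC_CATEGORIES:
        if ratings.get(category) not in SCALE:
            raise ValueError("Missing or out-of-range rating for " + category)
    total = sum(ratings[c] for c in RUBRIC_CATEGORIES)
    return {
        "ratings": dict(ratings),
        "total": total,  # ranges from 3 to 12
        "average": round(total / len(RUBRIC_CATEGORIES), 2),
    }


if __name__ == "__main__":
    example = {
        "Application of Academic Expectations": 3,
        "Employability Skills": 4,
        "Occupational Skills": 3,
    }
    print(summarize_scenario_scores(example))

      A real state-level scoring process would, of course, also involve rater training, double-scoring, and reporting, which this sketch omits.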

      Evaluation Strategies/Lessons Learned: The lack of national standards at the career major level has been the most disappointing aspect of Kentucky's effort. While specific standards in a variety of occupations do exist, very few broader, secondary-level standards in career majors have been developed. Kentucky is working hard to develop its own standards, based on whatever standards do exist at the national level, in an attempt to use a similar format across career majors.

      Kentucky plans to move rapidly through the standards and assessment process in order to use the system to certify students and to report academic and occupational attainment under Perkins III accountability requirements. With the help of the National Center for Research in Vocational Education and the Vocational-Technical Education Consortium of States (V-TECS), it is hoped that the system will contribute to both workforce and economic development throughout the state of Kentucky.


Reference

Information provided by James Justice, Program Consultant, Kentucky Department of Education, Division of Secondary Vocational Education, Frankfort, KY (1999).




New Mexico's Youth Opportunities in Retailing Model Program


Partners in Implementing National Retail Skill Standards and Assessments

      Policy Rationale and Goals: Through grant funding from the National Skill Standards Board (NSSB) and the U.S. Department of Labor, the National Retail Federation (NRF) convened a task force of retailers (sales associate to CEO level) and education, labor, and government representatives to develop pilot standards for the Professional Sales Associate. This work was supported by the National Retail Institute (NRI), the nonprofit research and education foundation of the NRF. The NRI is currently managing the Sales & Service Voluntary Partnership™. The Sales & Service Voluntary Partnership is a national body serving as the catalyst for skill standards development for the retail, wholesale, real estate, and personal services industries.

      While the Sales & Service standards continue to be developed, NRI has moved ahead with developing an assessment system to complement its pilot retail standards for the Professional Sales Associate. This assessment system has two options: [1] the written or computer-based Retail Readiness Assessment (RRA) and [2] the video-based scenario AccuVision Retail Assessment System. To implement the use of its pilot standards and assessment system, NRI has developed or is currently developing a range of materials and programs. DECA, the national student organization linked to Marketing Education programs in every state, has been a strong partner with the NRF/NRI in the development and dissemination of retail skill standards. DECA has been involved in the piloting of the RRA and the AccuVision Retail Assessment System and continues strong involvement in the NRI's School-to-Work initiative -- Retail Employer Link to Education (RELE).

      Retail Readiness Assessment: The Retail Readiness Assessment (RRA) is available in pencil and paper or computerized format. The NRI worked with NCS London House, a national assessment development company, to create the RRA. The RRA is designed to measure key qualities of a successful Professional Sales Associate, including customer service aptitude, customer service attitude, confidence/influence, sales aptitude, sales responsibility, and service knowledge. NCS London House does the scoring for the paper version. On-site scoring and results are available for computer-administered tests using software available for purchase from NRI (called Quanta for Windows). Results from the assessment are summarized in a two-page Composite Index that provides a quick reference to the individual's overall readiness to provide excellent customer service. The first page of the report analyzes the individual's scores in the tested areas of customer service, sales, and validity (measuring the test taker's accuracy and candidness). A second page of the report examines positive and negative behavioral indicators based on the individual's responses in each of the areas measured by the assessment. Below are two sample questions from the RRA:

Retail Readiness Assessment Sample Questions
   1) Sometimes an item a client wants is out of stock. It is a good policy for a salesperson to:
(CUSTOMER SERVICE APTITUDE)
   a) Tell the customer it will be shipped tomorrow
   b) Tell the customer there is no way of knowing when the item will be available
   c) Offer to telephone the client when the item comes in and is being shipped
   d) Tell the customer to telephone periodically to see if the item is in
   2) An employee's attitude toward work can influence customer satisfaction very much.
(CUSTOMER SERVICE ATTITUDE)
   a) Strongly agree
   b) Moderately agree
   c) Slightly agree
   d) Slightly disagree
   e) Moderately disagree
   f) Strongly disagree

      AccuVision Retail Assessment System: The second option in the NRI assessment system is a tool that employs video job simulation and computer scoring to capture the skills and abilities required for success in sales associate positions. This assessment system contains video simulations of various job situations and presents four possible responses, each varying by degree of correctness. The situations presented are based on research of realistic job situations and use the Professional Sales Associate skill standards as a foundation. Validation studies comparing scores on the assessment to actual job performance have been conducted. Because the assessment is video-based, the reading ability of test takers has little bearing on their performance. The assessment system includes three modules: [1] customer relations, [2] sales skills, and [3] training. Each of these includes an analysis of the sales associate's ability to learn and apply new information.

      Implementation Strategy: Both the written RRA and the video-based AccuVision assessments are designed to serve a variety of purposes. The assessments can be used to assess the skills and abilities of potential job candidates, to determine the training needs of an employer's existing sales associates, or to develop curriculum for educational and job training programs. By creating flexible, stand-alone assessments that are aligned to national skill standards, the NRI hopes to create a system that is useful for a wide range of organizations involved in building a high-performance workplace in the retail industry.

      NRI partners with a range of retail industry, education, and workforce development organizations to implement a full spectrum of options using its skill standards and assessment systems. Retail Employer Link to Education (RELE) is a recent School-to-Work initiative that grew out of an industry need and is funded through a grant from the National School-to-Work Office. The NRI is helping its affiliated state retail associations implement the Youth Opportunities in Retailing (YOR) program through the RELE initiative.

      The YOR program model was developed by the New Mexico Retail Association, in partnership with member retailers and DECA advisors in the state. Through the RELE initiative, at least seven state retail associations will partner with member retailers, DECA, and other organizations to implement the YOR program over the two-year grant period. According to Corinne Berkseth, a national DECA staff member working on loan to the NRI, under this model, a retail state association staff person is designated to broker the relationship between schools, students, and retailers. DECA continues to play a primary education partner role in New Mexico and in the other implementation states.

      The mechanics of the YOR program include introducing students to the retail industry, providing them with pre-employment training, assisting with part-time job placement, and monitoring student and retailer progress. An initial presentation on the program and retail career possibilities is given in Marketing Education/DECA or other classes by YOR staff. After the presentation, students apply for the program. As part of the application process, students participate in a seminar led by YOR staff. This pre-employment seminar is approximately three hours long and usually takes place after school at the school site. An overview of job seeking and desired employability skills is provided. Students then take the Retail Readiness Assessment and conduct a mock job interview with YOR staff. Potential retailers are identified, and the student applies and interviews directly with the companies. Once the student is hired, he or she is officially enrolled in the YOR program. YOR staff monitors the student's progress on the job in partnership with his or her employer to ensure both student and employer are benefiting. YOR hosts two full-day seminars during the school year so that students continue to learn how they can progress in their current positions and how their success can lead to career opportunities in the retail industry or other arenas. In addition, grades are monitored and performance evaluations are conducted on a regular basis.

      Outcomes: The NRI anticipates continuing its partnership with DECA to implement the YOR program. It expects YOR will become one option for implementation of skill standards and assessments in a manner that is systematic and sustainable at the local, state, and national levels. Today, the YOR program has grown to include partnerships in Arizona, Hawaii, and Massachusetts, with plans for expansion into three additional states later in 1999. In the spring of 1999, the NRI will publish a follow-up to its Implementation Guide of 1996, updating best practices of all kinds for integrating model retail skill standards, aligning curriculum, and using its assessments.


References

Information regarding the Retail Readiness Assessment is available on-line from the National Retail Federation web site: <http://www.nrf.com/nri.rra.htm>.

      Information materials regarding the AccuVision Retail Assessment System can be obtained from Alignmark, 258 Southhall Lane, Suite 400, Maitland, FL 32751-7457, (800) 682-4587.




California's Career-Technical Assessment Program (C-TAP)


A Voluntary System to Assess Vocational Skills and Improve Instruction

      Policy Rationale and Goals: California's Career-Technical Assessment Program (C-TAP) began in 1990 and was developed by Far West Laboratory (FWL), now known as WestEd, under contract to the California Department of Education. Since 1990, four assessment components have been developed for C-TAP for each of five vocational program areas: [1] agriculture, [2] business, [3] health careers, [4] home economics, and [5] industrial technology education. The components include written scenarios, projects, portfolios, and, more recently, multiple-choice tests.

      C-TAP was originally proposed as a statewide performance-based certification system for vocational students. Under C-TAP, students were to be given the option of demonstrating competency for employment or advanced training based on their mastery of California Model Curriculum Standards (MCSs) for programs offered in California high schools and Regional Occupational Centers or Programs (ROC/Ps).

      Over time, the assessment components designed for C-TAP evolved into an assessment system that helped instructors improve instruction in the classroom, rather than certify students for employment or further education. The program has again been adapted for another purpose. This time the focus is on assessing a smaller number of broad-based clusters of job skills and recognizing student achievement in a format similar to that of the statewide academic merit test called the Golden State Exam. According to Sri Ananda of WestEd, like the Golden State Exam, the Assessments in Career Education (ACE) is a voluntary assessment system where students can receive a seal of recognition on their diploma. In the spring of 1999, the ACE program will be administered in five career technical areas: [1] agriculture, [2] computer science and information systems, [3] health care, [4] food services and hospitality, and [5] industrial and technology education.

      There have been many reasons for the continuing evolution of C-TAP, including shifting political climates regarding assessment, department of education reorganization and changing purposes, and the lack of centralized direction to provide incentives for participation and consistent implementation. While the particular focus of the program has evolved over time, the primary goal of C-TAP remains the same: to provide better information to teachers, students, and other stakeholders about the achievement level of students through multiple forms of assessment. According to Daniel McLaughlin of WestEd, C-TAP is "one of the best [assessment] tools out there to teach to and assess occupational work standards." It is hoped that the information gathered through C-TAP assessment strategies will continue to help teachers improve their instruction through multiple means of measuring the broad skills and knowledge necessary to succeed in the workplace.

      Implementation Strategy: The C-TAP system consists of four components: [1] multiple-choice questions, [2] written scenarios, [3] student projects, and [4] portfolios. The components are linked to Challenge Standards, formerly called Model Curriculum Standards (MCSs), in the five vocational areas and general workplace readiness standards. Because C-TAP is a voluntary system, teachers can use all of the components or choose those pieces that best suit their needs. Details of the four components follow:

  1. On-demand, multiple-choice/short answer tests which have been developed and are part of ACE.

  2. Written scenarios involving problem-solving tasks requiring immediate written responses to job-related problems. Students may "rehearse" or practice the scenario, but the official assessment is an on-demand task administered under standardized, secure testing conditions with a 45-minute time restriction. Written scenarios are scored holistically based on four levels of achievement using rubrics developed by WestEd: [1] Low Basic, [2] Basic, [3] Proficient, and [4] Advanced.

  3. Projects allowing students the opportunity to show their expertise and knowledge of one or two standards in their area of study. They involve a hands-on application of skills as well as an oral presentation of the knowledge and skills they learned during the course of the project. Projects include the following:

    • A plan - the steps of the project and the material that will be submitted for evaluation

    • Evidence of progress - three sources demonstrating the skills used and the work accomplished before submitting the final product

    • Final product - a demonstration of mastery of the standards, which may include the actual project or other documentation (e.g., a videotape)

    • Oral presentation - a description of the project, the knowledge and skills learned, and a self-evaluation

  4. Portfolios that include the following:

    • Presenting the Portfolio -- students introduce readers to the portfolio through a letter of introduction and a table of contents

    • Career development package -- an employment or college application, a letter of recommendation, and a résumé

    • Work samples -- at least four products that illustrate mastery of skills detailed in at least one MCS

    • Writing sample -- a demonstration of their writing skills

    • Supervised practical experience evaluation -- an illustration of skills the student has mastered in a work-based environment (This component is optional)

      According to WestEd, the portfolio component has become the most widely liked and used part of C-TAP. Because of the evolving purpose and the subsequent "low-stakes" emphasis of C-TAP components on improving instruction rather than certifying students for employment, most of the focus on implementation has been on supporting the use of the materials in the classroom, not on investing in the technical quality of the instruments. ACE has been studied, however, and has been shown to have adequate levels of technical quality. In addition, the Teacher Guidebook and Student Guidebook created by WestEd have become essential to the implementation of the program.

      Evolution of Strategy: A critical barrier to statewide implementation and consistent adoption of C-TAP has been shifting political winds and the reorganization of the California Department of Education. In addition, changes in oversight of the program from Vocational Education to the Student Performance Division have further affected the purpose and focus of the program. In part, C-TAP has survived these changes because teachers perceive the components as flexible and the system as responsive to their need to improve instruction and assessment together.

      More recently, C-TAP has again undergone changes; this time it has moved from strictly performance-based and constructed-response assessments to a standardized multiple-choice exam with open-ended questions. As the stakes become higher and C-TAP components are more widely used, usability and adaptability must, in part, give way to the creation of an assessment system that is accessible statewide and administered fairly.

      Outcomes/Lessons Learned: C-TAP is an example of a voluntary assessment system that has evolved to meet the needs of instructors and students in improving instruction, curriculum, and assessment. Because of its voluntary nature, there has been no consistent or standard implementation of any of the components. Teachers may select those components that best serve their needs, use them in any order and at any time during the year, and develop their own grading systems. While the guidebooks produced by WestEd have been praised by teachers, they are designed to improve how teachers use the materials rather than to increase the reliability of scoring or the consistency of use from teacher to teacher or class to class.

      The benefits of C-TAP to teachers include knowledge about state standards and their integration with course objectives and a resulting modification of instruction. Writing, speaking, and self-reflection are emphasized, and vocational teachers report being encouraged to collaborate more with academic teachers. However, it is unclear how useful the C-TAP components are for students entering the workplace or applying for postsecondary training and education. In addition, as California makes plans to modify C-TAP into a student merit exam, further work will be necessary to ensure technical quality and fair implementation for all students.


Reference

Evaluation and Training Institute. (1997, October 3). Evaluation of the Career-Technical Assessment Program (C-TAP). Final report submitted to the Sacramento County Office of Education. Los Angeles: Author.




Test Flights


Emerging Strategies for Assessment Systems


North Carolina: Vocational Competency Achievement Tracking System

      North Carolina's Vocational Competency Achievement Tracking System (VoCATS) incorporates course competencies developed by educators and industry representatives, and end-of-course exams to help teachers measure student progress and adjust teaching methods, course content, and materials. Course competencies, or Blueprints, include both academic and technical competencies and form the basis for the curriculum and course assessments in VoCATS. The system now includes a computerized test item bank for virtually every course taught in the state, or about 100 of the 110 courses that are available. The assessment items are multiple-choice and are reviewed and revised every five years. At the beginning of each year, students take a pre-test generated by the North Carolina Department of Public Instruction. The tests are scored locally by teachers who then scan the results into "Testmate," a CTB-McGraw Hill management program which generates a report detailing initial student competency. Interim tests are used throughout the year to assess student competency and modify courses during the year. The Department of Public Instruction administers a post-test at the end of the year which is used both to report performance information to the state and for instructional planning at the site. North Carolina first began to test vocational students in the 1970s and over the years the assessment system has continued to evolve, enabling state staff and local educators to improve the administration and technical quality of the assessments and the vocational education system in general.


Michigan: Employability Portfolios

      Beginning in the 1992 - 1993 school year, Michigan introduced an employability skills portfolio for students to use in grades 8 to 12, which was intended to help schools implement the Michigan State Board of Education's model core curriculum, adopted in 1990. In particular, the portfolios addressed Career and Employability skills, one of nine areas in the state's core curriculum. To support more consistent implementation of the portfolio system, the Michigan Department of Education created an Employability Skills Assessment Kit detailing the elements of a portfolio and presenting standardized procedures for implementation at schools, workshops, and conferences. Schools with experience in using the portfolios served as resources for other schools starting to use the system. Although the portfolios are no longer supported as a state-level program, portfolio materials are available, and anecdotal evidence suggests that they continue to be used, especially in those districts and schools whose use of the portfolios reinforced the benefits of this kind of assessment for students. Based on employer feedback, the employability portfolios were designed to assess skills in three areas: [1] academic, [2] personal management, and [3] teamwork. Academic skills included communication, math, problem solving, and science and technology. Personal management covered career development, flexibility and initiative, organization, and responsibility. Teamwork was assessed based on student contributions to and membership in a team, responsiveness, and team communication skills. Within each of the three areas, four benchmarks and a list of skills under each area provided the basis for assessment. Student exhibits such as pictures, videotapes, or written documents were used to demonstrate the student's skills for each benchmark; these were reviewed and approved by a teacher and counselor.


Ohio: Multidimensional Assessment System

      In 1990, Ohio initiated a ten-year strategic plan for vocational programs. In 1996, the updated plan, "Ohio's Future at Work: Beyond 2000," continued to reflect the state's push for increased accountability for vocational programs. As programs strive to become more performance oriented, Ohio has found that the availability of multiple strategies for assessment will play an increasingly critical role in continuous improvement. Because of the expanded role of assessment in program accountability, the division of Vocational and Adult Education has been promoting a multidimensional system of student assessment. Ohio currently offers Occupational Competency Analysis Profiles (OCAPs) tests in 38 vocational program areas that measure student competence in technical areas of instruction and Work Keys from ACT that measure academic skills expected in the workplace. As a complement to the state's traditional paper and pencil testing system, a variety of performance-based assessments are being considered. Assessments structured around real world, career-based performance tasks are currently being piloted in business, agriculture, and the family and consumer sciences areas. Those outside of the student's regular classroom, including other teachers, parents, students, and business persons, are expected to play a role in the evaluation process. While these instruments may not be used with all students, it is anticipated that a portion of Ohio's students will have access to the assessments through the state's assessment system. Practices that incorporate multiple measures of student performance over time will provide the third dimension of Ohio's assessment system. One example is a student portfolio containing results of paper and pencil assessments, performance measures, teacher and employer observations, and student reflections on their work. While Ohio is not using portfolios on a statewide basis, the state is exploring how to collect information that demonstrates student progress, perhaps through vocational student organization activities in the assessment process.


South Carolina: Vocational Student Certification System

      In South Carolina, plans are underway to implement a state assessment system to certify student competency in secondary vocational programs. Students who pursue a course of study, completing four units in a vocational program area, will be eligible to take the state assessment. Students who pass the test will receive a certificate from the state which can be used when they enter the workforce. If students choose to pursue further postsecondary education, the certificate will document the skills in which they are competent, giving them the opportunity to enter at a higher level than they otherwise might. In the spring of 1999, assessments with both performance-based and standardized test elements will be piloted in 11 areas. The assessments have been developed with the assistance of V-TECS and NOCTI and are customized for South Carolina. Each assessment area has an advisory council representing members of the business/industry and education communities. To support the integration of this new assessment system in vocational courses, South Carolina is in the process of developing crosswalks of the state's academic standards to the program areas in the assessment and student certification system.


Utah: Assessments for Program Accountability and Student Certification

      About five years ago, in return for increased program accountability, the Utah legislature agreed to augment funding for vocational education. The legislature passed a state law allowing a portion (12%) of funds for applied technology education to be distributed as incentives to programs demonstrating student proficiency. In order to implement this system of accountability for vocational programs and distribute incentive funds, however, measures of student competency had to be developed quickly. From initial tests developed in word processing, information processing, and accounting, Utah's student assessment system has grown to approximately 85 end-of-course and end-of-program multiple-choice and performance assessments. Approximately 78,000 students took the exams last year, up from 4,000 in the first year. The students' assessments are sent to the Utah Applied Technology Resource Center for scoring. This Resource Center also provides assistance and guidelines for teachers. A Protocol Manual has been developed as a guide to the tests and reports sent to the vocational education program. Programs receive a portion of their funding based on student scores. In addition, students scoring above 80% receive a certificate that shows the competencies demonstrated by the exam. Because the testing program had to be up and running in a relatively short period of time, Utah is now in the process of having an outside evaluation team examine the technical quality and reliability of its assessments. Although some assessments may be changed as a result of the current analysis, the Utah Department of Education has relied on the assessment system to encourage teachers and programs to teach to uniform standards, objectives, and competencies, thus facilitating the shift to a performance-based model in Utah.
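
      As a rough illustration of the two numeric rules described above, the Python sketch below applies the 80% score threshold for a student certificate and sets aside 12% of applied technology funds as an incentive pool. The rule that divides the pool among programs (weighting by mean score) is an assumption added for illustration; the text does not specify Utah's actual allocation formula.

# Illustrative sketch only. The 80% certificate threshold and the 12% incentive
# set-aside come from the description above; the score-weighted allocation rule
# is an assumption, not Utah's actual formula.

def earns_certificate(score):
    """Students scoring above 80% on the assessment receive a certificate."""
    return score > 80

def allocate_incentives(program_mean_scores, applied_tech_funds):
    """Split the 12% incentive pool across programs in proportion to each
    program's mean score (a hypothetical weighting)."""
    pool = 0.12 * applied_tech_funds
    total = sum(program_mean_scores.values())
    return {program: pool * score / total
            for program, score in program_mean_scores.items()}

if __name__ == "__main__":
    print(earns_certificate(83))                                   # True
    print(allocate_incentives({"Accounting": 82.0,
                               "Word Processing": 74.5}, 1_000_000))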




CURRICULUM STRATEGIES

      Developing the curriculum to support academic, occupational, technical, or workplace readiness standards is another step in a standards-based accountability system. States sponsor a wide range of strategies to support and assist local practitioners in the implementation of educational practices to improve student achievement. Technical assistance to local districts supports curriculum development and instructional improvement. This assistance can be provided to local educators directly by state-level staff and also by training local practitioners to become a network of trainers that will be able to reach other colleagues at the school level. In addition, states support educators through the collection and dissemination of lesson plans, integrated projects, and other approaches to implementing standards-based instruction in the classroom. Other strategies to support standards in the classroom include developing products, such as crosswalks of academic and vocational standards and curriculum guides, or sponsoring networks of model implementation sites that exemplify promising strategies for implementing reforms.




Maryland's Integrated Projects


Providing Technical Assistance, Resources, and Training To Support State Standards

      Policy Rationale and Goals: In the mid-1990s, the Maryland School Performance Program (MSPP) introduced a wide range of changes in the state's education system which were targeted at implementing high standards and holding schools, teachers, and students accountable for student performance. MSPP is described by the Maryland State Department of Education as the "state's blueprint for educational accountability and school improvement." (Maryland State Department of Education, 1998) MSPP standards for grades K-8 are called Learning Outcomes and for grades 9-12, High School Core Learning Goals. Learning Outcomes and Core Learning Goals have been adopted in English, math, science, social studies, and Skills for Success, which include communication, technology, and other broader competencies. The Learning Outcomes and Core Learning Goals form the basis for a standards-driven assessment system which will be phased in for high school students as 9th grade tests in reading, writing, math, and citizenship required for graduation beginning in 2001. Student achievement on the Learning Outcomes is already assessed in grades 3, 5, and 8 in the four core subjects using assessments developed by the state based on the standards. (Glidden, 1998, p.54) According to the state's timeline, by 2005, students will have to pass exams in three subjects (government, English, and either geometry or algebra) to graduate from high school. By 2012, students will be required to take ten tests to earn a diploma. The test areas that are planned include algebra; geometry; U.S. history; world history; government; English 1, 2, and 3; and two science tests chosen from earth and space science, physics, chemistry, and biology. (Quality Counts '98, 1998, p.150)

      To help both academic and vocational teachers implement Core Learning Goals in their classrooms, Maryland has developed a statewide staff development program to train teachers to create integrated projects. In Maryland, an integrated project links at least one academic area and one Skill for Success specified in the Core Learning Goals to at least one career area. As the state embarks on the journey to improve the achievement of all students, technical assistance to teachers is viewed as a vital tool to provide effective strategies for increasing student achievement. Through an educational program embedded with material and instructional strategies that are applicable outside the four walls of a classroom, Maryland hopes to reach the diverse learning styles of all students while at the same time holding them accountable for learning.

      Integrated projects may span multiple courses or be contained in one course. In order to forge a clear link between academic and technical areas, integrated projects must specify the Core Learning Goal or Goals that are addressed. In addition to focusing on a particular Core Learning Goal, the projects must address one of the nine career clusters that Maryland has identified.[2] The projects often include sample industry standards for the career cluster chosen, thus ensuring that the projects are linked to the skills identified by employers as vital to employment and truly reflective of real-world opportunities.
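
      The required elements of an integrated project can be pictured as a simple record, as in the hedged Python sketch below. The field names, the sample cluster name, and the validation rule are illustrative assumptions, not Maryland's actual project template.

# Hypothetical record capturing what the text says an integrated project must
# specify: at least one Core Learning Goal (academic area or Skill for Success),
# at least one of Maryland's nine career clusters, and, often, sample industry
# standards. Field names and the example cluster name are illustrative.
from dataclasses import dataclass, field

@dataclass
class IntegratedProject:
    title: str
    core_learning_goals: list          # e.g., ["English", "Skills for Success: communication"]
    career_clusters: list              # at least one of the nine state clusters
    industry_standards: list = field(default_factory=list)  # optional sample standards

    def meets_minimum_requirements(self):
        # At least one Core Learning Goal and one career cluster must be named.
        return bool(self.core_learning_goals) and bool(self.career_clusters)

project = IntegratedProject(
    title="Providing Health Care to Diverse Clients",
    core_learning_goals=["English", "Skills for Success: communication"],
    career_clusters=["Health care (illustrative cluster name)"],
)
print(project.meets_minimum_requirements())  # True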

      Implementation Strategy: According to the Maryland State Department of Education's description, Integrated Project Workshops are organized by both the Division of Career Technology and Adult Learning and the Division of Instruction and Staff Development. These two divisions are part of the Department of Education and had partnered only on a limited basis before the Integrated Project Workshops. As the discussion around the upcoming assessment process centered more and more on Core Learning Goals, the two divisions saw the need to work closely together to ensure the successful implementation of Core Learning Goals throughout the state's educational system.

      Maryland provides statewide training opportunities to teachers twice a year which focus on the use and planning of integrated projects. In these training sessions, academic teachers and career technology teachers work together in teams to create and plan integrated projects. The teams may also include business partners in the community and postsecondary instructors who are often invited to add specific expertise or to ensure that the projects are current and relevant to students' future career goals. To date, over 970 participants have received training.

      During the state workshops, participants receive a day of instruction and time to develop and plan their project with the assistance of State Department of Education staff. Once the teachers return to their own schools, they will need additional time to further develop and refine the projects. Because the projects are often very complex and involved, planning and development typically lasts a semester with the projects implemented the following semester. The projects are often designed to reinforce something already being taught by the teacher. "[Integrated projects are] often a better way to teach students concepts and skills already outlined in the course content," says Pat Mikos of the Maryland State Department of Education.

      The state's workshops are divided into two strands. In Strand One workshops, participants are given an overview of the integrated project process and philosophy. Their projects deal with the integration of only one academic subject and one occupational area so that what the teams are learning may be tested and refined. Strand One instruction is required of the leaders of teams who wish to attend Strand Two workshops.

      Strand Two workshops cover the integration of multiple academic and vocational topics, including the integration of the entire curriculum by a broader multidisciplinary team. All team members attending Strand Two workshops are also required to implement, in their own classrooms, the integrated projects their team has developed.

      An example of an integrated project is "Providing Health Care to Diverse Clients." In this project, students perform such tasks as examining issues that may arise as a hospital serves more diverse patients, reviewing health standards and their implications for health care for a diverse population, visiting a hospital where these issues are likely to arise, and producing a brochure or report on their strategy to address a particular health care need.

      Evolution of Strategy: Forty-five projects have now been disseminated, and an additional 20 projects will be disseminated this year. The projects are disseminated to every high school in the state regardless of whether the school sent teachers to the Integrated Projects Workshop. Prince George's County, one of the larger counties in Maryland, also posts all projects on its web site so that they are accessible to parents and educators outside of Maryland as well.

      Some postsecondary teachers are involved in the creation of these projects, and there is an effort to get more teachers from this level involved. The goal of this involvement is greater articulation between the high school and postsecondary levels. In addition, there are plans to expand the number of training opportunities offered to teachers so that more may participate in the training.

      Lessons Learned: An early lesson of this process was that the projects must "meet the needs of the teacher," according to Mikos. In Maryland, this meant that teachers must see that the projects address and reinforce the state standards. With the upcoming changes in high school assessment raising the stakes on student achievement, teachers want to know that they are helping students succeed.

      Another lesson learned was that no one department or agency can provide all the necessary training. A cadre of experienced teachers must be developed to help train other teachers, and local educators must teach each other. Because teachers are often more willing to experiment when they are learning new techniques from their colleagues, this method of instructional delivery will provide professional benefits to the state's teachers as well.


References

Glidden, H. (1998). Making standards matter 1998. Washington, DC: American Federation of Teachers.

Maryland State Department of Education Career Connections web site. Available on-line: <http://www.mdse.state.md.us/factsndata/mdcareer.html>.

Maryland State Department of Education. (1998, February 28). State announces reconstitution-eligible schools (Press release). Available on-line: <http://www.mdse.state.md.us/sco/pressreleases/980128.html>.

Prince George's County, Maryland web site. Available on-line: <http://www.pgcps.pg.k12.md.is/~connect/biintro.html>.

Quality Counts '98: Rewarding results, punishing failure. (1999, January 11). Education Week, 18(17), 150.




Crosswalks in Kansas and Kentucky


Integrating Academic and Vocational Education by Crosswalking Standards

      Policy Rationale and Goals: Through both national skill standards projects and efforts at the state level to set standards for students, more and more practitioners are using the process of "crosswalking" to show the relationship between two or more disciplines. In a crosswalk, one set of standards is compared to another to show how the two sets connect or overlap. With most states having adopted academic curriculum standards, vocational educators are eager to demonstrate to academic practitioners and parents how academic competencies are taught in vocational education and where academic concepts are embedded in the curriculum. Once completed, a crosswalk is essentially an organizational tool that focuses educators on natural points of integration and can be used as the first step in developing integrated, standards-driven curriculum and assessment.

      Implementation Strategy: The first step in developing a crosswalk is to adopt or adapt at least two sets of standards; a crosswalk must include at least one set of academic standards (such as mathematics) and one set of vocational standards. The second step is to select the crosswalk's purpose. Is the crosswalk to set course expectations for students? To embed academic competencies in a vocational course? To develop an overlap of academic and vocational standards for assessment or curriculum purposes? Because crosswalks are an organizational tool for the integration of curriculum and assessment across academic and vocational disciplines, it is important to consider how the crosswalk will be organized.

      The rows and columns of the crosswalk must then be determined based on its purpose and the standards that will be crosswalked. The next step involves determining the "parts" or "length" of standards to be included: Some standards are just too long to include concisely in the written format of a crosswalk. It is likely that standards will need to be condensed to fit in the format. The complete sets of standards included in the crosswalk should be listed in an Appendix as a reference for educators.

      The next step is the actual "crossing," or comparing, of the standards, one at a time. This step is usually best accomplished by a single person in a relatively uninterrupted session. This stage can be tedious and time consuming, but it is essential to the process. Once the standards are "crossed," the crosswalk must be validated: a committee or group of subject area specialists from each standard area reviews the crosswalk created by the individual. Changes from the validation process are then made, and the format is adjusted as necessary. The final stage is to decide how to implement the current crosswalk and develop a plan to crosswalk standards from other courses or programs of study. A minimal sketch of this matrix-style comparison appears below, followed by two examples of crosswalks from Kansas and Kentucky.
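
      The sketch below, in Python, illustrates the matrix-style comparison in miniature. The standards shown and the keyword-overlap "match" rule are placeholders; in practice the crossing is a human judgment, made one pair of standards at a time and then validated by a committee.

# Placeholder sketch of a crosswalk as a matrix of matches between one set of
# vocational standards and one set of academic standards. The standards listed
# and the keyword-overlap rule are illustrative only; real crosswalks rest on
# human judgment followed by committee validation.

vocational = {
    "4.1": "Analyze career paths within early childhood, education, and services",
    "4.4": "Demonstrate a safe and healthy learning environment for children",
}
academic = {
    "LA 12": "Analyze and communicate information in writing for a variety of audiences",
    "Science 9": "Investigate practices that promote a safe and healthy environment",
}

STOPWORDS = {"a", "and", "for", "in", "of", "the", "to", "within"}

def crosses(voc_text, acad_text):
    # Stand-in rule: flag any shared content word between the two statements.
    voc_words = set(voc_text.lower().split()) - STOPWORDS
    acad_words = set(acad_text.lower().split()) - STOPWORDS
    return bool(voc_words & acad_words)

crosswalk = {(v, a): crosses(vt, at)
             for v, vt in vocational.items()
             for a, at in academic.items()}

for (v, a), matched in sorted(crosswalk.items()):
    print(f"{v} x {a}: {'*' if matched else '-'}")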

      Crosswalking Academic and Vocational Standards in Kansas: Vocational educators in Kansas have started the crosswalk process in Family and Consumer Science (FCS). The relationship between each vocational standard within FCS is graphically displayed with each of the following academic areas: Language Arts, Math, Science, and Career Development Skills. For each broad standard there is a box summarizing the relationship of academic standards to individual standards, with a detailed summary of each standard to follow. The numbers in the boxes labeled Language Arts, Math, Science, and Career Development Skills refer to particular academic standards within that subject area. A summary of academic standards is also provided to vocational educators as an attachment.


Early Childhood, Education, and Services

      Standard 4.0: Integrate knowledge, skills, and practices required for careers in early childhood, education, and services.

[Crosswalk grids: one grid each for Language Arts (standards 1-14), Math (standards 1-6), Science (standards 1-12), and Career Development Skills (standards 1-12), with asterisks marking which academic standards relate to FCS content standards 4.1 through 4.6. Standard-by-standard detail for 4.1 through 4.4 follows.]

4.1 Analyze career paths within early childhood, education, and services.
      Language Arts Standards: 1, 2, 3, 4, 5, 6, 12, 13, 14
      Math Standards: 4
      Science Standards: (none listed)
      Career Development Skills: 2, 3, 4, 5, 12

4.2 Analyze developmentally appropriate practices to plan for early childhood education and services.
      Language Arts Standards: 12, 13, 14
      Math Standards: 1
      Science Standards: 1, 2, 3
      Career Development Skills: 2, 3, 4, 5, 12

4.3 Demonstrate integration of curriculum and instruction to meet children's developmental needs and interests.
      Language Arts Standards: 12, 13, 14
      Math Standards: 1
      Science Standards: (none listed)
      Career Development Skills: 1, 2, 3, 4, 5

4.4 Demonstrate a safe and healthy learning environment for children.
      Language Arts Standards: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 13, 14
      Math Standards: 1, 2
      Science Standards: 9, 11, 12
      Career Development Skills: 1, 2, 3, 4, 5, 11

      Crosswalking Academic and Vocational Standards to Vocational Courses in Kentucky: The Department of Education in Kentucky crosswalked the state-adopted academic standards, or Academic Expectations, to each vocational course. These course descriptions are concise (fitting on one page), easy to use, and effectively communicate clear expectations. Vocational courses are compiled in a document titled Vocational Education Program of Studies Implementation Manual: Course Models (June 1998). Below is an example of a course description. The numbers recorded in the Academic Expectations category refer to specific academic standards.

Introduction to Health Sciences

Course Description
      Introduction to Health Sciences is an orientation and foundation for occupations and functions across the health care cluster. The course includes broad health care core standards which specify the knowledge and skills that the vast majority of health care workers should have. The student will learn about the health care industry, health care economics, and the career opportunities available. Leadership development, employability skills, and medical terminology will be integrated throughout the course. All core content for Vocational Studies is included in this course.

Academic Expectations (State Standards): 1.1, 1.16, 2.14, 2.16, 2.17, 2.20, 2.27, 2.30, 2.33, 2.36, 2.38, 3.5, 4.2, and 6.1 (in the course model, each expectation is keyed to one of the content/process items below)

         Content/Process
        Students will...
  • Examine the factors that influence the health care industry.
  • Research the organizational structure of various health care facilities.
  • Identify how key systems affect services performed and the quality of health
    care.
  • Describe ethical practices with respect to cultural, social, and ethnic
    differences within the health care environment.
  • Recognize legal responsibilities, limitations, and the implications of actions within the health care delivery system.
  • Investigate medical/health milestones that have led to advances in health care.
  • Evaluate available community health systems, services, and resources available in the community and state.
  • Evaluate consumer products and services, and make effective consumer decisions.
  • Use appropriate technology to input, store, and retrieve information.
  • Use strategies for choosing and preparing for a career in the health care industry.
  • Demonstrate skills (e.g., interviewing, writing résumés, completing applications) that are needed to be accepted into college or other postsecondary training or to get a job.
  • Explore Maslow's Hierarchy of Needs.
  • Utilize effective self-management skills.
  • Recommend an acceptable Code of Conduct for health care workers.
  • Utilize activities of Health Occupations Students of America (HOSA) as an integral component of course content and leadership development.
  • Apply mathematics, science, and communication skills within the health sciences content.
  • Demonstrate the employability and social skills relevant to health careers.

Connections
National Health Care Core Skill Standards
Secretary's Commission on Achieving Necessary Skills (SCANS)

      Lessons Learned: Crosswalks are used for a variety of purposes. These purposes can be as simple as providing vocational teachers the information they need about their own vocational standards and how these fit with academic standards. Crosswalks can also be used to revise adopted standards. For example, in completing a crosswalk, educators may not be satisfied with the level of math embedded in their vocational course. By comparing vocational standards to expectations in academic disciplines, educators can return to their original standards and revise the sections that they feel need to be more rigorous. For state-level administrators, crosswalks can provide the basis for creating curriculum examples, course syllabi, or assessments based on standards that cross disciplines. While the use of crosswalks varies from state to state, most agree that they are a promising strategy through which to focus the efforts of educators and state administrators on standards and to increase the integration of academic and vocational curriculum and assessment systems.


References

Information about Kansas crosswalks provided by Craig Haugsness and Karmey Olson of the Kansas Department of Education (1999).

Kentucky Department of Education. (1998, June). Vocational education program of studies implementation manual: Course models. Frankfort: Author.




Technical Assistance in South Dakota


Using Limited Resources To Support Statewide Curriculum and Instructional Improvement

      Policy Rationale and Goals: South Dakota uses a variety of methods to inform and involve teachers. Because the state offices are small and have relatively few resources for staffing, South Dakota's Division of Workforce and Career Preparation in the Department of Education focuses on providing technical assistance rather than emphasizing its compliance role. Over the years, state staff members have developed a variety of methods to best utilize the limited staff time and to assist as many teachers and schools as possible.

      While technical assistance efforts are primarily designed to serve the needs of districts, schools, and teachers throughout the state, the Department of Education also strives to reinforce the expertise of teachers as a resource for local training and staff development for their colleagues. By providing "train the trainer" workshops, South Dakota staff members essentially expand their numbers, building a cadre of professionals who are also well-versed in providing technical assistance. In addition, state staff provide regional trainings on topic areas such as developing "crosswalks," which identify common skills and knowledge across subject areas.

      Other technical assistance strategies in South Dakota are the publication of resources for teachers detailing integrated projects and activities and the identification of model sites that exemplify promising instructional strategies such as using block scheduling or integrating academic and vocational technical courses.

      In all of these strategies, the desired outcome for state personnel is to facilitate the provision of technical assistance and the sharing of information across the state rather than serving as a sole source of information. In this way, the staff believes that greater numbers of individuals will get assistance and, moreover, that the professional ties among teachers will be strengthened.


Implementation Strategies

      Conducting Train the Trainer and Regional Workshops

      South Dakota has developed a Train-the-Trainer model for teachers and administrators in the state. The individuals participating in the training are encouraged to share what they have learned with others. Another goal of these sessions is to provide a forum for academic and technical teachers to collaborate, whether at the training session or back in their schools. One topic covered recently at a workshop was "Designing Courses for Quality Learning."

      Developing crosswalks is one of the first topics covered in regional trainings. Crosswalks are comparisons of at least two different sets of standards in two or more subject areas, such as a vocational and an academic area, so that integrated or common skills and knowledge can be more easily determined. Training sessions have also been conducted to provide all participants with information on the Secretary's Commission on Achieving Necessary Skills (SCANS) competencies and how those can be "crosswalked" to academic and vocational technical subject areas. While the use of crosswalks is currently voluntary, the state has considered making them mandatory. This would create a potential need for further training and dissemination. In addition to familiarizing teachers and administrators with the concept of crosswalks, these initial sessions have provided vital information to state staff as the state finalizes its decision on the use of crosswalks.

      Collecting, Publishing, and Disseminating Materials

      The South Dakota Division of Workforce and Career Preparation collects and publishes applied academic and integrated activities and projects that have been successfully implemented in classrooms in a volume entitled Education Activities for the Classroom. (Kucker, 1998a) The volume is distributed throughout the state by the Tech Prep/Career Guidance Initiatives Office. Ideas generated by the field in classrooms are synthesized using a standardized format that includes contact information from the submitting teacher. State staff members work directly with the sites submitting documents to assist them in meeting activity and publication standards. With the recent publication of the third edition of this resource, the number of items submitted and included in the publication continues to grow.

      By publishing this document, South Dakota hopes to encourage collaborative learning among teachers. Sharing ideas among colleagues strengthens professional bonds between teachers and provides teachers with activities that have already been tested in the classroom. The publication process also serves as a method of recognizing teachers who are using exciting or novel approaches in their classrooms; having an item selected is in itself an honor.

      A resource book for trainers called Putting the Pieces Together: South Dakota Integration Training Model serves as a resource for teachers interested in integrating academic and vocational technical education. (Kucker, 1998c) Topics range from curriculum alignment to standards to instructional strategies and assessment; a bibliography of resources is also included.

      With the emphasis in new legislation on involving parents in career education planning, South Dakota has made this one of its key foci in offering technical assistance. A training manual entitled Parents as Partners in Career Education has been compiled and workshops are held across the state. (Kucker, 1998b)

      Selecting Model Sites

      South Dakota selects demonstration sites to serve as resources for other schools to implement Tech Prep programs. Schools are selected as model sites using a number of considerations as guidelines: the size of the school; geographic location; and types of innovative practices, such as block scheduling, integrated curriculum and applied academics as a means of raising standards, business involvement, and work-based learning for students and teachers. From these considerations, a representative group is selected as model sites in the state.

      Lessons Learned: South Dakota is continuing to work on finding ways to best leverage its scarce personnel resources. The "train the trainer" model and the use of training guides will continue to play a large role in this effort. South Dakota is also building on its work with model sites. The eight original sites again served as model sites and a ninth site was added to the list. The model site program drew almost 450 participants during 1998. Schools frequently contact demonstration sites for help in implementing innovative practices, so the lessons learned by the demonstration sites are beginning to be widely shared across the state.

      Training sessions have grown and become more comprehensive over time. In addition, all state department staff collaborate so that the same terms and concepts are consistently used in trainings. Assessments conducted at the end of training sessions create the feedback needed to continually adjust and refine training sessions. Some of the training sessions are now being conducted regionally to allow teachers to meet and network with others from their area.

      One of the most exciting outcomes of this work has been a shift in the perceptions and attitudes of teachers toward teaching. According to Dale Eggebraaten in the Division of Workforce and Career Preparation, "educators are now desiring to be facilitators of learning rather than lecturers." Teachers are beginning to embrace the practice of constant education and are modifying their curricula and instructional practices based upon new research and information. This shift has raised the excitement level of all involved, from the state personnel to the teachers in the classrooms.


References

Kucker, M. (1998a). Education activities for the classroom. Pierre: South Dakota Division of Workforce and Career Preparation.

Kucker, M., Smith-Rockhold, G., Wiese, V., & Bernis, D. (Comps.). (1998b). Parents as partners in career education. Pierre: South Dakota Division of Workforce and Career Preparation.

Kucker, M., Smith-Rockhold, G., Wiese, V., & Bernis, D. (Comps.). (1998c). Putting the pieces together: South Dakota Integration Training Model. Pierre: South Dakota Division of Workforce and Career Preparation.

South Dakota Tech Prep, South Dakota Department of Education and Cultural Affairs web site. Available on-line: <http://www.state.sd.us/state/executive/deca/workforc/techprep/techprep.htm>.




Test Flights


Emerging Curriculum Strategies


Alaska: Crosswalking State Academic Standards to SCANS Skills

      In 1993, Alaska developed content standards in ten core subject areas. These areas include: English/language arts, mathematics, science, history, geography, government and citizenship, skills for a healthy life, arts, world languages, and technology. During 1994 and 1995, the Alaska State Board of Education adopted standards in the ten core areas to serve as voluntary guidelines for Alaska's schools. As part of its state-level technical assistance program to help schools and districts in the implementation of the state's standards and the development of integrated curriculum, the Alaska Department of Education staff are in the process of developing "curriculum crosswalks" between the state's academic standards and the employability skills identified by the Secretary's Commission on Achieving Necessary Skills (SCANS). These crosswalks will help educators identify where Alaska's standards match SCANS skills, providing the basis for developing both curriculum content and assessments that integrate subject-matter or content-specific skill standards with more general employability standards.


Arkansas: Course Content Frameworks for Technical, Academic, and Workplace Standards

      Over the last several years, Arkansas has been in the process of developing a taxonomy of workplace and academic skills, which has been used as the basis for detailed content standards in each of the state's vocational courses. Content standards are being developed for 167 courses in all of the state's vocational program areas. By providing detailed content standards that include technical, academic, and workplace standards, the "Curriculum Content Frameworks" give both academic and vocational instructors a clear road map of what is expected from students on a course-by-course basis. For example, Curriculum Content Frameworks for Family and Consumer Sciences Education have been developed in ten courses. One of the eleven units in the Food and Nutrition course is "Menu Planning." Content standards within each unit are organized on a grid under two broad headings: "Vocational and Technical Skills" and "Academic and Workplace Skills." To illustrate how the content standards are organized, the following excerpt reproduces two of the five standards within the Menu Planning unit (the numbers refer to specific Arkansas content standards):

VOCATIONAL AND TECHNICAL SKILLS (What the Student Should Be Able To Do) paired with ACADEMIC AND WORKPLACE SKILLS (What the Instruction Should Reinforce)

Knowledge: 8.3 (list) Considerations in planning appealing and nutritious menus
Application: 8.3.1 Plan menus for family meals.
Academic and workplace skills reinforced:
  • Foundation - Reading: Comprehends written information and applies it to a task [1.3.8]
  • Foundation - Science: Describes/Explains scientific principles related to human maintenance/management [1.4.14]
  • Foundation - Writing: Organizes information into an appropriate format [1.6.10]
  • Thinking - Knowing How To Learn: Applies new knowledge and skills to plan menus [4.3.1]

Knowledge: 8.5 (discuss) Efficient meal preparation
Application: 8.5.1 Plan a time and work schedule for the preparation of a meal.
Academic and workplace skills reinforced:
  • Foundation - Arithmetic/Mathematics: Applies computation skills to plan a time and work schedule [1.1.5]
  • Foundation - Listening: Evaluates oral information/presentation [1.2.2]
  • Foundation - Reading: Comprehends written information and applies it to a task [1.3.8]
  • Foundation - Speaking: Asks questions to clarify information [1.5.3]
  • Foundation - Writing: Organizes information into an appropriate format [1.6.10]
  • Thinking - Seeing Things in the Mind's Eye: Imagines the flow of work activities from narrative descriptions [4.6.1]


Maine: Mapping State Academic Standards to Vocational Courses and Programs

      In conjunction with initiating the identification of industry skill standards for vocational program areas in the state of Maine, the Department of Education's Workforce Education Team will be "mapping" the state's academic standards, or "Learning Results," to all workforce education and related programs. The goal of this process is to assist educators in identifying the academic skills that are or can be embedded in the content of their courses. The Workforce Education Team intends for the mapping process to be comprehensive, including not only workforce education and the Maine Pre-Apprenticeship Program but also all technology education courses, family and consumer sciences education courses, cooperative education programs, applied academics courses licensed by the Maine Tech Prep Consortium, and the Jobs for Maine's Graduates curriculum.


Nebraska: Sharing Innovative School-to-Work Strategies

      In 1997, Nebraska produced a document recognizing innovative activities taking place in School-to-Work programs. The strategies in the document reflect the efforts of schools and teachers to embed real-world relevance into the educational programs of all students. The document is divided into three sections: [1] school-based learning, [2] work-based learning, and [3] connecting activities. A contact for each activity is also provided. Under the school-based heading, activities are divided into assessment, courses/programs/units, career exploration, and curriculum design. Field trips, guest speakers, job shadowing, school-based enterprises, studios/laboratories, and work-site learning are described in the work-based learning section. Connecting activities include articulation, marketing and public relations, community service, and teacher internships/inservice. A form to submit innovative school-to-work strategies is also included at the end of the guide.




SYSTEM SUPPORTS



      System supports are the kinds of strategies that states use to ensure that standards, assessment, and curriculum all work together to support the improvement of student achievement. System supports may include data and performance management systems; incentives for students, teachers, and schools; and sanctions or interventions for low-performing schools and districts.

      All of these efforts are intended to support the system in ensuring that student performance improves over time.




The Illinois Performance Management Information System


Using Database Methods To Improve Accountability and Results

      Policy Rationale and Goals: In 1993, the Illinois State Board of Education (ISBE) responded to the Carl D. Perkins Act of 1990 requirement that states create performance measures for local vocational education programs by creating the Illinois Performance Management Information System (IPMIS). Because ISBE realized that creating a data system would require the buy-in of staff at the local level, IPMIS was designed with feedback from local administrators, educators, student services personnel, and parents. In addition, ISBE was careful to make the process of using data easier rather than creating more reporting tasks for already-burdened administrators. IPMIS uses existing secondary, postsecondary, and employment databases; sets performance measurement goals based on a model of continuous improvement; and avoids ranking individual programs against one another. IPMIS is also designed to be "user-friendly" and applies the latest computer technology to enhance the system.

      For the state, IPMIS functions as both an information and an accountability system, counting student enrollment for reimbursement purposes and allowing the state to track programs; it is not intended to be punitive, however. According to Mary Ann Marano of ISBE, the principal consultant for IPMIS, rather than comparing programs and schools against each other, IPMIS compares programs to standards that are defined with the help of local educators, administrators, student service staff members, and parents. If a particular program's scores do not meet state standards for two years in a row, the state steps in with technical assistance: "The IPMIS represents an 'agreement' between local educational agencies and the state office. It ... is one of many resources that should be used for planning, monitoring, evaluating and restructuring programs."

      Implementation Strategy: Each school year, ISBE personnel in the management information systems office "cull a big file that contains records for grades 9-12 and for community college students or completers." (Miguel and Marano, 1998, p.22) They also separate data for certain student cohorts that the state wants to evaluate, such as graduated seniors matched to a list of freshmen enrolled in the state's postsecondary institutions, military enrollment records, economically disadvantaged students, or those with limited English proficiency.
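
      The cohort-matching step can be pictured with the small Python sketch below. The record layout, identifiers, and match rule are invented for illustration; IPMIS itself draws on existing secondary, postsecondary, and employment databases rather than hand-built records like these.

# Invented illustration of matching a cohort of graduated seniors against
# postsecondary enrollment records, the kind of linkage described above.
# Field names and IDs are placeholders, not the actual IPMIS record layout.

graduated_seniors = [
    {"student_id": "S001", "region": "060"},
    {"student_id": "S002", "region": "060"},
    {"student_id": "S003", "region": "112"},
]
postsecondary_freshmen_ids = {"S001", "S003"}   # drawn from community college files

def match_cohort(seniors, postsecondary_ids):
    """Flag which graduates appear in the postsecondary enrollment file."""
    return [dict(senior, enrolled=(senior["student_id"] in postsecondary_ids))
            for senior in seniors]

matched = match_cohort(graduated_seniors, postsecondary_freshmen_ids)
rate = sum(record["enrolled"] for record in matched) / len(matched)
print(f"Postsecondary enrollment rate for this cohort: {rate:.0%}")   # 67%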

      After preparing the data at the state level, regional directors in the state's 60 Education for Employment offices and its 40 community colleges receive diskettes containing three years' worth of data for their district, statewide comparative summary data, and the software to access it. Regional offices have the option of running reports for the schools in their district or distributing the software to the schools.

      IPMIS was purposely designed to be as user-friendly as possible so that all potential users would feel comfortable with the new program quickly. Numerous reports and graphs can be created for each individual site with only a few clicks of a mouse, providing tailored information to administrators and instructors almost instantly. The reports that are generated through IPMIS allow local practitioners to see what is working and what is not.

      Evolution of Strategy: The success of IPMIS has been acknowledged at both the local and state levels. Teachers and administrators find the reports to be easy to use and conducive to planning strategies for program improvement. By using data to locate students after high school and at the postsecondary level, teachers and administrators have the opportunity to analyze what they need to do. Some results include sharper attention to career planning efforts, more focused staff development, and improved collaboration among vocational and academic instructors.

      For state-level administrators, other facts speak volumes about the success of IPMIS. Although it is a voluntary system tied to performance measures, people at the local level are using it. In fact, local-level administrators and instructors are devising new ways to link existing data -- such as reviewing secondary and postsecondary information at the same time -- to better track where students are going after high school. In addition, IPMIS is a flexible data management system that can easily be modified and will grow according to the state's needs. For instance, the state plans to use IPMIS to better coordinate services with other state agencies and for use in regional planning.

      Another clear sign of success is that the state is considering modifying how program performance is defined as a result of the data provided by IPMIS. According to Richard Miguel of ISBE's Office of Workforce Preparation, because IPMIS monitors and tracks achievement and improvement over time, Illinois is considering attaching additional accountability components to the system. Legislation is pending in the state to provide performance incentive funds for those programs that are improving. Clear performance measures have been tied to IPMIS and are being phased in over time, and ISBE plans to use these to determine the allocation of proposed performance incentive funds.

      Outcomes/Lessons Learned: Not surprisingly, one of the most important lessons learned through this process is one about data. Illinois found that while it is often difficult to collect accurate data, it is an even bigger challenge to get people to use data. According to Miguel, everyone's comfort level regarding how to use data was even lower than anticipated. In order to tackle this issue head on, Illinois has invited NCRVE into the state to conduct three training sessions for local users of data in an effort to expand their technical knowledge. Illinois has found that post-program data is particularly useful at the local level, saving programs the time and resources needed to collect this kind of data.

      Illinois also learned that the flexibility and adaptability of IPMIS was the key to opening a multitude of possibilities for its application, allowing the system to be useful to more people in more ways than would have been possible if it had been designed for one program or focus. By giving state officials, local administrators, and instructors the opportunity to use data to improve their own capacity, Illinois is finding that everyone is better able to meet the needs of students.


Reference

Miguel, R., & Marano, M. (1998, March). Windows on progress. Techniques, 22-23.




Georgia, North Carolina and South Carolina Incentive Programs


Using Incentives for Students, Teachers, and Schools to Improve Performance and Accountability

      Incentives for Students: Among other reforms centered on standards, accountability, and assessment, the Georgia Assembly in 1991 established a requirement that all students pass a new set of tests to receive a high school diploma.

The new Georgia High School Graduation Tests differ from the previously required Basic Skills Test in that they include not only the areas of reading, writing, and mathematics, but also social studies and science. Furthermore, the law requires that the new tests "include process and application skills as assessed in a range of academic content, and shall exceed minimum and essential skills by extending the assessments' range of difficulty." (Georgia Department of Education, 1998)

      According to the American Federation of Teachers (AFT) report Making Standards Matter 1997, Georgia is one of 13 states that will require students to pass exams based on 10th grade standards or higher. In fact, Georgia students will be assessed in the four core subjects on material covered in the 9th, 10th, and 11th-grade Quality Core Curriculum, which was revised during 1997. According to the AFT, Georgia is one of only 20 states that will have graduation exams linked to their standards and one of fewer than ten states that will require students to meet standards in all four core subjects.

      Incentives for Teachers: Reform efforts in North Carolina have focused on improving the quality of teachers by investing in recruitment and increasing teachers' salaries, improving the quality of teacher education training programs by requiring professional accreditation, strengthening licensing requirements, and offering mentoring programs for beginning teachers. In addition to these efforts, North Carolina launched an extensive incentive program for experienced teachers to receive advanced certification through the National Board for Professional Teaching Standards:

State legislation provides support to teachers seeking advanced certification offered through the National Board for Professional Teaching Standards (NBPTS) chaired by Governor James B. Hunt, Jr. For state-paid teachers with a clear license and a minimum of three years teaching experience in North Carolina, the State will:

      Since introducing these changes, "North Carolina has posted among the largest student achievement gains in mathematics and reading of any state in the nation, now scoring well above the national average in 4th grade reading and mathematics, although it entered the 1990s near the bottom of the state rankings." (Darling-Hammond, 1997, p.11) The state boasts the largest number of National Board Certified Teachers and is "home to 207" of them.

      Certainly many factors have contributed to this achievement gain, among them the number of board certified teachers and the even greater number of teachers who have gone through the rigorous national board certification process.

      Incentives for Schools: The South Carolina School Incentive Reward Program began in 1984 as part of the Education Improvement Act of 1984, which initiated a complex accountability system that included merit pay for teachers and principals. These two merit-pay elements are no longer part of South Carolina's program; the state concluded that it was too difficult to judge teacher merit and that the principal incentives were driven by "building a file that showed what they did, but did not prove quality." What did survive was the School Incentive Reward Program, which provides funds to schools with exceptional or improved student performance on two statewide assessments, one norm-referenced and the other criterion-referenced. In addition, districts receive an incentive reward if two-thirds of their schools qualify. The program has evolved to reflect changes in the state's education reform agenda, as enacted in the 1989 legislation, Target 2000, South Carolina's master plan for maintaining public education reform through the end of this century.

      To qualify for a reward, schools must post one-year growth or meet the standard for state improvement on the state's tests. State and nationally normed tests are used in grades 3 - 11 in reading, mathematics, language, writing, and science (not all in every grade). At least 98% of a school's students must be included, and schools qualify either through a formula that allocates them to one of four percentile ranges (95 or higher, 90-94, 26-89, or 6-25) or by posting gains at or above the 65th state percentile rank for three years, as sketched below.
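
      The Python sketch below restates that qualification logic. The percentile bands, the 65th-percentile rule, and the 98% inclusion requirement come from the description above; how the gain percentiles themselves are computed is not described here, so they are treated as inputs, and the function is an illustration rather than the state's actual procedure.

# Illustrative qualification check based on the description above; the gain
# percentiles are taken as given, and this is not South Carolina's actual code.

PERCENTILE_BANDS = [(95, 100), (90, 94), (26, 89), (6, 25)]

def qualifies_for_reward(gain_percentile, last_three_gain_percentiles,
                         pct_students_included):
    if pct_students_included < 98:                 # 98% of students must be included
        return False
    in_a_band = any(low <= gain_percentile <= high
                    for low, high in PERCENTILE_BANDS)
    sustained = all(p >= 65 for p in last_three_gain_percentiles)
    return in_a_band or sustained

print(qualifies_for_reward(91, [58, 62, 60], pct_students_included=99))  # True (90-94 band)
print(qualifies_for_reward(4, [70, 66, 65], pct_students_included=99))   # True (three years at or above 65)
print(qualifies_for_reward(91, [58, 62, 60], pct_students_included=95))  # False (too few students included)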

      In 1997 - 1998, 291 of 1,015 schools received rewards, allocated on a per-student basis, ranging from $2,800 to $72,400. District rewards are $2 per pupil. While the typical amount per school is relatively small, about $15,000-$25,000, schools are free to use the money for any "instructional program" enhancement chosen by a local School Improvement Council, with the exception of salary supplements or replacement of district funds: "The door is wide open, schools can purchase PE equipment, computers and software, furniture, decorative murals." Schools are also given recognition by the superintendent and a flag signifying their award. According to John Suber, a staff member in the South Carolina State Department of Education, the program has lasted because the money is given to the school as a community, which then decides how it will be spent to improve its students' educational program.


References

Darling-Hammond, L. (1997). Doing what matters most: Investing in quality teaching. Kutztown, PA: The National Commission on Teaching and America's Future.

Gandal, M. (1997). Making standards matter 1997: An annual fifty-state report on efforts to raise academic standards. Washington, DC: American Federation of Teachers.

Georgia Department of Education. (1998). Georgia high school graduation tests. Available on-line: <http://www.doe.k12.ga.us/sla/ret/ghsgtabout.html>.

North Carolina Department of Education. (1998). Financial and personnel services. Available on-line: <http://www.dpi.state.nc.us/nbpts/>.




School Sanctions in Texas, North Carolina, and Maryland


School Rating Standards, Assistance Teams, and Reconstitution

      Ensuring Equity in School and District Accountability Ratings: Efforts to improve educational accountability have been underway in Texas for many years. One of the most publicly visible aspects of the Texas Accountability System (TAS) is the Texas Assessment of Academic Skills (TAAS), which is given each spring to all 3rd through 10th grade students in the state. As part of the state's effort to hold students, teachers, and schools accountable for their performance, TAAS results are made available on the World Wide Web. Increasingly, TAAS results have become a part of the public dialogue about education reform, and widespread publishing and analysis of the results in newspapers and on television occur each spring.

      TAAS forms the basis for another publicly reported component of the state's accountability system, the Academic Excellence Indicator System (AEIS), which is used each year to evaluate and rate individual schools and districts. TAAS results are combined with dropout and attendance data to rate schools and districts as "Exemplary," "Recognized," "Academically Acceptable," or "Academically Unacceptable/Low-Performing." The rating system is used both as a basis for incentive rewards and to determine whether intervention or assistance from the state is necessary.

      In designing its rating system, Texas recognized that while it was important to encourage schools to strive for excellence, the state also had to build in a measure that would ensure equity for all students. To do this, the state has included an "equity index" in the AEIS rating system. Thus, schools and districts are rated on overall performance in addition to the performance of four categories of students. These "student groups" include African American, Hispanic, white, and economically disadvantaged students.

      For example, in order for a school to be rated as Exemplary, at least 90% of students must receive passing scores on the TAAS reading, writing, and mathematics exams. When calculated on a group-by-group basis, 90% of each student group must also pass the exams. The AEIS rating system further specifies that the dropout rate must be 1% or less and that the attendance rate must be at least 94%, both overall and for each student group. Schools that receive a Recognized rating must have at least an 80% passage rate, a dropout rate of 3.5% or less, and an attendance rate of at least 94%, again overall and for each student group. A district cannot be rated as Exemplary or Recognized if it has one or more Low-Performing campuses.

       "Academically Acceptable" schools must have a TAAS passage rate of at least 40%, a dropout rate of 6% or less, and an attendance rate of 94%. If the TAAS passage rate is below 40%, the dropout rate is above 6%, and the attendance is less than 94%, then a school is deemed "Academically Unacceptable/Low-Performing," and state intervention or assistance may occur.

      As is the case in other states, accountability in Texas has evolved over time. According to Jay Cummings, a Texas Education Agency official, elements of the accountability system that have been legislated from the state level are intended to "shore up potential barriers in the [educational] system." For instance, at first, "the onus was put on students" through TAAS and the requirement that students pass an exit exam to graduate. The AEIS rating system has added another layer of accountability directed at schools and districts. Currently, the state accountability focus is shifting to the preparation of teachers. According to Cummings, there is a "new State Board of Teacher Education which will be an independent board that will accredit [schools of education] based on the success of their own students."

      Monitoring Student Progress and Assigning State Assistance Teams: In North Carolina, education reform and accountability are guided by the "ABCs of Public Education" legislation passed during the 1996 session of the General Assembly: "The ABCs of Public Education is a comprehensive plan to improve the public schools in North Carolina through three goals of strong accountability; an emphasis on the basics and high educational standards; and on providing schools and school districts with as much local control over their work as possible." (Department of Public Instruction, 1997, p.3)

      In North Carolina, the "A" in ABCs stands for "Accountability." There are several components of accountability that the State Department of Public Instruction is in the process of implementing. The legislation directed that the ABCs would apply to all K-8 schools beginning in the 1996 - 1997 school year, with high schools following in the 1997 - 1998 school year. The accountability system is designed to set school-level standards and monitor student growth and performance, to provide the basis for recognition and rewards, and to "identify schools and districts with unacceptable performance and growth, and provide assistance and/or intervention when necessary." (Department of Public Instruction, 1997, p.9)

      Like the K-8 model, the high school accountability model that is currently being implemented requires that schools meet a state-determined "expected gain" standard. This standard is determined by the state and is a composite score based on the following:

      While the first two measures included in the high school model focus primarily on academic measures of student performance, North Carolina has recognized the important role of career-focused education in its accountability system. By including a year-to-year comparison of students who complete College Prep or Tech Prep courses of study, schools are encouraged to promote the importance of both options and are measured based on program completion. Thus, schools are encouraged to enroll students in programs that are more focused on the transition after high school to employment, occupational training, or college. The high school model allows students who pursue a vocational course of study the option of receiving employment training while in high school, but the model reinforces the need for all students to succeed academically because the school's rating is also dependent on academic test scores.

      In the 1998 - 1999 school year, other components may be added. These include a comprehensive test in reading and mathematics administered in 10th grade to measure growth since 8th grade, the school's passing rate on the high school 10th grade competency test, dropout rates, and any additional end-of-course (EOC) tests that may be mandated by the state. Based on "expected gain composite" scores, schools are designated as either meeting "Expected" or "Exemplary" growth standards, or they receive "No Recognition," meaning that the school did not meet its expected growth but was not low-performing.

      While the "expected gain" measure checks whether students are making sufficient progress, the state is also concerned that students perform at high levels on state assessments. Therefore, the state has also initiated a "performance composite," which measures whether sufficient numbers of students are performing at high levels (Level III or IV on the EOC tests). If a school did not meet its expected growth and the majority of its students are performing below Level III or IV on the tests, the school is designated "Low-Performing." These schools are required to inform the parents in their communities of the designation; in addition, a state assistance team may be assigned by the State Board of Education.

      State assistance team members "may be currently practicing teachers and staff, representatives of higher education, school administrators, retired educators, and others the State Board of Education considers appropriate." (Department of Public Instruction, 1997, p.29) The role of the state assistance team is to help low-performing schools evaluate the teaching and learning environment and to provide services that improve the educational program. The ABCs legislation sets out the expectations and functions of the state assistance teams, which include reviewing and investigating school operations and recommending improvements; evaluating school personnel semi-annually; collaborating with school staff, central offices, and the local board in the design, implementation, and monitoring of a school plan; making recommendations and reviewing progress as the school plan is implemented; and reporting to the local board of education, the community, and the State Board of Education on the school's progress.

      According to the Department of Public Instruction, members of the state assistance teams are rigorously screened and receive extensive training in the ABCs law and school improvement processes. In addition, team members who are evaluating teachers in low-performing schools must "have successfully completed the 24 hours of Teacher Performance Appraisal Training." (p.28)

      Belinda Black, a consultant in the Division of Accountability Services Reporting Section, acknowledges that there has been some "consternation" at the local level over the computation of scores among borderline schools that are judged as not qualifying for a reward under the "Expected" or "Exemplary" growth standards. The state has an appeal process in place and continues to refine its accountability measures. Despite concern at the local level, the results of the state's assistance team program look promising. Of the 15 schools that were assigned teams in the first year of the program, all except one "were exceeding their expectations, and the other one met" the expected growth composite score.

      State Intervention and Reconstitution-Eligible Schools: The Maryland School Performance Program (MSPP) is widely recognized as a comprehensive accountability system.

      In addition, MSPP mandates a system of state intervention for the state's lowest performing schools. MSPP indicators measure school, district, and state performance in relation to state standards on assessments, dropout rates, and attendance, and schools and districts are then judged against their own growth from year to year. In order to determine whether schools are measuring up, MSPP examines both absolute performance and progress. Maryland schools are then determined to be "Above Standards and Improving," "Above Standards and Not Improving," "Below Standards and Improving," or "Below Standards and Not Improving."

      Schools which are significantly below state standards and not improving become eligible for state intervention or reconstitution. However, decisions "about reconstitution are a last resort and based on the school's own history and circumstances, not school-by-school comparisons." (NASBE, 1998, p.38)

      Under regulations adopted by the State Board of Education in 1993, school reconstitution is defined as changing one or more of a school's administration, staff, organization, or instructional program. These regulations set forth both "the procedures for identifying schools in need of reconstitution and giving local school systems the opportunity to address the specific problems of identified schools. State intervention is a last step if the local reconstitution effort has not had the desired result of enabling the school to meet state standards or progress towards meeting the standards." (Maryland Department of Education)

      By January 15 of each year, the State Superintendent of Schools announces "Reconstitution-Eligible Schools." The school systems must respond by submitting a local school reconstitution proposal by March 15. The proposal is the basic framework for addressing the areas in which the school is declining or failing to show sustained progress. If the proposal is approved by the State Board, a transition plan with specific activities and deadlines is submitted by May 15. By the following January 15, schools must submit a "full scale, long term reconstitution plan."

      In 1995, two middle schools and one elementary school were named as "reconstitution-eligible." In 1996, an additional 37 schools were named, followed by 10 schools in 1997. On January 28, 1998, Maryland State Superintendent of Schools, Nancy S. Grasmick, issued a press release identifying the latest schools to be added to the list: "The nine schools named from Prince George's County include five middle schools and four elementary schools. The remaining twenty-nine schools are located in Baltimore City and include nine middle schools and twenty elementary schools. Today's announcement brings the total number of reconstitution-eligible schools in Maryland to ninety, seventy-nine of which are in Baltimore City." (Maryland Department of Education, January 28, 1998)

      The latest announcement in 1998 was based on 1996 and 1997 performance data and did not reflect recent changes in the management of the Baltimore City Schools System, a troubled urban system under state scrutiny since at least 1992. (Anderson and Lewis, 1997) According to Grasmick, "We will approach reconstitution with the newly named Baltimore City schools in a comprehensive way. We will ask Baltimore's Interim CEO Robert Schiller and the New Baltimore City Board of School Commissioners to craft an overarching management plan outlining how they will steer all of their reconstitution-eligible schools to success." Through state reconstitution, Maryland intends to emphasize accountability by holding all schools to the same standards for students, ensuring that schools and school systems receive local support and direction, and providing a "safety net" for Maryland families. The State Department of Education maintains a toll-free information line to answer questions from parents and school staff members regarding reconstitution.


References

Anderson, A. B. & A. C. Lewis. (1998). Accountability: Academic bankruptcy policy brief. Education Commission of the States. Available on-line: <http://www.ecs.org>.

Department of Public Instruction. Public Schools of North Carolina, State Board of Education. (1997, November). A guide to the ABCs for high schools.

Maryland Department of Education. (1998, January 28). State announces reconstitution eligible schools. Available on-line: <http://www.msde.state.md.us/sco/pressreleases/980128.html>.

Maryland Department of Education. (1998, November 12). School reconstitution: State intervention procedures for schools not progressing toward state standards, fact sheet 5. Available on-line: <http://www.msde.state.md.us/sco/factsheets/fact5.html>.

National Association of State Boards of Education (NASBE). (1998, October). Public accountability for student success, standards for education accountability (The Report of the NASBE Study Group on Education Accountability). Alexandria, VA: Author.




Florida, Pennsylvania, and Tennessee State Report Cards


Providing Statewide Access to School Data for All Stakeholders

      Policy Rationale And Goals: Over the past decade, many states have been actively engaged in disseminating educational data to the public. As policymakers, parents, and the public have demanded better performance from the education system, a wide range of state report cards have been put in place as mechanisms for public accountability. Often, the report cards form the basis for other state-level decisions including how sanctions or rewards might be meted out to schools, administrators, teachers, and students.

      In 1994, the Council of Chief State School Officers (CCSSO) began to conduct an annual survey of state departments of education that compiles information on the status of state education accountability reports and state reports on education indicators. According to the most recent annual report by the CCSSO, "states have taken a more active role in regularly reporting indicators of the status of public education, including results of student assessments, data on students and teachers, and school finance data." CCSSO reports that as of August 1998, all state education agencies but one had at least one annual accountability or indicator report, with forty states producing two or more reports. Most of the states that have mandated a report (47) publish statistics at the district level and a smaller number of states (39) require statistics to be reported at the school level.

      While public reporting of data has become an increasingly visible part of state accountability systems, a 1995 report by the Southern Regional Education Board (SREB) asks the question: "Is your state's education report card changing in a fundamental and important way?" In other words, does the state report card provide a mechanism for assisting schools in reaching state proficiency levels and for using data in the school improvement process, both of which are at the core of most state accountability reforms?

      According to SREB (1995), early state report cards emphasized educational "inputs" including such facts and figures as district and community characteristics, student characteristics, finance, and counts of teachers and other staff. Some early report cards also included student scores on standardized tests, performance on Advanced Placement examinations, and dropout rates. As state reforms that emphasize student performance on state adopted standards began to spread, state report cards started to report school-by-school comparisons of student performance to new state performance benchmarks. According to SREB, state report cards that emphasize student proficiency move away from emphasizing a school's constraints to detailing goals for all students, regardless of circumstances. Many states are engaged in improving state report cards, and Florida, Pennsylvania, and Tennessee provide three examples of easily accessible information systems for school sites, parents, and the public. Each of these examples is available on the World Wide Web.

      Florida (http://www.firn.edu/doe): According to the Florida Department of Education, "the Florida School Indicators Report is designed as a ready-reference guide providing data on every school for each of Florida's 67 school districts." School-by-school or district-by-district comparisons can be made for each of the indicators included on the report card. The data is compiled from the Department of Education's automated student, staff, and finance databases using school district reported data.

      The Florida School Indicators Report provides a wide range of data on schools in an easily accessible format. The indicators and comparisons are intended to provide the basis for further analysis at the local level.

      The Florida School Indicators Report includes the following information:

      Florida also mandates that the Department of Education and school districts produce a report known as the School Advisory Council Report (SACR). The SACR includes state, district, and school level data and covers test scores, dropout and graduation rates, information on school staff, attendance figures, and readiness to start school. The SACR must be kept on file at the school and/or district office. In addition, districts and schools must use the SACR data to distribute a School Public Accountability Report (SPAR) to parents of students. Florida also produces a School Accountability Report based partially on data in the SACR and adds information about school achievement, learning environment, and student characteristics, shown relative to state medians.

      Pennsylvania (http://www.paprofiles.org): According to the Pennsylvania Department of Education, the purpose of the School Profiles is to assist public school personnel and all citizens in evaluating a public school's qualities: "The School Profiles may provide valuable information for discussions aimed at making decisions to improve the quality of the public education system serving the children of Pennsylvania." While the profiles are intended to make information easily accessible, Pennsylvania also solicits feedback from "public school system customers" to ensure that the information in the profiles is useful. Accordingly, "many indicators used in our new school profile design resulted from customer feedback from school district/vocational-technical school representatives along with parents and business persons." In addition, the Pennsylvania Department of Education provides a feedback form and questionnaire on its web site that can be mailed or faxed back to the department.

      School Profiles are provided for primary, intermediate, middle, and high schools and are designed to provide tailored information at each of these program levels. School profiles can be found on the web site by selecting a county or a general area on a map of Pennsylvania, or by searching for a school by name.

      In addition to results from the Pennsylvania System of School Assessment, School Profiles include the following data:

      According to Vince Safran, an official in the Vocational Technical Education Division of the Pennsylvania Department of Education, the School Profiles are an invaluable tool for beginning to evaluate the kinds of technical assistance that may be needed at a particular school. The School Profiles are widely available and are mailed on CD-ROM to public libraries, schools, and other public agencies. Safran also distributes the School Profiles to participants during site visits to schools and area vocational technical schools participating in SREB's High Schools That Work network. The profiles provide a baseline of information that frames the discussion and the subsequent evaluation and technical assistance plan that the site generates.

      Tennessee (http://www.state.tn.us/education/rptcrd98): "The 21st Century Schools Report Cards provide parents, educators, local officials, business, and industry leaders and community members with an inside look at how their schools are performing and how state education funds are being spent. The report cards also provide an annual accounting to the public for progress being made toward state and national education goals."

      Since passing the Education Improvement Act, Tennessee has published seven sets of annual accountability reports. According to the Tennessee Department of Education, the accountability report cards are intended to inform parents and taxpayers of how much money is being spent on the system as well as the results on standardized achievement tests, student attendance rates, and promotion and dropout rates. Data included on the reports are collected from local school systems and compiled by various offices in the department. In addition, the 1998 Report Card Supplements "contain additional types of data, including test scores for individual schools, value-added assessment results showing student gains, school system audits conducted by the department's Internal Audit section, and school-by-school information on meeting class size standards."

      Indicators in the Tennessee 21st Century Schools Program include the following data:

      According to the Department of Education, the state report card system is not intended to "rate" or "rank" school systems, but to provide information about school performance that will initiate public dialogue about schools at the local level. In an effort to inform the public about how students of various backgrounds are doing, "Tennessee's 'Value-Added Assessment' tracks the gains students make on nationally normed tests over the course of their school careers." (Education Week, 1999, p.24) The value-added assessment system focuses on the effect of schooling over time and is intended to put students on a level playing field by comparing results from individual students as they progress through the state's school system.
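
      Tennessee's actual value-added system relies on a sophisticated statistical model, so the sketch below should be read only as an illustration of the underlying idea: follow the same students over time and examine their gains on a normed scale rather than a single year's absolute scores. The scores, years, and names in the sketch are hypothetical.

from statistics import mean

# Hypothetical scale scores on a nationally normed test, keyed by student and year.
scores = {
    "student_a": {1997: 510, 1998: 534},
    "student_b": {1997: 472, 1998: 503},
    "student_c": {1997: 545, 1998: 551},
}

def average_gain(score_history, from_year, to_year):
    # Mean year-to-year gain for students tested in both years.
    gains = [years[to_year] - years[from_year]
             for years in score_history.values()
             if from_year in years and to_year in years]
    return mean(gains) if gains else None

print(average_gain(scores, 1997, 1998))   # prints roughly 20.3 for these three students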

       "At the heart of the data reporting process is an emphasis on school improvement.... These reports enable all Tennesseans to know what their education tax dollars have achieved and to make important decisions about the teaching and learning process." The Tennessee accountability report card is also summarized at the state level to show more broadly what is being accomplished statewide and to keep track of progress made since the enactment of the Education Improvement Act.


References

Council of Chief State School Officers (CCSSO). (1998). State education accountability reports and indicator reports: Status of reports across the states (Results of a 50-state survey). Washington, DC: CCSSO, State Educational Assessment Center.

Gaines, G. F. (1995). Linking education report cards and local school improvement. Atlanta, GA: Southern Regional Education Board.

Quality Counts '98: Rewarding results, punishing failure. (1999, January 11). Education Week (Entire issue), 18(17).




Test Flights


Emerging Strategies for System Supports


Alabama: The A+ Foundation and Coalition for Better Education, Engaging Public Support

      Two organizations, the A+ Foundation and the A+ Coalition for Better Education, have formed to support education reform in Alabama. According to their Web site (http://www.aplusala.org),

       "Education reform is difficult even with legislative support, substantial state funding, effective communication, and strong business ties in place. Constant work and attention is required to affect education reform to broaden the scope of offerings and utilize effective, innovative approaches in preparing students to compete in an ever-changing society."

       The Foundation combines research and hands-on support in its efforts to improve education and also has formed several task forces. The Educational Leadership Task Force created the "Coalition for Innovation: A Network for Effective School Administrators," which includes approximately 17 school systems committed to continuous improvement. The Teaching and Student Achievement, Business-Education Partnerships, and the Education Research task forces also support various aspects of education reform. The second organization, the A+ Coalition for Better Education, describes itself as a grassroots, citizen-based advocacy organization dedicated to education reform in Alabama. The A+ Coalition was established seven years ago and has grown to "a base of thousands of members across the state who champion education reform."


Oregon: Seeds of Change, A Public Information Campaign

      An effort to inform Oregonians about the changes occurring in the state's education system is called "Seeds of Change: The Oregon Schools Initiative." When the initiative was launched in the summer of 1998, the state's legislature allocated $1,000,000 to help Oregonians understand the new expectations for students embedded in state-adopted standards and a new assessment system that is currently being phased in statewide. Professional services have been donated by a Portland advertising firm to produce radio and television advertisements and handbooks for parents, teachers, and administrators. Materials have been distributed in English and in Spanish, and the Department of Education is also reaching out to speakers of other languages so that all residents will be engaged in the school improvement process. One example from a flyer sets the tone for the initiative: "The new standards are definitely a different way of doing things, and excellence can be challenging. As parents, teachers, and concerned members of the community, we all need to be involved, but we all will be rewarded. Our smart, well-rounded students will be sought after by colleges and employers. They will be good citizens and exemplary leaders. Most important of all, we will have prepared our children, to the best of our ability, for life in the 21st century."


Rhode Island: Information Works -- Accountability Goes Public

      Information Works, Rhode Island's recently released report card and information system on school accountability in the state, provides the public with a wealth of information. Information Works is a volume containing school-by-school profiles of standardized test scores, student and staff attendance, graduation rates, and student discipline. In addition, Information Works has been posted on a web site for greater public access (http://infoworks.ride.uri.edu). Information Works is the result of a partnership between the Rhode Island Department of Education and the National Center on Public Education and Social Policy (NCPE) at the University of Rhode Island. NCPE administers annual surveys to parents, teachers, students, and administrators in all public schools in Rhode Island and helps the Department of Education produce the data and analysis on various school and district indicators used in the Information Works school profiles. NCPE also develops and maintains the Information Works web site under contract to the Department of Education.




QUALITY ASSURANCE



      Quality assurance is the aspect of standards-based accountability which ensures that educational programs as a whole are meeting standards for program quality. Examples of program quality criteria include setting standards for the qualifications of teaching staff, facilities and equipment, and ongoing professional development. Program quality criteria in these areas support accountability efforts by ensuring that all sites meet at least minimal levels of quality. In addition to setting minimum standards, other strategies such as industry-based program certification, state accreditation, and program monitoring are used by states to keep track of how programs are performing. Quality assurance strategies support programs or schools by providing a network of sites and colleagues as resources for continued improvement at the site level. In many cases, these quality assurance efforts are, or will be, tied to at least a part of program funding.




Ohio and Automotive Service Excellence (ASE) Certification


Tying State Funding to Industry-Based Program Quality Review

      Policy Rationale and Goals: To meet the need for more technically trained and computer-literate technicians in the increasingly sophisticated automotive repair industry, Ohio has combined its vocational education efforts with those of two nonprofit organizations from the automotive repair industry: ASE and NATEF. ASE is the National Institute for Automotive Service Excellence, a nonprofit organization established in 1972 to improve the quality of vehicle repair and service. At the heart of ASE efforts is a voluntary system of testing and certification of automotive repair technicians. The ASE program is complemented by the efforts of the National Automotive Technicians Education Foundation, or NATEF, a separate nonprofit foundation within ASE whose mission is to improve the quality of automotive technician training programs nationwide through voluntary program certification. NATEF offers four areas of certification: [1] Automobile, [2] Collision Repair and Refinishing, [3] Light/Medium Duty Compressed Natural Gas/Liquified Petroleum Gas, and [4] Medium/Heavy Truck. NATEF's responsibilities include oversight of the program evaluation process and making recommendations for ASE program certification. (NATEF Web site) In an effort to expand program certification nationwide, NATEF works with state departments of education. Over the years, a partnership with the Ohio Department of Education has resulted in strict program certification requirements that are tied to state funding of secondary and postsecondary automotive programs.

      Implementation Strategy: According to an official with the Ohio Department of Vocational and Adult Education, about nine years ago a new policy was put into place requiring that all automotive programs be certified by NATEF. Programs were given five years to comply with the policy. According to Carl Workman, a regional supervisor with the Ohio Department of Vocational and Adult Education, all programs currently must meet NATEF program certification requirements to receive state funding. Today, 100% of Ohio's 127 secondary Auto Tech programs are ASE certified, along with the state's full-time postsecondary programs.

      In addition, Ohio has added requirements above and beyond the NATEF conditions for certification. To receive state funding, all of the program's instructors must be ASE certified, and the program must be certified in two areas beyond the NATEF-required minimum. For instance, Automobile Training Programs must apply for certification in at least the following four areas: [1] brakes, [2] electrical/electronic systems, [3] engine performance, and [4] suspension and steering. In Ohio, these programs must also apply for certification in at least two of the following optional areas: automatic transmission and transaxle, engine repair, heating and air conditioning, or manual drive train/axles.
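
      The Ohio rule for Automobile programs, as described above, is essentially a set-membership check: apply in all four required areas and in at least two of the optional ones. The sketch below encodes that check; the function name and data layout are illustrative assumptions and are not part of NATEF's or the department's actual application process.

REQUIRED_AREAS = {"brakes", "electrical/electronic systems",
                  "engine performance", "suspension and steering"}
OPTIONAL_AREAS = {"automatic transmission and transaxle", "engine repair",
                  "heating and air conditioning", "manual drive train/axles"}

def meets_ohio_automobile_rule(areas_applied_for):
    # True only if every required area is covered and at least two optional areas are.
    applied = set(areas_applied_for)
    return REQUIRED_AREAS <= applied and len(applied & OPTIONAL_AREAS) >= 2

print(meets_ohio_automobile_rule([
    "brakes", "electrical/electronic systems", "engine performance",
    "suspension and steering", "engine repair", "heating and air conditioning",
]))   # prints True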

      The NATEF program certification process begins with an extensive self-evaluation performed by a team of training program instructors, administrators, and advisory committee members. The self-evaluation process allows these individuals to compare their program to national standards, to make improvements if necessary, and to prepare for the next step: NATEF review. This step consists of a review of the self-evaluation and a recommendation about whether the program qualifies for an "on site" evaluation. The on site evaluation is conducted by the Evaluation Team Leader, an educator certified by ASE and trained by NATEF. When a program is determined to be ready for a site visit, NATEF contacts the Ohio Department of Vocational and Adult Education to assign the Evaluation Team Leader.

      The last step is a recommendation for certification, which signals that the program has met industry requirements and is certified for a period of five years. Although programs are not reviewed during this five-year period, they must submit a report at the halfway point, two and a half years into the certification. This requirement serves to keep NATEF informed about the program, but is primarily for keeping programs on track for the review process that will occur after five years.

      Evolution of Strategy: Because automotive programs were given several years to comply with the department's new policy tying funding to program certification, the state's implementation strategy has changed little since the system was put in place. Because NATEF and ASE have received the sanction of industry and are well respected in the field, the Department fully respects their process and feels that industry-based support lends credibility to the process, which, in turn, "raises the bar" at the local level. According to an official from the Ohio Department of Vocational and Adult Education, "instructors wanted to improve programs," and though there was some initial "balking" about the new requirements, instructors eventually saw how the process could improve both instruction and the reputation of their programs among students. Auto Tech programs are now better able to attract students who are serious about their future in the automotive repair industry. In part due to the success of the NATEF program certification model, Ohio has endorsed other industry-led skill certification processes requiring programs to use standards from the printing industry (PIA) and the machine trades (NIMS). Building trades standards will be piloted this year.

      Outcomes: According to the NATEF Web site, "certification of an automotive training program brings with it program credibility, prestige, recognition, and overall program improvement." The process is intended to benefit schools, students, future employers, and the automotive service industry. There is little funding available at the state level for tracking students in Auto Tech programs and evaluating the implementation of NATEF program certification across the state. Instead, Ohio has chosen to focus the efforts of state level staff on technical assistance to local programs.

      In addition, NATEF reports that repair technicians will need a variety of skills to compete in the increasingly technical and technologically advanced repair industry. Traditional automotive skills such as a thorough knowledge of automotive systems and components, mechanical aptitude, and manual dexterity continue to be critical in this industry; however, with the introduction of computer-based systems and other technological advances, technicians also need to be able to use a computer, communicate well with others, and read and follow directions. Jobs in the automotive repair industry will be plentiful for those finishing training programs in high schools, vocational technical schools, or community colleges, especially graduates of programs that meet or exceed NATEF's industry-recognized, uniform standards of excellence. By tying state funding to industry-based program certification, Ohio demonstrates its financial commitment to and confidence in an industry-endorsed process, two critical components for developing a national industry-based skill standards system.


Reference

National Automotive Technicians Education Foundation (NATEF) web site. Available on-line: <http://www.natef.org/about/index.htm>.




Program Monitoring in South Dakota, Oklahoma, and Alabama


Improving Vocational Programs Through Self-Assessment, Quality Standards, and Industry Certification


Providing Regional Technical Assistance and Facilitating Self-Assessment of Programs

      Each year in South Dakota, one region within the state participates in the Program Improvement Process for secondary vocational technical education. This process is "designed to help establish 'where we are now, where we want to be, and how we are going to get there.'" Through completion of the "Self-Assessment of Quality Indicators" instrument, randomly selected site visits, and independent team reviews, vocational technical programs develop "Strategic Action Plans" that form the basis for further technical assistance from the state's Division of Workforce and Career Preparation (DWCP). These Strategic Action Plans address how programs plan to meet criteria and standards identified in the South Dakota State Model for vocational technical education.

      Within the selected region, a random sample of programs is notified that they will be visited by an "On-Site Planning Team" facilitated by a DWCP staff member. The site visit selection process takes into account large, medium, small, urban, and rural school districts. Programs not selected for a site visit complete an Independent Team Review locally.

      The first step in the process is the completion of the "Self-Assessment of Quality Indicators" instrument, developed using information from research organizations, national models, South Dakota instructors and administrators, and state-level staff. The self-assessment instrument includes criteria and standards in five broad categories: [1] curriculum and instruction planning, [2] management, [3] educational equity, [4] career guidance and counseling, and [5] vocational student organizations. The instrument also includes indicators and measures of quality vocational technical education which program instructors use as benchmarks to evaluate their own programs. Program instructors rate their own programs based on the degree to which they believe their program is addressing the criteria or standard using the following scale:

  1. Institutionalized: Program in place a year or more

  2. Functional: Program in place a year or less

  3. Early Implementation: Program is in the planning stages; implementation is just starting

  4. Planning: No implementation yet, in planning stages only

  5. Not Yet Considered

      During the self-evaluation, program instructors also complete columns indicating a time line, the persons responsible for completing the plan, and from whom they need technical assistance to complete the plan. Technical assistance may be provided by business/industry representatives, other community resources, student services personnel, advisory committee members, and state staff. Once the self-assessment is completed, programs are instructed to prioritize their plans and goals and indicate "high priority" plans for the first year.

      Programs visited by an On-Site Planning Team review the information submitted in the self-assessment and produce a draft "Strategic Action Plan," which identifies the activities necessary to complete the plan. The state facilitator then sends the draft plan to the program instructor for review and approval by the team. When the plan is approved, a copy is mailed to the school district or multidistrict administration. Programs not selected for a site visit conduct an Independent Team Review and must also submit a Strategic Action Plan to the state. All programs in the region then participate in a fall and spring inservice hosted by state staff, at which further planning and technical assistance is provided on a regional basis.


Monitoring Programs Through State-Approved Quality Standards

      As part of the state's vocational technical plan, each Area Vocational and Technical School (AVTS) in Oklahoma is accredited by the state every five years. In addition to the state accreditation process, Oklahoma has adopted a policy that programs will adhere to national industry standards where available.[3] Programs will have five years to be in compliance with national industry standards as they are adopted in state policy.

      The initial part of the state's current accreditation process consists of a comprehensive self-evaluation by program instructors using the state's Summary Evaluation form for their particular program area. While the evaluation forms for each program area differ, all program areas follow Oklahoma's "11 standards of quality program operations" that have been endorsed by the State Board of Vocational and Technical Education.

      After the self-evaluation, the Summary Evaluation form is completed by an external evaluation team consisting of the state program administrator, an industry advisor, and a teacher from another part of the state. Programs must meet all minimum standards to be in compliance. Those that do not are given 60 days to respond with a plan of improvement, and a time frame is set for making the adjustments. At that point, programs are reviewed a second time, "but only in the areas that are substandard," according to Ivan Armstrong, state program administrator for Trade and Industry programs. Programs receive interim reviews to help them prepare for the next cycle of program evaluation.

      The instruments used for both the self-evaluation and the external evaluation are divided into eleven sections based on the standards endorsed by the State Board of Vocational and Technical Education. Under each standard, questions that reflect "quality indicators" are grouped into two categories. Minimum quality standards set by State Board Rules and Regulations are designated as either "met" or "not met"; programs must meet all minimum standards in order to retain state certification and program funding, according to Armstrong. All other quality indicator questions are rated on a Likert scale of 1 to 5, depending on whether the program is exceeding, meeting, or falling below the standard. The table below provides a sample minimum "quality indicator" for each of Oklahoma's 11 standards of quality program operations, drawn from the "Summary Evaluation Questionnaire for Trade and Industrial Education."

Oklahoma Standards of Quality Program Operations

Standard 1: Instructional Planning and Organization
Is instruction directed toward appropriate and clearly formulated objectives with input from partnerships such as community, business and industry, and local administration?

Standard 2: Instructional Materials Utilization
Have appropriate funds been budgeted and utilized for the purchase of a variety of quality instructional materials and equipment?

Standard 3: Qualified Instructional Personnel
Has the instructor developed and utilized methods to ensure that counselors and administrators are familiar with the goals, objectives, activities, and prerequisites for enrollment in the program?

Standard 4: Enrollment and Student/Teacher Ratio
Are enrollment and class sizes in compliance with the State Board of Vocational and Technical Education guidelines?

Standard 5: Equipment and Supplies
Is an established budget/funds equal to or above the incentive/formula monies designated for the program being used to purchase equipment and supplies that are representative of those used in business and industry?

Standard 6: Instructional Facilities
Are facilities barrier-free to accommodate students with disabilities?

Standard 7: Safety and Training Practices
Is safety instruction planned, presented, demonstrated, and practiced by the instructor in instructional and laboratory activities?

Standard 8: Program Advisory Committee and Community Relations
Does the advisory committee include appropriate representation from business and industry, with a majority of its members being practicing technicians and others being supervisors/managers from local businesses and school administrators?

Standard 9: Leadership Development/Vocational Student Organization
Are VICA curriculum and related activities integrated into the instructional program to ensure a balance of the primary program objectives?

Standard 10: Coordination Activities
Is appropriate documentation maintained to indicate that the teacher/coordinator is actively involved with each work-site experience?

Standard 11: Vocational Student Accounting and Reports
Are student enrollment, placement, follow-up, divisional, and VICA reports correctly completed, maintained, and submitted by the due dates in accordance with state and federal requirements?
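
      A small sketch can make the two categories of indicators concrete: minimum standards are pass/fail, while the remaining quality indicators carry a 1-5 score. The record layout and field names below are illustrative assumptions that follow the description above rather than the actual Summary Evaluation form.

# Illustrative results for one program's evaluation; field names are hypothetical.
evaluation = {
    "minimum_standards": {              # State Board minimums: met (True) or not met (False)
        "enrollment_within_guidelines": True,
        "facilities_barrier_free": True,
        "safety_instruction_practiced": True,
    },
    "quality_indicators": {             # other indicators, rated on a Likert scale of 1-5
        "advisory_committee_representation": 4,
        "instructional_materials_budget": 3,
        "work_site_coordination": 5,
    },
}

def in_compliance(evaluation):
    # A program retains certification and funding only if every minimum standard is met.
    return all(evaluation["minimum_standards"].values())

def average_quality_score(evaluation):
    scores = list(evaluation["quality_indicators"].values())
    return sum(scores) / len(scores)

print(in_compliance(evaluation))          # prints True
print(average_quality_score(evaluation))  # prints 4.0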

      According to Armstrong, Oklahoma's program evaluation process helps to ensure that all students receive instruction in the kinds of skills that will lead to employment opportunities. Through Standard 9, the requirement that all programs develop leadership and offer students the opportunity for extensive involvement in vocational student organizations, Oklahoma students have the option of gaining the "number one thing that industry wants: a positive attitude, good work habits, general employability."

      To meet the broad national perspectives of industry, beginning in 1990 the Oklahoma Department of Vocational and Technical Education established a policy that national organizations would provide program certification whenever possible. The policy established that Oklahoma programs would meet national standards and obtain certification to maintain state funding and accreditation for their programs. Examples of national certification programs in use in Oklahoma include standards from the automotive (NATEF), construction (Laborers-AGC), printing (PIA), and welding (AWS) industries. Oklahoma accepts the portions of national certification that are similar to its own standards and evaluates only those areas that are not included in the national certification program. Health programs also have specialized accrediting agencies which the State Board accepts in lieu of standards. In addition, the Oklahoma Department of Vocational and Technical Education is in the process of "crosswalking" vocational and technical competencies to state-adopted academic standards so that "students can obtain academic credit, where appropriate, for their work in vocational programs."


Requiring Business and Industry Certification Through State and National Industry Standards

      Beginning in the year 2003, all career/technical education programs in Alabama will be required to be certified to industry standards. To meet this requirement, Alabama has chosen five national certification programs for industry certification: [1] automotive, [2] construction, [3] drafting, [4] metalworking, and [5] printing.[4] All other career/technical programs will be certified according to business and industry certification standards written at the state level. According to Nancy Beggs, Acting Director of the Office of Career/Technical Education, Alabama's industry standards have been written using the NATEF program standards as a model. To assist in the effort to certify all career/technical education programs, the state legislature passed a $20 million capital improvement bond supporting the purchase of equipment.

      In order to have all programs meet industry standards by 2003, the Alabama Office of Career/Technical Education in the Department of Education has developed procedures that each district or Local Education Agency (LEA) must follow. According to these procedures, "once a program is awarded business/industry certification, it will remain a `certified program' for five years." Following are the required procedures:

  1. Plan Development: Each LEA will draft a five-year Plan for Program Certification to Industry Standards so that all programs in its jurisdiction will be certified in five years. A minimum of 20% of its programs must be certified annually.

  2. Staff Development: All teachers must receive staff development on industry certification prior to the self-evaluation and documentation stages. Teachers may participate through a local inservice or state-sponsored staff development.

  3. Documentation: Documentation must be collected and clearly organized for each program standard identified in the Career/Technical Education Business Industry Certification document.

  4. Self-Evaluation: Through a committee of school administrators, faculty, and local advisory committee members, programs compare themselves to the certification criteria. Once completed, the program is scheduled for an on-site review.

  5. On-Site Review: The on-site review team consists of a career/technical education administrator/specialist from the state Department of Education who is the team leader, business and industry representatives, and outside education representatives selected by the LEA. Based on the on-site evaluation, the team will make a recommendation for industry certification. If a program is denied certification, it will be given an explanation of what must be done in order to comply.

      In addition, for a program to be certified, teachers "must possess knowledge and skills as prescribed by industry standards and the department." Teachers must have a valid teaching certificate based on a bachelor's degree or higher in the teaching field or will be required to "pass an industry certification test as determined by the department." Though it is widely acknowledged that this requirement and industry certification in general will have a positive effect on career/technical education in Alabama, according to Beggs, it has caused some anxiety among teachers as the programs and their teaching qualifications come under scrutiny. To help ease this anxiety, the state has bolstered technical assistance services and has made a commitment to provide ongoing professional development in the industry skill certification process for teachers throughout the state.


References

Alabama Department of Education. (1998, June 12). Procedures, career/technical education business/industry certification. Montgomery: Author.

South Dakota Department of Education and Cultural Affairs, Division of Workforce and Career Preparation. (1998). Western region -- 1998-99 program improvement process: Self-assessment of quality indicators instrument. Pierre: Author.




High Schools That Work In Delaware


Using Data To Improve Academic Achievement in Vocational Programs

      Policy Rationale and Goals: The Southern Regional Education Board (SREB) was founded in the late 1940s to address the gap in academic performance between the South and other regions of the country. Today, it is a network of state and local policymakers, administrators and educators in sixteen member states, mostly located in the southern United States.[5] SREB's mission is to help individuals in government and education work together to improve educational settings and opportunities.

      In addition to its contribution to policymaking at the state level, SREB works with educators and administrators at the local level by supporting a consortium of area vocational schools and comprehensive high schools that has grown to include schools in 22 states. The network of schools is known as High Schools That Work (HSTW), and participating schools agree to focus on improving the academic and technical achievement of students by adopting the HSTW Key Practices for Improving Student Learning, developing an action plan, and participating in data-driven technical assistance provided by SREB.

      The state of Delaware provides one example of how a commitment to HSTW Key Practices and data-driven continuous improvement has given new life to area vocational technical schools. Five of the 29 public high schools in Delaware participate in the HSTW network. These high schools were originally county vocational high schools that provided technical training services to all students in the region. In the 1980s, declining enrollment led these schools to the decision to become comprehensive full-time high schools while continuing to provide vocational studies. As a group, the five schools committed to the HSTW model as their reform strategy. Today, "Delaware has implemented school choice statewide," according to Lewis Atkinson of the state's Vocational Technical Division. Because Delaware allows students to choose to attend schools outside of their district, schools "try to differentiate themselves." Despite the perception that HSTW is "really seen as a vocational school reform," the five HSTW schools must now turn away one of every two students who apply. Because of the importance of academics at the HSTW schools, students now realize that they can receive a quality academic education, training in a field of interest, and contextual learning that will help them make plans for the future.

      HSTW goals are to:



      HSTW Key Practices for Improving Student Learning include:[6]

  1. Setting high expectations and getting students to meet them.

  2. Teaching challenging vocational studies by emphasizing the use of academic content in the context of modern workplace practices.

  3. Increasing access to academic studies that teach the essential concepts from college preparatory curriculum.

  4. Having students complete a challenging program with an upgraded academic core and a career major.

  5. Giving students access to a structured system that integrates school- and work-based learning.

  6. Having teachers work together.

  7. Actively engaging students in learning.

  8. Involving students and their parents in a guidance and advisement system that supports the completion of an accelerated program of study.

  9. Providing a system of extra help and time to help students meet higher academic and technical standards.

  10. Continuously using student assessment and program evaluation data to improve programs.

      Implementation Strategy: In order to participate in the HSTW network, schools must agree to the Key Practices and a set of high expectations set forth by SREB. Schools must also commit to at least five years of school reform to replace the general education track with a curriculum that blends modern vocational studies with college preparatory courses. HSTW believes that five years is the minimum amount of time that is needed to fully implement deep changes in the curriculum, instruction, and administration of the school. Five years also gives schools the time to use data and to see progress in the achievement of their students.

      Through HSTW membership, participating schools receive a variety of services from SREB, including assistance in the development of a site action plan, a recommended curriculum that includes more rigorous academic and technical classes, staff development to implement the Key Practices, technical assistance in evaluation, and an evaluation process to measure changes in student achievement. These services are all intended to sustain each school's effort and reflect the HSTW belief that while all schools have areas to improve, the aim is to create the right conditions for student performance.

      SREB requires all participating HSTW schools to evaluate their students' academic achievement systematically and rigorously. Students are evaluated through multiple assessments, including the National Assessment of Educational Progress (NAEP) tests; surveys of student perceptions; follow-up surveys completed one year after students graduate; transcript studies; surveys of secondary teachers, administrators, and counselors; and site visits.

      Normally, districts are given the option of testing a representative sample of students using a modified NAEP exam developed for SREB. Last year, the state of Delaware decided to pay the costs of testing all vocational completers and now "has data for the complete universe," according to Atkinson, which will provide a wealth of information at the local and state levels. The student and teacher surveys provide the context for student achievement and can often point school staff in other directions: "The student survey is a wonderful resource" for finding out how "students feel about their classrooms."

      HSTW schools also agree to a site visit from a team of visitors every three years. These "technical assistance (TA) visits" are intended "to help school leaders and teachers identify changes needed to achieve the HSTW program goal: improved achievement of career-bound students through blending high-level academic and technical studies" (Bottoms, 1998). Site visit teams include a team leader who is an SREB staff member or a state official, a teacher or a staff member from a different site within the state, a representative of postsecondary education, a business or industry representative, and a community member. Prior to the site visit, the school conducts a self-assessment, which is shared with the team.

      The site visits last two-and-a-half days and involve a range of activities, from interviews to classroom observations to focus groups. On Day 1, team members receive an orientation, and the school's HSTW Advisory or Site team briefs the TA team on the site's accomplishments, next big steps, and major challenges. On Day 2, the TA team starts the day at 7:30 am with an organizational meeting and then begins classroom observations; in the afternoon, the team interviews students, the principal, a superintendent, and the director of vocational education at the site. A random selection of teachers and counselors is also interviewed. In the evening, the TA team discusses what it has seen and begins to prepare a draft report. On Day 3, the TA team meets in the morning to discuss the final report, and an exit conference is held with the superintendent and site leaders to debrief about the visit and the TA team's recommendations. About a month later, the TA team submits a final report to the site discussing what is being done well, what needs improvement, and suggestions for implementation.

      When asked about the impact of SREB support for the state of Delaware and its HSTW schools, Atkinson responded, "the network helps with the implementation of action plans and the site visits are very valuable to the schools. The visits were originally viewed as an `evaluation' but are now seen as staff development." Even though teachers are sometimes frustrated that "this work is never done and there are always new challenges," the network of sites is a resource, and colleagues from other states have become a valuable source of ideas to help teachers continue to strive for improvement.

      Outcomes/Lessons Learned: HSTW began with 28 pilot sites in 13 states and has grown to include over 800 sites in 22 states. For Delaware, HSTW gives teachers the opportunity to visit other schools and see how schools in other states are approaching reform. According to Atkinson, "teachers are beginning to see a reward for their efforts." The recent "jump" in mathematics scores is beginning to convince teachers that improved instructional strategies are having an impact on student learning and that students can "no longer hide in low level math courses." The next steps are to improve science and English achievement in HSTW schools.

      As a result of the emphasis on data collection and use, SREB is widely acknowledged to have one of the most comprehensive databases on vocational education in the country. Strategies that have been shown to lead to improved student achievement include practical changes such as having students spend an hour or more daily on homework. Capturing and documenting the effects of different practices has helped to improve the program not only for the participating schools but also for the broader vocational education audience. HSTW is also one of the few programs with long-term data on its successes; these data have allowed SREB and its partners to identify practices that make a real difference and to document that difference.


References

Bottoms, G. (1998). Guide for High Schools That Work technical assistance visits: January 1998-August 1999. Atlanta: Southern Regional Education Board.

Rahn, M. L. (1995). Emerging national influences in education policy. Doctoral dissertation, University of California, Berkeley.

Southern Regional Education Board (SREB) web site. About High Schools That Work. Available on-line: <http://www.sreb.org/Programs/hstw/about//high_abt.html>.

SREB web site. Overview of key practices. Available on-line: <http://www.sreb.org/Programs/hstw/about/Brochure/brochure.html#keypractices>.

SREB web site. Overview of SREB. Available on-line: <http://www.sreb.org/Main/AboutSREB/about.html>.




Indiana's Performance Based Accreditation System


Accrediting Schools Based on Educational Performance

      Policy Rationale and Goals: Prior to 1988, Indiana, like most states, accredited its schools based on a series of legal standards or "inputs" to educational programs. Educational inputs, such as the availability of textbooks, curriculum offerings, and the certification of professional staff, created minimum standards for schools and districts; compliance with these inputs took precedence over analyzing student performance or encouraging school personnel to take the lead in the school improvement process. As stated in the Overview of PBA:

While it is important that basic resources, personnel, programs, and safety standards exist to support a quality education program, simply having the resources available is not enough. An effective accreditation system must also consider how well available resources are used to reach the ultimate goal of the educational system--the development of well-educated, knowledgeable, thinking students.

      In 1987, the Indiana General Assembly enacted legislation creating a performance-based accreditation system that focused attention on the results, or performance, of the educational system. Based on this legislation, the State Board of Education established rules governing the implementation of the Performance-Based Accreditation (PBA) system.

      According to Mary Mickelson, PBA Director for the Indiana Department of Education, "PBA went on-line in the 1988-89 school year." PBA is the only accreditation system authorized by the Indiana General Assembly and all public schools must participate. There are approximately 2,200 schools accredited through the system, including all of the state's public elementary, middle, and high schools; area vocational centers; and about 400 non-public schools that voluntarily participate in the accreditation system.



        Legal Standards

  • health and safety
  • minimum time for school activities
  • staff-student ratios
  • curriculum requirements
  • staff evaluation plan
  • beginning teacher internship program
  • school improvement plan

        School Improvement Plan

  • administrative and instructional leadership
  • curriculum
  • instruction
  • monitoring of student progress
  • program evaluation
  • professional development
  • evaluation of school personnel
  • school climate
  • parent and community involvement

        Expected Performance Levels

  • Indiana Statewide Testing for Educational Progress (ISTEP+) total battery scores
  • language arts proficiency scores
  • mathematics proficiency scores
  • attendance rates
  • graduation rates for high schools

      Recommendations for accreditation are based on the three areas outlined above: legal standards, the school improvement plan, and expected performance levels.

      The State Department of Education makes recommendations to the State Board of Education for an accreditation period of up to five years. If a school receives probationary accreditation, there are usually "major problems with the school program," according to Mickelson. Schools can remain on probationary accreditation for up to three years; after that, the school district itself is placed on probationary accreditation. At the end of a total of four years, the department may provide recommendations to the General Assembly regarding the operation and administration of the school. This scenario would be extreme, however, and has come close to occurring only once; in that case, the district decided to close the school.

      Implementation Strategy: The Indiana Department of Education implements PBA through a twelve-person professional staff, each of whom is responsible for approximately 40-45 schools participating in the accreditation process. Schools are on a five-year accreditation cycle, so roughly 20% of schools are participating at any one time. As stated in the Overview of PBA, staff in the department support the following mission statement:

Performance-Based Accreditation (PBA) assists school communities in their efforts to be accountable for continuous improvement of learning environments in which all students develop their potentials for a lifetime of intellectual, social, and vocational achievements.

      PBA has three basic purposes:

  1. To provide a mechanism to verify compliance with education laws and rules.

  2. To integrate a procedure for examining student achievement and school success in educating students in the accreditation process.

  3. To provide a planning, improvement, and program evaluation model for schools that emphasizes the self-study process.

      The measure of student progress used for PBA is the Indiana Statewide Testing for Educational Progress (ISTEP+), an assessment system that has been in place since 1987. Indiana students are required to take English and math assessments based on broad state-defined "proficiencies" which measure more specific "essential skills." While the testing system has remained largely unchanged since it was first instituted, and the state maintains a contract with the same provider, an "applied math area" and "process writing" were added to the system in 1996. In addition, items are added that reflect revisions to state proficiencies and essential skills.

      Because PBA combines a regulatory compliance mechanism with analysis of student performance and a site-based planning and improvement process, department staff have come to realize that it is the school improvement process that "drives change at schools," according to Mickelson. From the state's perspective, incorporating the self-study process into the accreditation system is the "major change" that has resulted from this system in Indiana.

      To ensure that the PBA Division provides the kind of support needed at the site level, some state department staff members are now geographically dispersed and are assigned to work with schools in a region. Department staff spend most of their time in schools, and through this close contact with school personnel, regional differences can be taken into consideration and incorporated in the self-study process.

      Evolution of Strategy: According to Mickelson, the self-study process has changed how the department works with local educators and administrators. PBA now has an 11-year track record, and the department has learned from the past: "Four years ago, the department had gotten very prescriptive, even to the point of telling school sites to `use these words.'" After protest at the local level, the State Board of Education amended the rules related to school improvement planning. This change had the effect of lessening the regulatory and compliance aspect of accreditation, resulting in "some schools [doing] as little as required." However, "others are excited about trying new things," and the staff at the department has "re-tooled" to think of PBA as a means to encourage schools to embrace the self-study process as valuable in and of itself in addition to the accreditation function. Eleven years ago, they began with "here's the model, follow it, and get it in on time." Now they guide schools to ask "How do we get from where we are to where we want to be?"

      In addition, the department allows the school to choose "the driver" of change. The North Central Association has a self-study process that blends well with the state's requirements. Some schools use models such as High Schools That Work, the Coalition of Essential Schools, or the Baldrige self-study process.

      Results/Lessons Learned: For more than one hundred years, schools in Indiana have been accredited. "For schools that do empower themselves, [PBA] has become a significant school process because the local level" buys into the process. In addition, Mickelson asserts, PBA is "multi-faceted and not legalistic" and evolves to meet changes in the educational system that occur at the state level. For instance, new licensing standards are currently being piloted and will be required by the year 2001. As these changes are implemented, they will be incorporated in the process. In addition, the public has recently raised concerns over the "expected performance standard" that is integral to PBA. Some Indiana residents would like to see an "absolute performance standard," and it is possible that how schools are ranked and evaluated against a standard of "expected" performance may change in the future.


Reference

Overview of PBA. Available on-line: <http://www.doe.state.in.us/pba/descrip.html>.




Test Flights


Emerging Strategies for Quality Assurance


Missouri: K-12 Performance-Based Accreditation

      Missouri is in the second year of the Missouri School Improvement Program, a five-year accreditation process for all K-12 school districts in the state. Each year, over 100 school districts receive an in-depth review covering resources (such as facilities), instructional processes, and district performance. As part of the accreditation process, a site visit to the district of approximately two to five days is also conducted. Site visit teams include equal numbers of state department staff and educators from the field. The reports developed as part of this process are reviewed by the department's School Improvement Committee, and a summary of the report and the committee's recommendations for accreditation are presented to the State Board of Education for approval. Districts must also submit a School Improvement Plan addressing the concerns identified by the committee in its review report. Staff members from the Division of School Services provide technical assistance to district personnel by phone and through training sessions held throughout the state. In addition, a variety of materials about the Missouri School Improvement Program have been developed to help districts through the accreditation process.


North Carolina: Educational Excellence Through the Malcolm Baldrige National Quality Award System

      Through the North Carolina Quality Leadership Awards program, six school systems have volunteered to participate in a pilot program based on the Malcolm Baldrige National Quality Award. The award recognizes quality improvement efforts in the private sector and is presented annually by the President of the United States in a ceremony in Washington, DC, to U.S. manufacturing and service companies and small businesses. Using the guidelines of the Malcolm Baldrige National Quality Award system, North Carolina has tailored the award criteria to "educational organizations" by developing Education Performance Excellence Criteria. School systems in the North Carolina pilot program have made a commitment to improving performance through integrating a model of continuous quality improvement into the fabric and foundation of school and district operations. According to a 1998 guide to North Carolina Quality Leadership, the Education Performance Excellence Criteria have two results-oriented goals: [1] delivering value to customers in dynamic and changeable conditions and [2] building organizational capacity for continuous improvement. Rather than mandating that school systems participate, the pilot program relies on voluntary commitment and encourages members of these systems to be introspective as they evaluate themselves. It is hoped that this type of self-assessment will lead to a cycle of continuous improvement and, through that, educational excellence.


Pennsylvania: Providing Incentives Through Performance-Based Funding

      In 1997 - 1998, Pennsylvania provided funding to support school performance awards based on school improvement in student achievement and/or effort. For the 1998 awards, improvement in student achievement was determined using results from the Pennsylvania System of School Assessment (PSSA) reading and math scores. Improvement in effort is based on the student attendance rate for all schools. All Pennsylvania Area Vocational-Technical Schools (AVTSs) that have Department of Education approved vocational-technical education programs are included in the school incentives program. For AVTSs, the effort component of the program is the same as for all public schools in the state. However, the achievement component for vocational schools was modified to more clearly address the mission of AVTSs. For the years 1997 - 1998 through 1999 - 2000, the rate of students employed in positions related to their training will be used. To be eligible for an achievement award, Pennsylvania schools must meet the guidelines for improvement set by the state. Beginning in the year 2000 - 2001, the achievement component will be based on the level of competency as defined by scores on a state-approved occupational competency measure. By that time, the state plans to have had two years of occupational competency testing completed. Cutoff scores for schools will be set according to the requirements of the Statewide System for Core Performance Measures and Standards. To support the requirements of the federally mandated Statewide System for Core Performance Measures and Standards, in September of 1996, the Pennsylvania State Board of Education approved the use of the National Occupational Competency Testing Institute's (NOCTI) Job Ready/Student Assessment or other Pennsylvania Department of Education (PDE)-approved standardized tests that are recognized by industry groups or associations that employ AVTS graduates. Currently, 60 Job Ready/Student Assessment instruments are available, covering at least 85% of students enrolled in PDE-approved vocational-technical programs. The state is also considering the use of the Pennsylvania Nurse Aide Competency Evaluation Program Test, the National Automotive Technician Education Foundation (NATEF) End of Program Evaluation, the Air-conditioning and Refrigeration Institute Competency Examination, and the Pennsylvania Cosmetology Examination.




POLICY LINKAGES

      This section provides examples of state reform efforts that have encouraged collaboration among state-level agencies and advanced cooperation and partnerships among local and state practitioners, community members, parents, and employers to support the academic achievement of all students.




Wisconsin's Secondary and Postsecondary Connections


Implementing Tech Prep for All Students

      Policy Rationale and Goals: Since the mid-1980s, Wisconsin has been in the process of building a system of meaningful structural and programmatic connections between secondary schools and the state's postsecondary education system. In fact, Wisconsin's Tech Prep initiative has been jointly administered by the Department of Public Instruction (DPI) and the Wisconsin Technical College System Board (WTCSB) for the past 12 years. More recently, articulation efforts have expanded to include the last two years of high school, two years at a technical college, and further education at a four-year institution (2+2+2). Since its inception, Tech Prep in Wisconsin has been defined broadly, and all students are included in the seven essential elements of the system. These elements include [1] articulation agreements, [2] appropriate curriculum design, [3] curriculum development, [4] inservice teacher training, [5] counselor training, [6] equal access for special populations, and [7] preparatory services.

      Tech Prep in Wisconsin has also received considerable support through state statute 118.34, which was originally passed in 1991 and continues to guide the efforts of the WTCSB and the DPI in administering the program statewide. In part, the legislation states that

...in cooperation with a technical college district board, each school board shall establish a technical preparation program in each public high school located in the school district. The program shall consist of a sequence of courses, approved by the technical college system board, ...designed to allow high school pupils to gain advanced standing in the technical college district's associate degree program upon graduation from high school.

      The legislation further describes the responsibilities of local technical college districts to appoint technical preparation councils, to establish consortia with all school boards of school districts that operate in the technical college district, and to provide technical assistance to school boards in developing technical preparation programs in each high school.

      Tech Prep is also considered under the state's broad umbrella of workforce development, which includes statewide initiatives and those specific initiatives supported by federal funding from the School-to-Work Opportunities Act (STWOA). In addition to being at the forefront of Tech Prep implementation, Wisconsin was one of the first eight states to receive federal funding under STWOA. The state is widely recognized as a leader in workforce development and the integration of policies and systems at the state level. Over time, the state has been engaged in developing broad support for school-to-work system building and has focused STWOA funds on work-based learning through the extensive State Certificated Youth Apprenticeship program, administered through the Department of Workforce Development, and State Certified Cooperative Education, administered by the Department of Public Instruction.

      Implementation Strategy: Tech Prep in Wisconsin is administered through sixteen consortia formed locally by the technical college district and the secondary schools located within the region. According to an April 1998 report, Tech Prep in Wisconsin, "Tech Prep as defined in Wisconsin is cooperation between K-12 schools, technical colleges, universities, and business, labor, and community to develop and utilize integrated and applied academic and technical curricula that provide a coherent sequence of courses and experiences designed to provide high school graduates with a more technically-oriented background leading toward successful transition from school to technical education." (p.1)

      Each technical college district appoints a local Tech Prep Council, which is typically co-chaired by a secondary school administrator and a business person or technical college administrator. The local Tech Prep Council is responsible for strategic planning and oversight of the implementation of the consortium's Tech Prep plan.

      Within the boundaries of each technical college district, all secondary schools are served by the consortium's Tech Prep plan. Each technical college has a staff person called a Tech Prep Curriculum Specialist who is responsible for "conducting Tech Prep activities for the benefit of secondary school counselors and teachers and working with technical college staff," according to Connie Colussy, the DPI Tech Prep Consultant at the state level. The Tech Prep grant to local consortia may pay for this staff person, who has a leadership role in planning and evaluating Tech Prep activities. Among the many services the technical colleges provide to secondary schools are articulation agreement development, counselor workshops, teacher training, career fairs, and teacher extern programs.

      Staff development is provided to create integrated curricula that enable "learners to better connect interrelated concepts, content, and processes and seek relationships between past, present, and future experiences and learning." Tech Prep also provides staff development for teachers to implement applied curricula that "reflects teaching strategies that require students to use knowledge and skills in solving real world problems." (Tech Prep in Wisconsin, 1998, p.5)

      At the heart of the federal Tech Prep initiative is the development of coherent sequences of secondary and postsecondary courses leading to an associate degree or two-year certificate in one of six fields specified in the Carl D. Perkins Vocational and Applied Technology Education Act of 1990. Wisconsin has developed an extensive system of local and state articulation agreements between secondary and postsecondary education institutions. The state has two types of credit arrangements: [1] advanced standing and [2] transcripted credit.

      Students receive advanced standing when high school courses or competencies are equivalent to a technical college course and the course is taught by a high school teacher in the local high school. Students receive credit for advanced standing if they enroll in a technical college within 27 months after high school graduation.

      Students receive transcripted credit when a high school course is the same as a technical college course and is taught by a high school teacher who also holds Wisconsin Technical College System (WTCS) articulation certification through a Memorandum of Understanding between the high school and the technical college. Students receive college credit on an official transcript after successful completion of the course. Through an agreement among all the technical college districts, transcripted credit can be transported throughout the state. Advanced standing is also transportable, though for comparable courses only.

      There are a total of 3,717 advanced standing and 271 transcripted credit arrangements through local articulation agreements. The first group of completers under a statewide advanced standing agreement for agribusiness/science technology finished in June 1998. Statewide articulation agreements in manufacturing and electronics will be available during the 1998 - 1999 school year.

      In addition, Wisconsin has developed educational planning documents called Curriculum Maps which identify sequences of secondary and postsecondary courses that lead to the completion of a postsecondary education or training program: "To date, 86% of high schools use curriculum maps with students in helping them become aware of broad career areas and identify a tentative career interest." (Tech Prep in Wisconsin, 1998, p.10)

      Evolution of Strategy: The WTCSB has further expanded Tech Prep to include articulation agreements with the 13 four-year campuses in the University of Wisconsin System, which will further expand postsecondary options and deliver a 2+2+2 model of education for students. "This alignment is currently developed through the 3,900 courses articulated between secondary schools and technical colleges and the 400+ program-to-program articulation agreements between technical colleges and University of Wisconsin programs." (p.5) In addition, all 14 youth apprenticeship areas have statewide advanced standing articulation agreements that allow students completing state Youth Apprenticeship programs who enroll in a technical college program to receive a minimum number of advanced standing credits.

      In its 1998 Tech Prep evaluation, Wisconsin cited the increase in integrated and applied curricula (2,576 integrated and applied courses revised or upgraded between 1994 and 1997) as "a major impetus for the University of Wisconsin System to develop its Competency-Based Admission Policy which opened doors for students taking non-traditional integrated and applied curricula." (Tech Prep in Wisconsin, 1998, p.5)

      Lessons Learned: According to Gabrielle Banick Wacker, Tech Prep Education consultant for the Wisconsin Technical College System, the changes and systematic development of the Tech Prep system have occurred over time and through many years of developing cooperative and collaborative arrangements at the state and local levels. All schools and all students are included in the system; this high level of inclusion creates wide-ranging support for the system. A statewide client reporting system will allow secondary, postsecondary, and workforce development agencies to have data regarding the number of students with transcripted credit as well as to better track students as they enter the workforce or return to school for additional training. In an effort to address the years that are sometimes spent "drifting" after high school, the WTCSB recently adopted a goal to increase the percentage of students entering the system immediately following high school to 25% of all high school graduates. Although this is an ambitious goal, the WTCS Client Reporting System shows that almost one-third of the 1994 and 1995 high school graduating classes enrolled in a Wisconsin Technical College within three years of graduation. Through collaboration, sustained staffing and support, and the development of a data system to track results, Wisconsin has demonstrated the extent to which it takes time, effort, and commitment to build a system of secondary and postsecondary transitions that is truly meaningful to students.


Reference

Tech Prep in Wisconsin. (1998, April). Madison: Wisconsin Department of Public Instruction and Wisconsin Technical College System.




West Virginia's Jobs Through Education Act


Using a School-to-Work Framework To Implement Education Reform

      Policy Rationale and Goals: Beginning in the late 1980s, West Virginia embarked on a series of reforms that set the stage for connecting improved quality in the education system to a better prepared workforce. The state's school-to-work system began to take form under policy initiatives such as mandating higher standards for all students, involvement in SREB's High Schools That Work network, and expanding Registered Youth Apprenticeships and Cooperative Education opportunities. Other initiatives, including a system of site-based management, local school improvement councils, and performance-based accreditation, combined to add a local drive for education reform. In 1990, West Virginia adopted in statute the following six Goals for Education:

  1. All students will have equal opportunities and will be ready for the 1st grade.

  2. Student performance on national measures will equal or exceed national averages, with an emphasis on science and mathematics achievement. Performance measures for students in the lowest quartile will improve by 50%.

  3. The best personnel will be recruited, retained, provided professional development to improve their skills, and compensated with competitive salaries and benefits.

  4. Ninety percent of 9th grade students will graduate from high school with the knowledge and skills necessary for college, other postsecondary education, or gainful employment. The number of high school graduates entering postsecondary education will increase by 50%.

  5. All school facilities will provide a safe, disciplined environment and meet the educational needs of all students.

  6. All working-age adults will be functionally and technically literate. We will use schools, colleges, and universities as centers for lifelong learning.

      A series of town meetings held in 1990 served to validate the state's educational goals and gave local communities the opportunity to articulate strategies for meeting them. The strategies identified locally included emphasizing early childhood development, helping at-risk students, improving the quality of teaching, strengthening workforce preparation, and increasing educational accountability.

      In 1996, initial school-to-work system building efforts were brought together under West Virginia's principal K-12 reform initiative, Senate Bill 300, the Jobs Through Education Act. The legislation set new high school graduation requirements, mandated that career and educational planning begin in the 8th grade, required students to choose among state career clusters and pathways, and established a high school diploma warranty system for students and their employers.

      In 1997, the legislature passed complementary legislation, House Bill 4306, which established the School Performance and Audit Agency, an agency separate from the Department of Education that reports directly to the State Board of Education. The intent of establishing a separate agency was to ensure that school accreditation policies are aligned to other reforms yet are independently evaluated.

      Implementation Strategy: After the reform legislation passed, a separate School-to-Work (STW) office was established that reports to the state's STW Coordinating Council, an interagency team that includes education and all other relevant state agencies as well as representatives from organized labor and business. While the West Virginia Department of Education is the fiscal agent for STW, a three-person staff provides the primary direction for system building. To coordinate its activities with the STW office, the Department of Education created a unit of five individuals to act as liaisons in the implementation of STW with local partnerships and educators.

      Most of the liaisons to the STW office come from the state's High Schools That Work (HSTW) network and provide a direct link to reform efforts that have been underway for many years. To build on the success and local support of the state's HSTW network, West Virginia is beginning to implement a modified version of HSTW for elementary and middle schools. West Virginia further leverages local support for STW concepts by requiring that, to receive STW funding, schools submit a unified school improvement plan based on the HSTW Ten Key Practices.

      According to Ron Grimes, Director of the West Virginia Office of School-to-Work, the state "has chosen to be strategic about how it develops its system--all efforts are evaluated against the state's focus on systemic change." From this perspective, building a system means that all students "may not have had work-based learning at higher levels [i.e. internships, mentoring at the high school level] but comprehensive career exploration has been concentrated [statewide] in the 7th, 8th, and 9th grades."

      To implement various components of the state's STW system, the Jobs Through Education Act establishes a number of requirements that aim to bring about widespread change in the education system. These include the following:

      Evolution of Strategy: According to Grimes, the strategy for implementing education reform under a school-to-work framework began under the direction of a Democratic governor and has continued "largely unchanged" under the new Republican governor. "He is very committed," and under his administration, the school accreditation law (HB4306) was passed establishing the School Performance and Audit Agency, which will continue to support alignment of programs and funding at the state level. While the "massive changes have not come without criticism" at the local level, "these are a minority." Already 28 out of 55 county school districts have implemented the 9th, 10th, and 11th grade requirements of the legislation. "Next year, all 55 counties will have to implement," stated Grimes.

      Outcomes/Lessons Learned: The state has begun to evaluate its efforts and has conducted a financial evaluation of STW grants. In addition, unannounced financial reviews and program visitations to each partnership are also conducted by the STW office. To streamline the review and evaluation process, the STW office has incorporated the HSTW three-year technical review process into its funding mechanism. In addition, the office is in the process of making recommendations regarding the type of data that should be added to the Student Performance Information State Data Collection System that collects financial and program information.

      While local concerns regarding the sustainability of the STW system under construction in West Virginia continue, Grimes reports that the state works under the proposition that if "school-to-work principles are integrated with and supportive of general education reform and you have built strong local partnerships committed to the reform effort, you have the power of all general education dollars going toward building capacity, going to scale, and the $64,000 question of sustainability is answered."


Reference

Information based on "West Virginia's answer to the $64,000 School-to-Work question," written by Tracy Schmidt, National Conference of State Legislatures, May 1998. Further information provided by Ron Grimes, Director of the West Virginia Office of School-to-Work.




Test Flights


Emerging Strategies for Policy Linkages


Michigan: Joint Development of a Career Preparation System

      With the goal of a transparent, integrated system of career development, the Michigan Department of Education and the Michigan Jobs Commission have launched Michigan's Career Preparation System. This initiative was proposed by Governor John Engler in the 1997 State of the State address. The system's goal is to "ensure that each graduate will receive world-class skills and training that prepares them for higher education and their first job in today's competitive market." Integrated instruction, career exploration and guidance opportunities beginning in middle school, and an emphasis on postsecondary articulation opportunities form the basis of the system. During the first year, three-year plans are being developed by the state's Education Advisory Groups in each region's Workforce Development Board. During the second year, nearly $24 million will be allocated to begin to implement the Career Preparation System. Each region's plan must address seven components of the system: [1] academic preparation, [2] career development, [3] workplace readiness, [4] professional and technical education, [5] work-based learning, [6] accountability, and [7] school improvement. At the state level, 22 members have formed the Council for Career Preparation Standards, representing business, industry, education, labor, parents, and relevant state agencies. The primary responsibilities of the council are to maintain an information system regarding employment opportunities; set career competency standards for career clusters; and provide public information on career preparation opportunities to parents, students, and others.


Illinois and Indiana: Sharing Industry Skill Standards Development Resources

      Throughout the 1990s, both Illinois and Indiana have had extensive involvement in the development of industry skill standards. The two states also work closely through the Council of Great Lakes Governors and have recently agreed to share their resources in the development, adoption, and assessment of each state's skill standards system.






RESOURCES



      This section is intended to help state-level administrators apply what they have learned in the case studies to their own state policies and practices. In addition, NCRVE views this document as part of an ongoing process of collecting effective practices throughout the states. This section includes the forms for describing a state's accountability practices so that they can be included in supplements to this document and in ongoing updates on the Web.

      Also included in this section is Public Accountability for Student Success: Standards for Education Accountability Systems, developed by the National Association of State Boards of Education's Study Group on Education Accountability, as well as an extensive bibliography of state-related literature.




Next Steps


Using This Publication To Deepen Our State Efforts



SETTING STANDARDS
What is our best practice in this area?











What strategy should we implement next?













ASSESSMENT SYSTEM
What is our best practice in this area?











What strategy should we implement next?













CURRICULUM STRATEGIES
What is our best practice in this area?











What strategy should we implement next?















SYSTEM SUPPORTS
What is our best practice in this area?











What strategy should we implement next?













QUALITY ASSURANCE
What is our best practice in this area?











What strategy should we implement next?













POLICY LINKAGES
What is our best practice in this area?











What strategy should we implement next?














Your State's Promising Practices

      The NCRVE publication Taking Off! Sharing State-Level Accountability Strategies has been created through contributions from state-level staff in a variety of ways. Promising practices included in this publication were identified through a review of relevant research and literature, interviews with state staff and national organizations involved in state-level initiatives, and the technical assistance conducted by NCRVE during 1998. States were given the opportunity to identify their own promising practices through a fax-back form sent to state directors. This year, states are again invited to submit promising practices that can be included in a follow-up and update to this document. Taking Off! will evolve and is intended to serve as a workbook for states as they develop their state plans to implement Perkins III accountability measures. More generally, this workbook is intended to help states envision how vocational education can continue to raise student achievement and tie its efforts more closely to state education reform and workforce development efforts.

      As part of NCRVE's ongoing technical assistance, we are asking states to identify practices or "snippets" that can be shared with other states. While promising practices are approximately two to three pages in length and are usually descriptions of policies in the implementation stage, snippets are intended to be short descriptions of emerging policies or plans that have yet to be fully implemented.

      There are at least three ways that promising practices or "snippets" can be submitted. One way is to submit a promising practice or "snippet" on the forms that have been included in this document. Another way is to complete the fax-back form; NCRVE staff will follow up with a telephone interview and a draft of the promising practice for you to review. The third way is to submit electronically at the NCRVE Web site (http://ncrve.berkeley.edu/sts).

      Here are some guidelines for writing a promising practice or snippet. As you write about the promising practice, think about how your state fits on the continuums that have been discussed in this document. Does your state lean toward centralization? Decentralization? Is workforce training the primary purpose of vocational education? Is education reform the primary purpose? A mixture of both? How do these tensions affect implementation?

      Policy Rationale and Goals: First, describe the policy and the background or history in your state. What are the goals of the policy? Where did the political support for the policy come from? What about the policy is specific to the state? What can be generalized to other state efforts? How does the continuum come into play here or in the implementation strategies?

      Implementation Strategy: What strategies were used to implement the policy? How were the goals communicated to the local level? What kinds of policy levers were used to implement the policy (e.g., mandate or rule changes, technical assistance, tying policy to program funding)? What kinds of support are given to the local level to encourage implementation? What constraints does your state have in implementing the strategy?

      Evolution of Strategy: How has strategy evolved or been modified? Why did the state have to modify it? If the implementation strategy has not been modified much, this section is optional.

      Outcomes/Lessons Learned: In what ways does your state feel that this policy/strategy has improved student achievement? Is there data to support it? Can the state point to other measures of success that support the continued implementation of the policy (e.g., evidence of expanded use, more requests for certain products or assistance from the state, continued political support for the policy, and so on)? Are any formal evaluations planned? What has been learned that can be shared with other states?




Case Study Format

Title:


Topic:

State:
Contact Person:

Rationale for Policy and Goals:







Implementation Strategy:








Evolution of Strategy (optional):








Outcomes/Lessons Learned:












"Snippet" Format
 (limit to one-half page)

Title:


Topic:

State:
Contact Person:

Rationale for Policy and Goals:







Implementation Strategy:






































First Request for Promising Practices: September 30, 1998
Next Request: Ongoing 1999
Dear State Director:

      Since the 1980s and well into the 1990s, demands for improving the quality of education as defined by the academic performance of students have been at the forefront of state politics and policy. As we enter a new century and the third decade of sustained interest in improving the quality of education, it is important to understand the ways in which the state level has responded to pressures for change. While improvement is ultimately measured by the performance of students in the classroom, the increased interest in education has focused attention on the state level and on those administrators and officials responsible for implementing the wide-ranging agenda that has emerged under the umbrella of education reform. The extent to which the ideas and policies that continue to emerge from national, state, and local influences are adopted in the classroom is seen by many as dependent on the capacity of those at the state level to sustain and support educational improvement.
      To begin to understand the demands on, the responses to, and the potential for improved administration at the state level, NCRVE is developing a technical assistance document aimed at state-level administrators called Taking Off! Sharing State-Level Accountability Strategies. This document is another component of our continuing effort to support state-level administrators in their work. We hope to describe a multitude of state-level approaches to developing standards, curriculum, assessment, and accountability systems, all with the goal of improving student achievement.
      We would like to hear from you about one of the promising practices within your state. In particular, we are looking for practices that integrate academic and vocational education: for example, an overall accountability system that includes vocational education achievement, or an assessment system for vocational education that incorporates statewide academic standards. We have enclosed sample case studies as examples.
      Feel free to write your own case study and e-mail it to mrahn@publicworksinc.org, or fax to (626) 564-0657. Or, fill out the enclosed fax-back form about your practices. We will follow up by telephone to interview you. Thank you in advance for your time!

Sincerely,



Dr. Phyllis Hudecki
Associate Director
NCRVE




      Fax Back       from _____________________________________ (state name)

      In your state, what promising practices are being implemented to improve student achievement? Practices can be aimed specifically at only vocational education or education in general.

Please Check      Categories                                        Contact Person

Setting Standards

     Development of academic, vocational, and/or career standards


     Crosswalks of academic and vocational standards

     Technical assistance related to standards

Curriculum Strategies

     Development of standards-driven curriculum frameworks, course syllabi, and crosswalks


     Development of career pathways/majors

     Development of standards-driven curriculum/projects

Assessment Systems

     Assessment of vocational competencies

     Assessment of work-readiness skills (i.e., SCANS)

     Assessment of academic standards

     Providing technical assistance to locals in the areas of standards, curriculum, and assessment


System Supports

     Incentives and consequences

     State-level data systems

     Assisting locals to use data for program improvement

Quality Assurance

     Program certification/school and program quality review


Policy Linkages

     Secondary/postsecondary

     School-to-work

     State joint development efforts

Fax to Dr. Mikala L. Rahn at (626) 564-0657 or call with questions at (626) 564-9890.




Public Accountability for Student Success


Standards for Education Accountability Systems

      The following is a list of standards developed by the National Association of State Boards of Education's Study Group on Education Accountability, October 1998:

Standard 1: Accountability Goals and Vision
      Legal authorities clearly specify accountability goals and strategies that focus on student academic performance.
      Indicators:
     1) The purpose of the accountability system is defined in concrete terms as a program designed to help schools improve student achievement consistent with specified standards for student learning and performance.
     2) State and local standards for student achievement are articulated in clear, specific, and measurable terms for all students.
     3) The accountability system explicitly encompasses all students, regardless of race/ethnicity, gender, income, disability status, or English language proficiency.
     4) The accountability system is based on a strategy of targeting assistance where it is needed to enhance schools' capacity to teach students to high standards.

Standard 2: Governing Accountability
      At each level of the education system, designated authorities are charged with the efficient governance of the accountability system.
      Indicators:

     1) Responsibilities and lines of authority are clearly articulated for those governing and managing the accountability system (e.g., state and local boards of education) and those being held responsible for achieving student performance goals (e.g., teachers, schools, districts).
     2) Organizations responsible for governance at each level of the education system assure that information on student achievement is efficiently collected, analyzed, and reported to educators, families, policymakers, and the public.
     3) With the assistance of administering agencies, governing organizations are responsible for judging the extent to which designated agents are achieving student performance goals, deciding on consequences in response to the agents' performance, and applying those consequences.
     4) Governing organizations are responsible for enhancing the capacity of schools and teachers to achieve student achievement goals through various supports and technical assistance.
     5) Governing organizations continually evaluate the efficiency and effectiveness of all aspects of the accountability system and refine the system accordingly.

Standard 3: Responsible Agents
      Specific responsibilities for student learning and performance are assigned to designated agents.
      Indicators:

     1) Responsibilities for achieving specified goals for student learning and performance are clearly articulated and assigned to particular agents (e.g., teachers, schools, districts).
     2) Designated agents have sufficient authority and discretion to organize their resources and programs to improve student learning and performance.

Standard 4: Collecting Performance Information
      Accountability is based on accurate measures of agent performance as informed by assessments that are administered equitably to all students.
      Indicators:

     1) The governing organizations collect data on student progress in achieving performance standards from assessments administered at designated intervals (e.g., 4th, 8th, and 10th grades).
     2) Assessments and student academic standards are aligned.
     3) Multiple assessments serving different purposes are used to capture a complete picture of student progress and agent performance.
     4) Test elements meet rigorous standards for validity and reliability, and are free of racial, ethnic, or cultural bias.
     5) All students participate in assessments, with appropriate accommodations and supports as necessary to ensure their equal opportunity to perform. Alternative assessments are provided for a strictly limited number of students unable to participate in regular assessments due to disability or language.
     6) Tests are economically feasible, secure from tampering, and their administration does not overburden schools and teachers.
     7) Data are collected on critical student characteristics as well as program delivery and implementation at the school level.

Standard 5: Analyzing and Reporting Performance Information
      Those responsible for governing accountability regularly report student and school performance information in useful terms and on a timely basis to school staff, students and their families, state and local policymakers, and the news media.
      Indicators:

     1) Schoolwide scores for student performance are analyzed in several ways, including absolute performance in relation to standards, degree of improvement over their previous performance, and in comparison with predicted scores based on contextual conditions.
     2) Student performance data are reported in aggregated schoolwide averages and in disaggregated form to draw attention to defined subpopulations whose performance merits particular attention such as students with disabilities, students eligible for free or reduced price lunch, students of different racial/ethnic backgrounds, and students with special language needs.
     3) Student scores are provided to teachers for their individual students within the same academic year in which the student takes the assessment and are used to improve classroom practice.
     4) Performance reports are produced using language and formats appropriate to their various intended audiences. Reports include thorough explanations of the meaning of the results, the limitations of the data, and important student and school contextual factors that may affect student achievement.
     5) Education decisionmakers use student performance results as guides for improving policy and practice.
     6) Operation of the accountability system maintains the confidentiality of individual students.

Standard 6: Incentives and Consequences
      Incentives are established that effectively motivate agents to improve student learning. Consequences, which could include rewards, interventions, or sanctions, are predictably applied in response to performance results.
      Indicators:

     1) The accountability system encourages self-evaluation and self-improvement at every level.
     2) The accountability system includes affordable incentives and consequences in the form of rewards, interventions and sanctions designed to motivate and enhance the agent responsible for student learning and achievement.
     3) Unsatisfactory student performance relative to standards or lack of forward progress triggers a calibrated series of interventions with the agent, which might range from technical assistance and capacity building to the reconstitution of entire schools or districts as a last resort.
     4) Student promotion and graduation are linked with satisfactory performance on appropriate assessments.
     5) Agents accountable for student learning and performance clearly understand what they are responsible for and how rewards, interventions and sanctions are decided on.
     6) People and institutions being sanctioned have due process avenues for appeal.

Standard 7: Building Agent Capacity
      Agents are provided sufficient support and assistance to ensure they have the capacity necessary to help students achieve high performance standards.
      Indicators:

     1) All schools are capable of exercising schoolwide strategic judgment in organizing curricula and allocating resources to enhance student learning.
     2) All teachers, administrators, and other school personnel are capable of working as a collaborative community focused on student learning as their highest priority.
     3) All schools have an adequate number of teachers and support personnel.
     4) Teachers in all schools possess the knowledge, skills, and dispositions necessary to help all students learn.
     5) All schools are equipped with adequate and appropriate technology, supplies, and physical space to effectively deliver programs and services.
     6) Professional development is provided to teachers, principals, and other personnel on knowledge and skills needed to achieve student performance goals and implement the accountability system.
     7) Technical assistance is targeted to low-performing schools to enhance their capacity to help students achieve high performance standards.

Standard 8: Policy Alignment
      Policymakers work to ensure that education policies, mandated programs, financial resources, and the accountability system are well aligned so that consistent messages are communicated about educational goals and priorities.
      Indicators:

     1) Education policymakers ensure that other education policies (e.g., student content standards, licensure and certification systems, staff development programs, college of education accreditation requirements, public school choice policies, education finance systems) are consistent with the goals and strategies of the accountability system.
     2) Those responsible for governing accountability identify and make recommendations regarding statutes, rules, and regulations that conflict with achieving the goals of the accountability system.
     3) Governing organizations assess the degree to which the policies and expectations of accreditation agencies and professional associations are aligned with the goals of the accountability system.
     4) The accountability system includes an assessment of the degree to which higher education prepares an adequate number of well-qualified administrators, teachers, and support staff.

Standard 9: Public Understanding and Support
      The accountability system has widespread support.
      Indicators:

     1) Significant numbers of students, parents, K-12 educators, higher education leaders and faculty, business leaders, policymakers, and members of the public demonstrate that they accept and support the broad elements of the accountability system.
     2) State and local boards of education engage school communities (administrators, teachers, parents, students) and other members of local communities in an ongoing dialogue about continuous improvement of student learning and achievement as informed by the accountability system.
     3) Discussions of student performance results and needed school improvements regularly occur in community forums.
     4) Those responsible for governing accountability involve school communities in decisionmaking processes regarding necessary interventions.

Standard 10: Partnerships
      Various established partnerships work together to support districts, schools, and teachers in their efforts to improve student achievement.
      Indicators:

     1) Public K-12 schools and families work together as partners to ensure all students learn and achieve to high standards.
     2) Public K-12 education and higher education work together as partners to ensure schools and teachers are capable of achieving the student achievement goals of the accountability system (e.g., technical assistance, preservice training of teachers, student admission, and placement criteria).
     3) Public K-12 education, human service agencies, and other organizations involved in education work together in a coordinated fashion to ensure all students have an equal opportunity to learn and develop.
     4) Public K-12 education works with the business community to improve student learning and development.




Bibliography


Setting Standards


Bailey, Thomas, & Merritt, Donna. (1997, August). Industry skill standards and education reform. American Journal of Education, 105(4), 401-436.

Bailey, Thomas, & Merritt, Donna. (1995). Making sense of industry-based skill standards (MDS-777). Berkeley: National Center for Research in Vocational Education, University of California, Berkeley.

Kendall, John S., & Marzano, Robert J. (1996). Content knowledge: A compendium of standards and benchmarks for K-12 education. Aurora, CO: Mid-Continent Regional Educational Laboratory. (To order, contact ASCD at [800] 933-2723.)

Klein, Steven G. (1996). Skill standards: Concepts and practices in state and local education: A synthesis of literature and alternative conceptual frameworks. Berkeley, CA: MPR Associates.

National Skill Standards Board (NSSB). (1996). Report to Congress (October 1995 to January 1997). Available on-line: <http://www.nssb.org/bg/REPORT.HTML>.


Assessment Systems


National Association of State Boards of Education (NASBE). (1997, October). The full measure: The report of the NASBE study group on statewide assessment systems. Alexandria, VA: Author.

Stecher, Brian M., Rahn, Mikala L., Ruby, Allen, Alt, Martha Naomi, & Robyn, Abby. (1997). Using alternative assessments in vocational education (MDS-946). Berkeley: National Center for Research in Vocational Education, University of California, Berkeley.

Training and Employment Program, Employment and Social Services Policy Studies, Consortium for Policy Research in Education. (1994). Performance assessment systems: Implications for a national system of skill standards. Washington, DC: National Governors Association.

Wirt, John G., & Resnick, Lauren B. (Eds.). (1996). Linking school and work: Roles for standards and assessment. San Francisco: Jossey-Bass.


Curriculum Strategies


Bailey, Thomas. (1997). Integrating academic and industry skill standards (MDS-1001). Berkeley: National Center for Research in Vocational Education, University of California, Berkeley.

Klein, Steven G. (1996). Applying the standard: Using industry skill standards to improve curriculum and instruction: Lessons learned from early implementation in four states. Berkeley, CA: MPR Associates.


System Supports and Quality Assurance


Cornett, Lynn M., & Gaines, Gale F. (1997). Accountability in the 1990s: Holding schools responsible for student achievement. Atlanta, GA: Southern Regional Education Board.

Gaines, Gale F. (1995). Linking education report cards and local school improvement. Atlanta, GA: Southern Regional Education Board.

National Association of State Boards of Education. (1998, October). Public accountability for student success: Standards for education accountability systems (The Report of the NASBE Study Group on Education Accountability). Alexandria, VA: Author.

Watts, James A., Gaines, Gale F., & Creech, Joseph D. (1998, October). A fresh look at school accountability. Atlanta, GA: Southern Regional Education Board.


Policy Linkages


Education Commission of the States. (1998, January). The progress of education reform 1997. Denver: Author.

Hershey, Alan M., Hudis, Paula, Silverberg, Marsha, & Haimson, Joshua. (1997, April). Partners in progress: Early steps in creating school-to-work systems. Submitted by Mathematica Policy Research, Inc. Washington, DC: U.S. Department of Education, Planning and Evaluation Services.

Van Horn, Carl E. (1995, October). Enhancing the connection between higher education and the workplace: A survey of employers. Denver: State Higher Education Executive Officers and Education Commission of the States.

Vickers, M. (1997). Reconnecting education and career: Problems and solutions. In Adria Steinberg (Ed.), Real learning, real work: School-to-work as high school reform. New York: Routledge.


State by State Reports


Gandal, Matthew. (1997). Making standards matter, 1997: An annual fifty-state report on efforts to raise academic standards. Washington, DC: American Federation of Teachers.

AFT. (1998). Making standards matter, 1998: An annual fifty-state report on efforts to raise academic standards. Washington, DC: Author.

Council of Chief State School Officers (CCSSO). (1996, October). Key state education policies on K-12 education: Content standards, graduation, teacher licensure, time and attendance. A 50-state report. Washington, DC: CCSSO, State Education Assessment Center.

CCSSO. (1998). State education accountability reports and indicator reports: Status of reports across the states, 1998. Results of a 50-state survey. Washington, DC: CCSSO, State Education Assessment Center.

CCSSO. (1998, August). Key state education policies on K-12 education: Standards, graduation, teacher licensure, time and attendance. A 50-state report. Washington, DC: CCSSO, State Education Assessment Center.

Education Commission of the States and National School-to-Work Office. (1997, October). Profiles in connecting learning and work: State initiatives. Denver: Author.

Finn, Jr., Chester E., Doyle, Denis, Galston, William, Marshall, Will, & Traiman, Susan. (1998, July). The state of state standards. Fordham Report, 2(5).

Joftus, Scott, & Berman, Ilene. (1998). Great expectations? Defining and assessing rigor in state standards for mathematics and English language arts. Washington, DC: Council for Basic Education.

McCarthy, Martha, Langdon, Carol, & Olson, Jeannette. (1993, November). State education governance structures. Submitted by the Indiana Education Policy Center, Indiana University, Bloomington, Indiana. Denver: Education Commission of the States.

Olson, Lynn. (1999, January 11). Quality Counts '99: Rewarding results, punishing failures. Education Week (Entire issue), 18(17).

Setting the standards from state to state. (1995, April 12). Education Week (Special report).


Resources and Research on State Education Policy and Practice


Adams, Jr., Jacob E., & Kirst, Michael W. (1997, September). School leadership for accountability. Discussion draft of paper prepared for the 1998 AERA Handbook of Research in Education Administration.

Brandt, Richard M. (1995). Teacher evaluation for career ladder and incentive pay programs. In Daniel L. Duke (Ed.), Teacher evaluation policy: From accountability to professionalism. Albany: State University of New York Press.

Charner, Ivan. (1996, October). Studies of education reform, study of school-to-work initiatives. Washington, DC: U.S. Department of Education, Office of Educational Research and Improvement.

Corcoran, Thomas B. (1995, June). Helping teachers teach well: Transforming professional development. CPRE Policy Briefs. New Brunswick, NJ: Consortium for Policy Research in Education.

Cornett, Lynn M. (1995, February). Lessons from 10 years of teacher improvement reform. Educational Leadership, 52(5), 26-30.

Cornett, Lynn M., & Gaines, Gale F. (1992, January). Focusing on student outcomes: Roles for incentive programs. The 1991 National Survey of Incentive Programs and Teacher Career Ladders. Atlanta, GA: Southern Regional Education Board.

Danzberger, Jacqueline P., Kirst, Michael W., & Usdan, Michael D. (1992). Governing public schools: New times, new requirements. Washington, DC: The Institute for Educational Leadership, Inc.

Darling-Hammond, Linda. (1997, November). Doing what matters most: Investing in quality teaching. Prepared for the National Commission on Teaching and America's Future, New York, NY.

Duke, Daniel L. (1995). The move to reform teacher evaluation. In Daniel L. Duke (Ed.), Teacher evaluation policy: From accountability to professionalism. Albany: State University of New York Press.

Elmore, Richard F. (1996). Getting to scale with successful educational practices. In Susan H. Fuhrman & Jennifer A. O'Day (Eds.), Rewards and reform: Creating educational incentives that work. San Francisco: CPRE and the Pew Forum on Education Reform, Jossey-Bass.

Elmore, Richard F., & Fuhrman, Susan H. (1994). Governing curriculum: Changing patterns in policy, politics, and practice. In Richard F. Elmore & Susan H. Fuhrman (Eds.), The governance of curriculum (1994 Yearbook of the Association for Supervision and Curriculum Development). Alexandria, VA: ASCD.

Elmore, Richard F., Peterson, Penelope L., & McCarthey, Sarah J. (1996). Restructuring in the classroom: Teaching, learning, and school organization. San Francisco: Jossey-Bass.

Fuhrman, Susan H. (Ed.). (1993). Designing coherent education policy: Improving the system. San Francisco: Jossey-Bass.

Fuhrman, Susan H. (1994). Legislatures and education policy. In Richard F. Elmore & Susan H. Fuhrman (Eds.), The governance of curriculum (1994 Yearbook of the Association for Supervision and Curriculum Development). Alexandria, VA: ASCD.

Fuhrman, Susan H., & Elmore, Richard F. (1994). Governors and education policy in the 1990s. In Richard F. Elmore & Susan H. Fuhrman (Eds.), The governance of curriculum (1994 Yearbook of the Association for Supervision and Curriculum Development). Alexandria, VA: ASCD.

Fullan, Michael G. (1994). Coordinating top-down and bottom-up strategies for education reform. In Richard F. Elmore & Susan H. Fuhrman (Eds.), The governance of curriculum (1994 Yearbook of the Association for Supervision and Curriculum Development). Alexandria, VA: ASCD.

Hill, Paul T., Harvey, James, & Praskac, Amy. (1993). Pandora's box: Accountability and performance standards in vocational education (MDS-288). Berkeley: National Center for Research in Vocational Education, University of California, Berkeley.

Kaagan, Stephen S., & Coley, Richard J. (1989). State education indicators: Measured strides, missing steps. New Brunswick, NJ: Center for Policy Research in Education, Rutgers University and Policy Information Center, Educational Testing Service.

Kirst, Michael W. (1990, July). Policy perspectives, accountability: Implications for state and local policymakers. Washington, DC: U.S. Department of Education, Office of Educational Research and Improvement.

Lusi, Susan Follett. (1994). Systemic school reform: The challenges faced by state departments of education. In Richard F. Elmore & Susan H. Fuhrman (Eds.), The governance of curriculum (1994 Yearbook of the Association for Supervision and Curriculum Development). Alexandria, VA: ASCD.

Millman, Jason, & Sykes, Gary. (1995). The assessment of teaching based on evidence of student learning. In Samuel B. Bacharach & Bryan Mundell (Eds.), Images of schools: Structures and roles in organizational behavior. Thousand Oaks, CA: Corwin Press, Sage Publications.

Mohrman, Susan Albers, & Lawler III, Edward E. (1996). Motivation for school reform. In Susan H. Fuhrman & Jennifer A. O'Day (Eds.), Rewards and reform: Creating educational incentives that work. San Francisco: CPRE and the Pew Forum on Education Reform, Jossey-Bass.

Murnane, Richard J., & Levy, Frank. (1996). Teaching the new basic skills: Principles for educating children to thrive in a changing economy. New York: The Free Press.

Odden, Allan R., & Conley, Sharon. (1992). Restructuring teacher compensation systems. In Allan R. Odden (Ed.), Rethinking school finance: An agenda for the 1990s (pp. 41-96). San Francisco: Jossey-Bass.

Picus, Lawrence O. (1992). Using incentives to promote school improvement. In Allan R. Odden (Ed.), Rethinking school finance: An agenda for the 1990s (pp. 166-200). San Francisco: Jossey-Bass.

Pipho, Chris. (1997, May). Standards, assessment, accountability: The tangled triumvirate. Phi Delta Kappan. Reprinted with permission from <http://www.ecs.org/ec/289a.htm>.

Rowan, Brian. (1996). Standards as incentives for instructional reform. In Susan H. Fuhrman & Jennifer A. O'Day (Eds.), Rewards and reform: Creating educational incentives that work. San Francisco: CPRE and the Pew Forum on Education Reform, Jossey-Bass.

Spillane, James P. (1994). How districts mediate between state policy and teachers' practice. In Richard F. Elmore & Susan H. Fuhrman (Eds.), The governance of curriculum (1994 Yearbook of the Association for Supervision and Curriculum Development). Alexandria, VA: ASCD.

Timar, Thomas B., & Kirp, David L. (1988). Managing educational excellence. New York: The Falmer Press.

Wilson, Bruce L., & Rossman, Gretchen B. (1993). Mandating academic excellence: High school responses to state curriculum reform. New York: Teachers College Press.


[1] Oregon adopted PASS, the Proficiency-based Admission Standards System, which will require students to demonstrate their knowledge and skills in six content areas for college admission, and PREP, Proficiencies for Entry into Programs, which identifies the academic and technical proficiencies necessary for success in community college programs and is aligned with CAM and PASS.

[2] The nine career clusters are Consumer Service, Hospitality and Tourism; Business Management and Finance; Manufacturing, Engineering Technology; Environmental, Agricultural and Natural Resources; Health and Biosciences; Arts, Media and Communication; Transportation Technologies; Human Resource Services; and Construction and Development. Maryland State Department of Education Career Connections web site: http://www.mdse.state.md.us/factsndata/mdcareer.html.

[3] Current industry standards that are used or under consideration in Oklahoma include National Automotive Technicians Education Foundation (NATEF) (automotive), Printing Industries of America (PIA) (printing), American Welding Society (AWS) (welding), and National Institute for Metalworking Skills (NIMS) (machine tool).

[4] Industry standards in use in Alabama include NATEF (automotive), Associated General Contractors (construction), American Design Drafting Association (ADDA) (drafting), NIMS (machine tool), and PIA (printing).

[5] SREB's 16 member states are Alabama, Arkansas, Delaware, Florida, Georgia, Kentucky, Louisiana, Maryland, Mississippi, North Carolina, Oklahoma, South Carolina, Tennessee, Texas, Virginia and West Virginia.

[6] Makin, Richard C., Senior Director for Management and Planning, HSTW, SREB, presentation in Santa Fe, NM, October 14, 1998.