A Research Brief: Are California High Schools Ready for the Exit Exam?
Re-Examining Evidence from California’s Independent Evaluator
Jennifer Jellison Holme and John Rogers, UCLA/IDEA
May 11, 2005
On April 21, 2005, UCLA's Institute for Democracy, Education, and Access (IDEA) published a policy brief raising a series of questions about California's readiness for the California High School Exit Examination (CAHSEE). The policy brief challenged the conclusion of the state's independent evaluator, the Human Resources Research Organization (HumRRO), that California schools have made great progress in providing all students with opportunities to learn the material covered on the Exit Exam. IDEA researchers Holme and Rogers concluded that HumRRO's own "data reveals that many California schools still are not adequately preparing students for success on the CAHSEE." In correspondence in early May 2005, HumRRO President Lauress Wise argued that the IDEA researchers had misunderstood HumRRO's data. The differences between the HumRRO and IDEA researchers are not merely academic: the questions raised are of critical importance to current policy discussions about the implementation of the Exit Exam. In an effort to inform this debate, Holme and Rogers have re-examined HumRRO's data.
This new policy brief summarizes the evidence on the four questions of greatest interest to policy makers:
1) Do all California students have the opportunity to learn the material covered on the Exit Exam?
2) Are students in need of help being identified and supported?
3) How many students will pass the Exit Exam?
4) What is the impact of the Exit Exam on dropout rates?
1) DO ALL CALIFORNIA STUDENTS HAVE THE OPPORTUNITY TO LEARN THE MATERIAL COVERED ON THE EXIT EXAM?
HumRRO presents aggregate data for all of California's students, as well as disaggregated data for specific sub-groups (e.g., economically disadvantaged students and special education students). However, it does not disaggregate data for schools with the highest failure rates. Its report thus provides no policy-relevant information on whether schools with the highest failure rates face different conditions than schools with lower failure rates. This information is critical to understanding whether students in all California schools are receiving adequate opportunities to learn the material on the Exam.
Our analysis of initial pass rates for the Class of 2006 indicates that the state has a serious "failure concentration" problem, in that large numbers of failing students are concentrated in the same schools. According to our analysis (a sketch of the underlying calculation follows the list):
• More than one-quarter of the students in regular high schools who failed the ELA section in 10th grade are concentrated in schools where overall ELA failure rates are 40% or greater.
• More than one-quarter of the students who failed the math section in 10th grade are concentrated in schools where overall math failure rates are 40% or higher.
• These schools with high failure rates (where at least 40% of students failed either ELA or mathematics) comprise 1 out of every 8 high schools in the state (13% of all high schools).
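To make the concentration calculation concrete, the following is a minimal sketch, in Python, of how the share of failing students enrolled in high-failure schools can be computed from school-level results. The school records and counts are invented for illustration; the actual analysis used California Department of Education data.

    # Sketch of the "failure concentration" calculation.
    # The school records below are hypothetical; the real analysis used
    # school-level CAHSEE results from the California Department of Education.
    schools = [
        # (school, 10th graders tested, number failing ELA)
        ("School A", 500, 250),  # 50% failure rate
        ("School B", 400, 180),  # 45% failure rate
        ("School C", 600, 90),   # 15% failure rate
        ("School D", 800, 80),   # 10% failure rate
    ]

    THRESHOLD = 0.40  # a "high-failure" school: 40% or more of testers failing

    total_failers = sum(failed for _, _, failed in schools)
    concentrated = sum(
        failed for _, tested, failed in schools
        if failed / tested >= THRESHOLD
    )

    print(f"Share of failers in high-failure schools: {concentrated / total_failers:.0%}")
    # With these invented numbers, 430 of 600 failing students (72%)
    # attend schools where 40% or more of test-takers failed.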
Our analysis of California Department of Education data reveals that schools with failure rates greater than 40% are likely to experience serious opportunity-to-learn problems. The state of California has identified a group of so-called "Williams" schools that are most deficient in facilities, school materials, and teacher qualifications. The high schools with failure rates greater than 40% on the math section are 47 times more likely to be designated as "Williams" schools than as non-"Williams" schools. Conversely, high schools with failure rates below 20% on the math section are 34 times more likely to be non-"Williams" schools than "Williams" schools.

The conditions in these two groups of schools are starkly different. The schools with high concentrations of 'failers' have 8 times the rate of severe overcrowding (24% vs. 3%) and 8 times the rate of severe teacher shortages (51% vs. 6%) as schools with low rates of failure.1 On average, schools with high concentrations of 'failers' have a 74% teacher credential rate, compared to a 92% teacher credential rate in the low-failure schools.

HumRRO makes much of the fact that 90% of all students reported that either most or all of the questions on the Exit Exam were covered in their classes. But a closer look at the data shows that 80% of students who did not pass the math section said that they had not been taught all of the subject matter covered on the Exam.2 We suspect that the students who reported that they have not received standards-based instruction are concentrated in the schools with the least access to teachers trained in the standards: the schools with the highest concentrations of failure. Importantly, we do not have conclusive evidence on this last point. Nor does HumRRO have evidence to the contrary.
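One way to read the "47 times more likely" comparison above is as the odds, within a group of schools, of carrying the "Williams" designation versus not carrying it. The sketch below shows that reading with invented counts; these are not the actual CDE figures.

    # Hypothetical 2x2 counts of high schools by failure rate and
    # "Williams" designation; the actual figures come from California
    # Department of Education data.
    high_failure = {"williams": 94, "non_williams": 2}   # >40% math failure
    low_failure = {"williams": 20, "non_williams": 680}  # <20% math failure

    # Odds of the "Williams" designation within each group.
    odds_high = high_failure["williams"] / high_failure["non_williams"]
    odds_low = low_failure["non_williams"] / low_failure["williams"]

    print(f"High-failure schools: {odds_high:.0f}x more likely Williams than not")
    print(f"Low-failure schools: {odds_low:.0f}x more likely non-Williams than Williams")
    # With these invented counts the odds are 47x and 34x, mirroring
    # the ratios reported in the text.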
2) ARE STUDENTS IN NEED OF HELP BEING IDENTIFIED AND SUPPORTED?
The California Department of Education has told high schools that they should have remedial programs in place to ensure that students who initially fail the Exam can later succeed. In our "Response to HumRRO," we criticized the small number of principal surveys (34) that HumRRO seemed to rely upon for the claim that "one of the most positive results of the CAHSEE requirement has been to help schools identify students who need additional help in acquiring essential skills and to implement programs to provide that help" (p. 138).
HumRRO responds:
"Our conclusion that 'one of the most positive results of the CAHSEE requirement has been to help schools identify students who need additional help in acquiring essential skills and to implement programs to provide that help' was not based on this survey but rather the much larger study included in our May 2003 report submitted in response to AB 1609 requirements" (correspondence, Lauress Wise, HumRRO President, 5/2/05).
Our on-going concerns:
a) In many instances in the Year 5 report, including the example cited above, it is not clear which findings are based on the small number of 2004 principal surveys and which are based on the larger number of 2003 interviews. Citations of specific surveys are not included.
b) One of HumRRO's major findings, clearly linked to the above point about increased student support, was based on the small number of principal surveys in 2004. In the conclusion of the Year 5 report, HumRRO states: "Principals reported continued efforts to implement programs and practices to help students who are not prepared to pass the CAHSEE and to promote learning for all students" (General Finding #4, p. 137). The principal data referred to here are responses from 34 interviews. As we noted in our "Response to HumRRO," in a state with over 1,000 high schools, such a small number of responses should not be used to inform policy. Further, in reviewing HumRRO's report, we discovered that the 34 principals who responded to the survey come from schools with very high rates (95%) of teachers credentialed in the subject areas they are teaching. As we argue above, schools with the greatest concentrations of students failing the Exit Exam have far higher rates of uncredentialed teachers. Clearly, the results from the principal survey are not representative of California schools, and general findings and recommendations based on these surveys cannot be trusted.
3) HOW MANY STUDENTS WILL PASS THE EXIT EXAM?
In the Year 5 report, HumRRO claims that passing rates on the Exit Exam are likely to increase by 10% a year and that this level of improvement will result in "approximately the same percentage of students in the Class of 2006 being able to meet the CAHSEE requirement as currently graduate from high school" (p. vii). We asked HumRRO President Lauress Wise to provide the basis for this optimistic projection.
HumRRO’s Explanation:
"The statement in the executive summary of our most recent (Year 5) CAHSEE evaluation report should probably have said that cumulative passing rates increased by about (more than) 10 percentage points. The reference is Tables 2.6 and 2.7 in our Year 4 report (pages 16-17). Cumulative passing rates for the Class of 2004 increased from 73% at the end of 10th grade to 86% at the end of 11th grade for ELA and from 53% to 68% for Math. There were similar increases for each demographic group, including nearly 20 percentage point increases for economically disadvantaged (School Lunch) students and English Learners, although many groups started at a lower point.... If cumulative passing rates for the Class of 2006 started at about 65% and (conservatively) increased 10 percentage points for each of the next two years they would also be at about 85%"
(correspondence, Lauress Wise, HumRRO President, 5/2/05).
Our continued concerns:
Wise speaks only to the pass rates on the ELA and mathematics sections, rather than to the percentage of students who have passed both sections of the Exam. It is possible that HumRRO's 10% figure refers to the rate at which students will pass both sections. But it is not clear why the figure is not greater or less than 10%. Without any model for why we should expect this 10% rate to hold true, we have little reason to be confident in this figure. We also find two logical problems with this projected 10% annual increase in passing rates. First, there is a problem with assuming that rates of increase at the lower end of the score scale (i.e., increases from 55% to 65% passing) will be maintained at the top end (i.e., increases from 75% to 85% passing), where pass rates are likely to increase by a smaller margin. Second, pass rates cannot be expected to increase at the same rate between 10th and 12th grades, because the students who are re-taking the Exam at the end of 11th grade and in 12th grade have not been successful on the Exam thus far. Thus pass rates are likely to increase by a smaller percentage on each subsequent administration of the Exam.
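Both objections can be illustrated with a simple model. Suppose that, instead of cumulative pass rates gaining a constant ten percentage points per year, a fixed fraction of the students who have not yet passed succeeds on each administration. The sketch below compares the two assumptions; the 30% retake success rate is an illustrative assumption, not an estimate.

    # Contrast a constant percentage-point projection (HumRRO-style)
    # with a shrinking-pool model in which a fixed fraction of the
    # remaining non-passers succeeds each year. All rates here are
    # illustrative assumptions, not estimates.
    start = 0.65           # assumed cumulative pass rate after 10th grade
    retake_success = 0.30  # assumed fraction of non-passers who pass each year

    linear = pool = start
    for year in ("11th grade", "12th grade"):
        linear = min(linear + 0.10, 1.0)           # constant +10 points per year
        pool = pool + (1 - pool) * retake_success  # shrinking pool of non-passers
        print(f"After {year}: linear {linear:.0%}, shrinking-pool {pool:.0%}")

    # The shrinking-pool gains fall each year (10.5 points, then 7.4),
    # because each cohort of retakers is smaller and consists of students
    # who have already failed at least once.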
4) WHAT IS THE IMPACT OF THE EXIT EXAM ON DROPOUT RATES?
Our Policy Brief of April 21 questioned HumRRO’s claim that the Exit Exam did not increase dropout rates. IDEA researchers argued that evidence about student enrollment changes prior to the implementation of the “Diploma Penalty” cannot inform the question of whether or not such a penalty will increase dropout rates.
HumRRO responds:
"Students in the Class of 2004 were subject to the CAHSEE requirement until after the end of the 11th grade. The enrollment declines from fall of 10th grade to fall of 11th grade were 6.8 percent for this cohort, compared to 7.7 and 7.9 percent for the two prior high school classes (Table 2.20, page 40 of our Year 5 report). The fall 11th grade to fall 12th grade enrollment decline for this class was 7.7 percent, compared to 8.4 percent for the Class of 2003 and more than 10.5 percent for each of the four preceding classes. For most, although not all, of this year, students in the Class of 2004 were subject to the CAHSEE requirement" (correspondence, Lauress Wise, HumRRO President, 5/2/05).
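The decline percentages cited here are simple ratios of fall enrollment counts. The following is a minimal sketch with invented counts; the actual figures come from Tables 2.20 and 2.21 of HumRRO's Year 5 report.

    # How a grade-to-grade enrollment decline percentage is computed.
    # The enrollment counts below are invented for illustration.
    fall_11th = 480_000  # hypothetical fall 11th-grade enrollment
    fall_12th = 443_000  # hypothetical fall 12th-grade enrollment a year later

    decline = (fall_11th - fall_12th) / fall_11th
    print(f"Enrollment decline, 11th to 12th grade: {decline:.1%}")
    # 7.7% with these invented counts, matching the figure HumRRO reports
    # for the Class of 2004.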
Our continued concerns:
a) Members of the Class of 2004 believed they would be held to the graduation requirement when they first took the Exit Exam. However, 11th graders in the Class of 2004 learned in the summer between their 11th and 12th grade years that they would not be held to the Exit Exam requirement. As a result, students' decisions to return to school for 12th grade were not affected by failure to pass the Exam; dropout figures for this transition from 11th to 12th grade cannot be used as evidence that the Exit Exam did not increase dropout rates. Yet it is at this point that we would most likely see increased dropouts due to the Exit Exam, as 11th graders would be returning for 12th grade having failed the Exam multiple times.
b) In the Year 5 report, HumRRO cites lower dropout rates between 10th and 11th grade for the Class of 2005 as further proof that the Exit Exam has not increased dropouts. Yet the Class of 2005 learned in the summer after their 10th grade year that they would not be held to the CAHSEE requirement; as a result, their decisions to return for 11th grade were not affected by failure to pass the Exit Exam. Therefore, enrollment decline trends for this class cannot be used to assess whether the Exit Exam has an impact on dropouts.
c) In the Year 5 report, HumRRO data show that for the Class of 2006, enrollment declines for the prior year, between 9th and 10th grade, actually increased slightly (p. 39). The HumRRO evaluators note that this increase was due to larger-than-normal retention rates in 9th grade, but no data are presented to support this. Increased retention rates (an issue that is not assessed by the Year 5 evaluation) could lead to increased dropout rates, as research has consistently demonstrated that grade retention is one of the strongest predictors of dropping out.
d) In the Year 5 report, HumRRO suggests that the Exit Exam might be responsible for declining dropout rates in the state. HumRRO's own data show, however, that enrollment declines between 11th and 12th grades have been falling steadily in California since the 1998-99 school year, dropping from 11.6% in 1998-99 to 7.7% in 2003-04 (p. 41, Table 2.21). This steady annual decline suggests that the decrease HumRRO observes for the Class of 2004 is not attributable to supposed "benefits" of the Exit Exam, but rather to other, unidentified, longer-term causes.
CONCLUSION
Are California high schools ready for the Exit Exam? Readiness, as defined in the original legislation, means that schools provide students with the opportunity to be successful on the Exam. Our analysis of data from HumRRO and the California Department of Education points to a sizeable number of high schools (roughly 1 of every 8 in the state) where students have not been provided with this opportunity. HumRRO's evidence does not tell us whether these students have been taught the material on the Exam; the fact that 80% of students who failed the math section say they were not taught some of the material suggests that schools with the highest failure rates have not provided standards-based instruction for all students.

HumRRO offers no solid evidence that schools are doing the required job of identifying and supporting students who initially fail the Exam. The recent evidence that HumRRO provides on this question comes from 34 principals who are clearly not representative of the state as a whole and who do not speak to the conditions in the schools with the highest failure rates. Nor does HumRRO offer adequate support for its conclusions about how many students will ultimately pass the Exam or about the effects of the Exam on dropouts.

Perhaps HumRRO's optimistic predictions will be borne out in the end. But the evidence it provides at present is weak, and weak evidence is not a foundation for sound and just policy making. It is imperative that policy makers have real and substantial evidence before they make decisions that are so consequential to students' futures.
______________________
1 We define severe overcrowding as schools that have more than twice as many students per acre as the state says is appropriate. We define severe teacher shortage as schools with more than 20% uncredentialed teachers.
2 HumRRO's data indicate that 22% of students who failed the math section reported that they had never been taught many items on the Exam, and another 58% said that they had not been taught some items on the Exam. HumRRO Year 5 Report, p. 69, Table 3.20.
INSTITUTE FOR DEMOCRACY, EDUCATION, AND ACCESS
For more information please contact John Rogers at UCLA/IDEA:
1041 Moore Hall • Box 951521 • Los Angeles, California 90095-1521 • www.ucla-idea.org • 310-206-8725