A brief summary/overview of the SBE Agenda. As before, commentary will be in italics. Please share back comments, observations, objections, etc.

This board meeting's agenda is relatively short, though it includes a larger-than-usual number of waiver requests, coming either from districts or from charter school applicants whose petitions were rejected by districts and county offices for various reasons.

Most of the rest of the agenda deals with work continuing on the new 'dashboard' accountability system, and on making that 'fit' federal legislative requirements (from the latest iteration of the Elementary and Secondary Education Act, known as the Every Student Succeeds Act, which replaced No Child Left Behind... which was everyone's favorite). At present, test scores in English and math are the main basis for accountability calculations on the dashboard, especially for K-8 schools. History-Social Science (HSS) is not included in statewide accountability. Student achievement is measured by test scores on the Smarter Balanced Assessment Consortium (SBAC) tests in English language arts and math. Other criteria include attendance and, for high schools, graduation rates, Advanced Placement/International Baccalaureate enrollments and pass rates, and completion of the classes required for college admission.
Various aspects of accountability dominate this SBE agenda. Work continues on determining criteria for 'local measures' in some of the LCAP state priority areas. Precisely how accountability will work may impact social studies education substantially, as was the case during the roughly 15 years of the Academic Performance Index and the federal Adequate Yearly Progress reports. Hence the close attention here to that ongoing discussion at the SBE.
Item 01 is the 8th update explaining to the SBE how the work of designing external support for schools and districts that fall short on the new state accountability report, the 'Dashboard,' is coming along. The SBE will hear how the new system is impacting districts so far. At present, there are three levels of 'assistance' for schools and districts, depending on how much orange and red coloring shows up on the 'dashboard' reports and for how many years those colors have not changed much. At present, SBE is told here, 43 County Offices of Education are 'providing support' to 223 districts statewide, apparently at the second 'assistance' level. The California Department of Education (CDE) explains that California has 528 elementary districts, 76 high school districts (some of which include middle schools), and 344 unified school districts. Thus about a fourth of all districts are receiving some degree of external 'assistance' at present. This number could dramatically increase when the 'school growth' measure reports, currently planned for the fall of 2018, are added to the accountability Dashboard.
Item 02 explains the recommendations from Educational Testing Service (ETS), the SBAC's testing company, for adding the legislatively required 'school growth' measure, growth on test scores, to the Dashboard reporting system. The SBE is being shown three ways ETS suggests calculating growth from scores, each of which is called 'very complex' and will require SBE study over the next several SBE meetings. ETS's analysis suggests that the method it calls 'residual gain' is the most accurate of the three.
Reporting 'growth' in test scores presents a hurdle in part because the tests are different at each grade level (for the most part) in both ELA and math. Scaled score reports (which are used for SBAC test score reports) show how a student scored compared to the other test takers on the bell curve of each test. So a fifth grader earning a scaled score at the mean of all fifth-grade test takers, who also scored at the mean when in 4th grade, has made a normal year's growth even though the student's position relative to peers is the same each year.
 
ETS's 'residual gain' model, explained in Attachment 1 to Item 02, calculates the difference between an individual student's scaled test score and an 'expected' test score for that student, which is determined (ready for this?) by calculating the difference (or distance) between a student's current year score and a linear regression line relating the prior year test scores to the current year test scores, for each grade level in each tested subject, for all students tested in both years. (Whew. Clear, huh? Who will explain THAT to parents...? Go to the SBE Agenda for May 2018, Attachment 1 for Item 02, to examine several scatter plot charts provided by ETS showing how this would work for an individual student. Mercifully, the ETS scatter plot charts sent to SBE would not paste into this email.)
The average (mean) of these differences would show a school's growth or lack of it, and determine the colors on the school's dashboard 'school growth' report.
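For the technically curious, here is a minimal sketch in Python of the 'residual gain' idea. This is my own illustration of the concept, not ETS's actual model; the function name and the use of numpy's polyfit are my assumptions.

    import numpy as np

    def residual_gains(prior_scores, current_scores):
        # Fit a least-squares line predicting current-year scaled scores
        # from prior-year scaled scores, across all tested students.
        prior = np.asarray(prior_scores, dtype=float)
        current = np.asarray(current_scores, dtype=float)
        slope, intercept = np.polyfit(prior, current, 1)
        # Each student's 'expected' score sits on that regression line.
        expected = slope * prior + intercept
        # Residual gain: actual score minus expected score.
        return current - expected

    # Example with made-up scaled scores for four students:
    gains = residual_gains([2380, 2440, 2500, 2550],
                           [2410, 2445, 2520, 2540])
    school_growth = gains.mean()  # a school's mean residual = its 'growth'

Note that across ALL students statewide, the residuals average out to zero by construction, so roughly half of all students will land below the line; a particular school's mean residual can still be positive or negative.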
In the ETS modeling scatter plot chart provided to SBE, which uses real test result information, of note is that roughly as many scores fall below the regression line as above it, which is exactly what least-squares regression produces: the residuals average out to zero. Many, many students will thus show 'negative' growth on this measure, and many schools and districts will see orange and red colors on their dashboard 'school growth' reports, both for 'school wide' and subgroup results, leading to more intervention 'support'. As mentioned above, curricular narrowing can be expected over time. In the past, content areas not included in accountability were the ones 'narrowed.'
 
Including HSS student learning measures in several LCAP state priority areas is a way to keep HSS in accountability, and one way to avoid having HSS be 'narrowed'. Whether or not to add these measures to LCAPs remains, as of now, a district-level decision.
 
Jim Hill
Governmental Relations Committee (member)
California Council for the Social Studies