Maryland Science Assessment Letter

September 26, 2008

The Honorable Nancy Grasmick
State Superintendent of Schools
Maryland State Department of Education
200 West Baltimore Street
Baltimore, MD 21201

Dear Superintendent Grasmick:

I am writing regarding our review of Maryland’s general science assessments under the Elementary and Secondary Education Act (ESEA), as amended by the No Child Left Behind Act of 2001 (NCLB).

As outlined in my letter of February 28, 2008, states had to meet four basic requirements in science for the 2007-08 school year. In particular, each state was required to: (1) have approved content standards in science; (2) administer a regular and alternate science assessment in each of three grade spans; (3) include all students in those assessments; and (4) report the results of the regular and alternate science assessments on state, district, and school report cards. Based on the evidence submitted to date, Maryland appears to have met the first, second, and fourth requirements for 2007-08; however, Maryland has not yet submitted data to the Department demonstrating that all students were included in the science assessments for 2008. Please let us know within 10 days of receipt of this letter when Maryland will have that data available so that we can confirm that Maryland has, in fact, met the basic requirements for administering science assessments in 2007-08. States that do not provide the outstanding evidence to verify that they have met the four criteria for the 2007-08 school year have not met the basic requirements of the statute and will be subject to consequences, such as withholding of Title I, Part A administrative funds.

In 2008-09, Maryland must provide evidence for peer review that demonstrates full compliance of its science standards and assessments. In anticipation of that required peer review, Maryland chose to participate in an optional technical assistance peer review in May 2008. I appreciate the efforts that were required to prepare for the technical assistance peer review and hope that the process provided useful feedback to support Maryland’s efforts to monitor student progress toward meeting challenging science standards.

Based on the evidence received from Maryland, which was limited to evidence related to Maryland’s general assessments for science and which was reviewed by the peers and Department staff, we have concluded that Maryland does not yet meet all the statutory and regulatory requirements of section 1111(b)(1) and (3) of the ESEA. Specifically, we have concerns with the technical quality of the Maryland School Assessment (MSA) for science and the High School Assessment (HSA) for biology, including the assessments’ relationship to external variables, as well as with the alignment of the MSA for science and the HSA for biology to grade-level content standards. The complete list of evidence needed to address these concerns is enclosed with this letter. Please note that this list is limited to evidence needed to address the concerns related to Maryland’s general science assessments; Maryland will also need to demonstrate compliance of its alternate science assessment for students with the most significant cognitive disabilities. We have scheduled peer reviews of states’ science assessments for the weeks of October 25 through November 2, 2008, and March 23 through 27, 2009. All materials for review must be provided to the Department three weeks before the scheduled peer review.

Please keep in mind that science assessments represent one piece of a state’s complete standards and assessment system, which also includes general and alternate assessments for reading and mathematics. As stated in our letter to you on June 12, 2006, Maryland’s standards and assessment system is currently designated fully approved. In order for Maryland to remain fully approved, Maryland must demonstrate that all components of its standards and assessment system, including general and alternate assessments for science, comply with all ESEA requirements for standards and assessment systems as administered in 2008-09.

We look forward to working with Maryland to support a high-quality standards and assessment system, of which science standards and assessments are an integral part. If you would like to discuss this further, please do not hesitate to contact Valeria Ford (Valeria.Ford@ed.gov) or Abigail Rogers (Abigail.Rogers@ed.gov) of my staff.

Sincerely,

Kerri L. Briggs, Ph.D.

Enclosure

cc: Governor Martin O’Malley
Ron Peiffer

SUMMARY OF ADDITIONAL EVIDENCE THAT MARYLAND MUST SUBMIT TO MEET ESEA REQUIREMENTS FOR ITS GENERAL SCIENCE STANDARDS AND ASSESSMENTS

2.0 – ACADEMIC ACHIEVEMENT STANDARDS

  1. Evidence that descriptions of competencies associated with each achievement level have been finalized and that they reference specific grade-span content.
  2. Evidence of diverse representation in development of alternate achievement standards in science.

3.0 – FULL ASSESSMENT SYSTEM

  1. Evidence of the comparability of online and paper-and-pencil forms based on the operational Maryland School Assessments (MSA) and High School Assessments (HSA). The analyses must include checks of the decision consistency of student classifications at the performance levels.
  2. Evidence of the comparability of the multiple operational MSA forms used in 2008.
  3. Documentation that the MSA and HSA assessments measure higher-order thinking.

4.0 – TECHNICAL QUALITY

  1. Evidence of the consequential validity of the MSA.
  2. Evidence that reporting structures are consistent with the sub-domain structures of the MSA.
  3. Evidence that the MSA is appropriately related to internal or external variables, and evidence that both the MSA and HSA are related to external variables (e.g., other tests, student grades).
  4. Analyses of reliability of the MSA based on the 2008 results.
  5. Documentation of the use of differential item functioning (DIF) analysis results to correct or eliminate items that exhibited bias.
  6. A plan for monitoring item bias and improving the tests over time.
  7. Evidence that accommodations used during administration of the MSA and HSA yield meaningful scores.
  8. Evidence of how consistency of forms over time will be ensured.
  9. Operational criteria for the administration, scoring, analysis, and reporting components of Maryland’s assessment system.
  10. Documentation of procedures for monitoring the ongoing quality of the assessment system.
  11. Documentation that the monitoring of accommodations is occurring (e.g., summary monitoring reports, lists of audits conducted, etc.).
  12. Evidence of the validity of scores for students who received accommodated administrations, including results from operational assessments indicating that policies and procedures have been followed.

5.0 – ALIGNMENT

  1. Evidence of alignment of the operational MSA and content standards.
  2. Evidence demonstrating the cognitive challenge of the MSA and HSA tests, as well as a rationale for the use and placement of item types (brief constructed response [BCR] and selected response [SR]).
  3. Documentation that the operational assessments reflect the same degree and pattern of emphasis as are reflected in the state’s academic content standards.
  4. Detailed assessment specifications or a more complete description of the test development process and a description of how the assessment reflects both the content knowledge and skills specified in the academic content standards for both the MSA and the HSA.

6.0 – INCLUSION

  1. Actual enrollment data and the number or percentage tested during 2007-08, for all students and for each subgroup, on the MSA and HSA.

7.0 – REPORTING

  1. Evidence that a summary report including the number of students enrolled and tested/not tested is produced for science.
  2. Reports of participation and assessment results for all students (including migrant students) and for each of the required subgroups at the school, district, and state levels.
  3. Evidence that science assessment results are readily available to all parents, teachers, and principals.
  4. Evidence that schools are delivering reports to parents, teachers, and principals as soon as is practicable after the assessments are administered.
  5. Final reports used for 2008 results.
