
Awards (2013)

Maryland State Department of Education

Abstract – Enhanced Assessment Instruments Grants Program:
Kindergarten Entry Assessment Competition

Overview of the proposed project
The proposed Consortium of seven States (Connecticut, Indiana, Maryland [fiscal agent], Massachusetts, Michigan, Nevada, and Ohio) and three partner organizations (WestEd, the Johns Hopkins University Center for Technology in Education, and the University of Connecticut's Measurement, Evaluation, and Assessment Program) has a compelling vision for enhancing a multistate, state-of-the-art assessment system composed of a kindergarten entry assessment (KEA) and aligned formative assessments. This enhanced system, supported by expanded use of technology and targeted professional development, provides valid and reliable information on each child's learning and development across the essential domains of school readiness; this information will lead to better instruction, more informed decision-making, and reductions in achievement gaps. The Consortium recognizes that achieving this vision will be challenging, requiring high levels of commitment, technical expertise, collaboration across member States and partners, and strong management skills, systems, and supports. Building on a highly successful effort already underway between Maryland and Ohio, the proposed enhanced system greatly expands the use of technology for more authentic and compelling items and tasks; efficiency of administration, scoring, and reporting; and increased student motivation. The end result will be a more reliable and valid system that provides timely, actionable data to identify individual student and program strengths and weaknesses, drive instruction, support curricular reform, and inform all stakeholders in the system about the effectiveness of preschool and kindergarten programs.

Project objectives and activities

  • Establish the governance and management infrastructure for the proposed work;
  • Develop the KEA and formative assessments (for children aged 36–72 months), to be fully
    implemented in all Consortium States;
  • Conduct all necessary and appropriate studies to ensure reliability, validity, and fairness of the
    assessment system;
  • Develop and implement professional development for the administration and use of the
    assessments;
  • Develop and deploy the necessary technology infrastructure; and
  • Implement stakeholder communication and measure the impact of the KEA and formative assessments on learning.

Proposed project outcomes
By the 2016–17 school year, the Consortium will provide an assessment system that:

  • includes strategic use of a variety of item types to assess all of the essential domains of school
    readiness, with each domain making a significant contribution to students’ overall comprehensive scores;
  • produces reliable, valid, and fair scores, for individual children and groups/subgroups, that can be used to evaluate school readiness, guide individualized instruction, and better understand the effectiveness and professional-development needs of teachers, principals, and early-learning
    providers;
  • is designed to incorporate technology in the assessment process and the collection of data, and is cost-effective to administer, maintain, and enhance; and
  • includes a KEA that can be a component of a State’s student assessment system, including the
    State’s comprehensive early learning assessment system, and can provide data that can be incorporated into a State’s longitudinal data system.

North Carolina Department of Public Instruction

Abstract – Enhanced Assessment for the Consortium (EAC) Project
Submitted by North Carolina’s Department of Public Instruction
(CFDA 84.368A)

The North Carolina Department of Public Instruction (NC DPI), along with eight other Consortium states (AZ, DE, DC, IA, ME, ND, OR, RI), one collaborating state (SC), and three research partners (SRI International, the BUILD Initiative, and Child Trends), will enhance NC's K-3 formative assessment, which includes a Kindergarten Entry Assessment (KEA). The Consortium believes that a KEA as part of a K-3 formative assessment will provide more meaningful and useful information for teachers than a stand-alone KEA. The Consortium proposes to enhance the K-3 assessment, including the KEA, because a single snapshot of how a child is functioning at kindergarten entry has limited value and creates an implementation challenge, since teachers prefer information that can guide instruction for the entire school year. Furthermore, a good KEA must include content that extends beyond kindergarten to capture the skills of higher-functioning children, so enhancing an assessment that covers kindergarten entry through Grade 3 produces a significantly more useful assessment at marginal additional cost.

The NC K-3 assessment being developed under the state's RTT-ELC grant will be enhanced by: (a) aligning the content of the NC assessment to standards across the Consortium and enhancing the validity of the assessment through evidence-centered design (ECD) and universal design for learning (UDL); (b) incorporating smart technologies for recording and reporting to reduce the assessment burden on teachers; and (c) expanding the utility of the assessment to a broader range of users by soliciting and incorporating input from stakeholders in the other Consortium states into the design of the assessment.

The project will be led by NC DPI with a management team that includes the three research partners (SRI, BUILD, and Child Trends), who will work together to provide overall leadership and coordination for the project. Project work has been organized around seven major activity areas: (1) overall project management; (2) across- and within-state stakeholder engagement, including support for implementation planning; (3) application of ECD/UDL to the assessment content; (4) enhancement of professional development materials; (5) pilot and field testing; (6) psychometric analyses and performance levels; and (7) technology. Each activity team will be led by either NC DPI or one of the research partners, and many of the teams will include staff from more than one organization to facilitate cross-project coordination.

The Consortium states will play a significant role in the development of the enhanced assessment. All Consortium states will undertake Tier 1 activities, including participating in regular consortium calls and meetings; sharing state-developed early childhood and K-3 assessment-related materials, including standards; providing input into the review of assessment-related materials; and conducting broad stakeholder outreach activities. Some Consortium states will engage in additional Tier 2 activities, including participating in the ECD/UDL co-design teams; pilot testing the assessment content; pilot testing assessment supports such as technology enhancements and reporting formats; field testing the assessment; convening state experts to review assessment-related materials; and conducting more in-depth stakeholder engagement activities.

The primary outcome of this project will be an enhanced formative K-3 assessment that includes a KEA that provides powerful information for improving student outcomes. The EAC will be a developmentally appropriate, observation-based formative assessment based on learning progressions that teachers use to guide instruction across the five domains of development and learning. Smart technologies built into the EAC will assist teachers with documentation and scoring, minimizing teacher burden, increasing reliability, and maximizing the EAC’s utility so that teachers can use it on a regular basis to inform instruction. Additionally, the EAC will provide meaningful and useful information to the students and families. Students will receive developmentally appropriate information to show where they are in their learning and where they need to go next. Families will contribute evidence for the assessment and will receive information to assist in supporting their child’s development and learning. Finally, the KEA will produce a child profile of scores across the five domains. The KEA child profile data will be useful in the aggregate for principals, district and regional administrators, state policymakers, and advocates to inform programmatic decisions around curriculum, professional development, policy development, and resource allocation. In addition, the KEA will be the first assessment point within a K-3 formative assessment system that will inform instruction and learning, improving student achievement.

Texas Education Agency

Abstract: The Texas Kindergarten Entry Assessment System:
Proposed by the Texas Education Agency

The Texas Education Agency (TEA), in collaboration with The University of Texas Health Science Center's Children's Learning Institute (CLI) – and backed by the Texas Association of School Boards, the Texas Association of School Administrators, and a network of renowned experts from the University of Miami, New York University, the University of Denver, the University of Virginia, the University of Texas at Austin, Michigan State University, and the University of Kansas – proposes to implement an ambitious and achievable Texas Kindergarten Entry Assessment System (TX-KEA) that enhances the quality and variety of assessment instruments and systems used by Texas' 1,227 school districts serving 5,075,840 total students, including up to 400,000 incoming kindergarten students across 4,342 elementary campuses annually.

The TEA, throughout this proposal, has set the bar high in terms of its six proposed goals for its assessment system. These goals revolve around providing innovative and flexible, technology-driven assessment solutions designed to measure student achievement at kindergarten entry across multiple domains. Addressing the U.S. Department of Education’s Absolute Priorities 1, 2, 4, and 5, these goals include: (1) construct item pools with good content validity for assessing nine domains of school readiness in English or Spanish; (2) scale items within a heterogeneous sample of socio-linguistically diverse students; (3) select items for paper-pencil and computerized versions; (4) evaluate reliability, validity, sensitivity, and fairness of the TX-KEA; (5) develop a technology platform for the TX-KEA and integrate with the state’s longitudinal data system; and (6) develop, launch, and coordinate a comprehensive information and training system for teachers and administrators.

This proposal is anchored in an understanding of the assessment needs of Texas and other states. Through a systematically designed risk and project management approach, TEA and its collaborators will develop assessment and data reporting solutions that optimize outcomes for schools, teachers, administrators, parents, community stakeholders, and ultimately, children. TEA has assembled an experienced team with the full array of expertise and experience required to develop and implement the TX-KEA successfully. We have proposed an officer-in-charge, Dr. Susan Landry, who has worked across the nation to advance changes in assessment, teaching, and learning, which have led to unprecedented achievements for school leaders, teachers, families, and children. We also have proposed a project director, Dr. Jason Anthony, who is a renowned expert in language and literacy as well as the development and implementation of cutting-edge assessments. Additional experts with exceptional technical knowledge and skills, and academic faculty with strong experience and expertise in assessment and child development, complement the team.

Building on a national reputation for high-quality early childhood education – as evidenced by the success of the Texas School Ready! Project, one of the nation's only scaled, comprehensive school readiness interventions – combined with the successful development and launch of its innovative longitudinal data initiative, the Texas Student Data System (TSDS), TEA is poised to lead the nation and benefit other states by building a kindergarten entry assessment system that will promote comprehensive analyses of student school readiness and support the ability of teachers, administrators, and parents to be responsive to multiple domains of student strengths and needs.

Awards (FY 2011)

Kansas State Department of Education

Abstract

Accessibility of Technology-Enhanced Assessments (ATEA)

The ATEA project will investigate the accessibility of technology-enhanced item and task types
for students with vision and/or motor disabilities. These students are among the most difficult to
accommodate on computer assessment systems. Historical accommodations include alternate forms, such as Braille or large print paper-and-pencil tests, and alternate means of presentation and response, such as the use of readers, scribes, and assistive technology. Many of these students participate in alternate assessments, which are often individualized and non-standardized. Cognitive load may be higher with accommodations such as tactile graphics, oral presentation, and dictation to a scribe. Physical effort may be greater when reading Braille or operating an eye-gaze or sip-and-puff computer interface. The time required to complete an assessment may be longer and result in greater fatigue. Technology-enabled accessibility features for these students have not yet been tested. The comparability of scores and score inferences with these assessment adaptations has not been evaluated.

These topics will be investigated with the intention of benefiting the five major assessment
consortia. Planned technology-enhanced item types will be identified. Teacher review panels representing ATEA states will assist in evaluating the accessibility of items and tasks and developing means to improve accessibility. Cognitive labs will permit individualized examination of technology-enabled accessibility features and accommodations. Large-scale data collection across the ATEA consortium states will result in analyses of item difficulty, differential item functioning, and score comparability.
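
For illustration only, the sketch below shows one conventional way a differential item functioning (DIF) analysis of the kind mentioned above can be computed for a single dichotomous item, using the Mantel-Haenszel procedure with total test score as the matching variable. It is not drawn from the ATEA proposal; the function, the ETS delta flagging convention, and the simulated data are included purely as an example of the technique.

    import numpy as np

    def mantel_haenszel_dif(item_correct, group, total_score):
        """Mantel-Haenszel DIF statistic for one dichotomous item.

        item_correct : 0/1 responses to the studied item
        group        : 0 = reference group, 1 = focal group (e.g., students
                       using a technology-enabled accessibility feature)
        total_score  : matching variable, typically the total test score
        Returns the common odds ratio and the ETS delta value
        (|delta| >= 1.5 is a conventional flag for large DIF).
        """
        item_correct = np.asarray(item_correct)
        group = np.asarray(group)
        total_score = np.asarray(total_score)

        num = den = 0.0
        for s in np.unique(total_score):      # one 2x2 table per score stratum
            m = total_score == s
            a = np.sum((group[m] == 0) & (item_correct[m] == 1))  # reference, correct
            b = np.sum((group[m] == 0) & (item_correct[m] == 0))  # reference, incorrect
            c = np.sum((group[m] == 1) & (item_correct[m] == 1))  # focal, correct
            d = np.sum((group[m] == 1) & (item_correct[m] == 0))  # focal, incorrect
            n = a + b + c + d
            if n > 0:
                num += a * d / n
                den += b * c / n

        odds_ratio = num / den
        delta = -2.35 * np.log(odds_ratio)    # ETS delta metric
        return odds_ratio, delta

    # Hypothetical usage with simulated responses
    rng = np.random.default_rng(0)
    n_students = 1000
    group = rng.integers(0, 2, n_students)
    total_score = rng.integers(0, 21, n_students)
    item_correct = (rng.random(n_students) < total_score / 20).astype(int)
    print(mantel_haenszel_dif(item_correct, group, total_score))

Analyses of item difficulty and score comparability across conditions would typically draw on additional methods (for example, item response theory linking), which are beyond this sketch.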

Project outcomes include a catalog of accessible technology-enhanced item and task types with
guidelines for maximum access, a comprehensive description of student characteristics, data on student performance and the comparability of scores, and procedural documentation. The project's National Advisory Board will include experts who also serve on the technical advisory committee of at least one of the major assessment consortia. Edvantia will conduct the external evaluation. Kansas, Utah, Wisconsin, West Virginia, Michigan, and Missouri will participate. Additional states are interested and plan to participate, but did not have time to sign the Memoranda of Understanding before submission.

Maryland State Department of Education

Abstract

Guidelines for Accessibility of Assessments Project (GAAP)

The Guidelines for Accessibility of Assessments Project (GAAP) is a collaborative effort to develop, research, and implement guidelines that will be used to make assessment items and tasks developed using the Common Core State Standards (CCSS) accessible to students requiring spoken and signed representation of content. Currently, there are no standard, accepted best practices for representing content in spoken (henceforth, audio) or signed form. With the adoption of digital test delivery and of standards such as the Accessible Portable Item Profile (APIP), there is an opportunity to develop nationwide consensus on best practices and for state assessment programs and assessment consortia to apply these practices in a consistent manner, thus enabling greater access for students and increasing the validity of test-score-based inferences about students' academic proficiency.

The GAAP project will focus on audio and sign guidelines for English Language Arts and mathematics. The development of audio guidelines will be informed by the currently funded EAG and OSEP projects, current state practices, and initial work performed by the PARCC and Smarter Balanced assessment consortia. Sign guidelines will be informed by current state practice in states such as Massachusetts and South Carolina, by native signers, by deaf K-12 mathematics educators, and by higher education sign experts.

GAAP involves a consortium of 18 states (Utah, Vermont, New Hampshire, Arizona, Connecticut, Rhode Island, Minnesota, Maine, Michigan, Montana, Idaho, Kansas, North Carolina, Washington, Colorado, South Carolina and Oregon) led by the Maryland State Department of Education. The GAAP Consortium will collaborate with Measured Progress accessibility experts, National Center for Educational Outcomes evaluation experts, WGBH’s National Center for Accessible Media audio accessibility experts, CCSS content experts, and nationally recognized sign leaders in an iterative process that will include 1) development of audio and sign guidelines, 2) application of guidelines to CCSS items, 3) state, expert, and advisory board member review of guidelines and application to sample items, and 4) research with students who regularly use audio or signed supports for assessment. The resulting guidelines and sample item representations will be widely disseminated and made publicly available.

Oregon Department of Education

Abstract

English Language Proficiency Assessment for the 21st Century (ELPA21)

The English Language Proficiency Assessment for the 21st Century consortium (ELPA21), led by Oregon as the governing state in partnership with twelve other states, Stanford University, and CCSSO, has formed to develop an English Language Proficiency Assessment that is aligned to the Common Core State Standards (CCSS). ELPA21’s proposed assessment design is intended to ensure the valid, reliable, and fair assessment of the critical elements associated with English language acquisition and mastery of the linguistic skills linked to success in mainstream classroom environments. In addition, ELPA21’s proposed assessment will support ongoing improvements in instruction and learning that are useful for all members of the educational enterprise, including students, parents, teachers, school administrators, members of the public, and policymakers. This assessment will incorporate principles of Universal Design and will comply with Accessible Portable Item Profile (APIP) standards. ELPA21 development will be based upon the prior successes of member states (for example, the Kansas writing tool, the
Michigan diagnostic screener, test items from Iowa and Louisiana, and online test delivery specifications from Oregon).

The deliverables for the diagnostic screener and summative components of ELPA21 will include open-source performance level descriptors, item banks for practice and for operational delivery, a psychometric scale, performance levels (cut scores), test design and delivery specifications, test specifications and blueprints, professional development resources, and administration and security protocols. Participating states that are currently part of the PARCC, Smarter Balanced, NCSC, DLM, and ELPA21 consortia will strive to work with these consortia to maximize compatibility and interoperability across user platforms. These resources, as well as model Request for Proposal language, will be available to states for use (individually or in multi-state partnerships) to contract with vendors for operational assessment in the 2016-2017 school year.

Awards (FY 2007)

District of Columbia

The Development of Alternate English Language Proficiency Assessment Procedures for English Language Learners with Significant Disabilities

The Washington, DC Office of the State Superintendent of Education (OSSE), on behalf of the 17-state World-class Instructional Design and Assessment (WIDA) Consortium, proposes to develop and implement a feasible, accessible, valid, and efficient standards-based English language proficiency (ELP) alternate assessment system that yields technically sound results and facilitates the inclusion of English language learners (ELLs) with significant disabilities in educational accountability systems across the WIDA Consortium. This performance-based alternate assessment system will complement and parallel the University of Wisconsin–Madison and WIDA's evidence-based collection alternate ELP approach that is currently being field-tested within WIDA Consortium states. This new performance-based approach will give WIDA states the flexibility to implement alternate ELP assessments consistent with their existing alternate academic content assessments. The WIDA Consortium, which is located within the Wisconsin Center for Education Research (WCER) at the University of Wisconsin–Madison, will lead the development of this assessment through a cooperative agreement with the Washington, DC OSSE.

The WIDA Consortium, originally established with funding from a U.S. Department of Education Enhanced Assessment Grant, currently includes Washington, DC and 16 additional states. Combined, the 17 WIDA partner states enroll approximately 550,000 K-12 ELLs. Since 2003, WIDA has created and adopted comprehensive English language proficiency standards (2004, 2007) that represent the second language acquisition process and the language in the content areas of language arts, mathematics, science, and social studies. Based on these standards, WIDA developed a K-12 ELP test battery, ACCESS for ELLs, which approximately 420,000 students took in spring 2007. ACCESS for ELLs is currently used by more states than any other ELP measure. WIDA also provides professional development activities and maintains a Web site (www.wida.us).

The proposed alternate ELP assessment system, named the Alternate ACCESS for ELLs with Significant Disabilities, will be designed to (a) meet the accountability requirements of the No Child Left Behind Act of 2001 and the Individuals with Disabilities Education Improvement Act of 2004, (b) meet the technical requirements of the Standards for Educational and Psychological Testing (American Educational Research Association, American Psychological Association, & National Council on Measurement in Education, 1999), (c) facilitate the involvement of ELLs in participating states' accountability systems, (d) provide a method for monitoring the ELP growth of ELLs with significant disabilities, and (e) provide guidance to individualized education program (IEP) teams in developing appropriate language proficiency IEP goals and objectives.

The development of this parallel form of the Alternate ACCESS for ELLs will follow key principles that require the assessment to (a) identify and assess skills that are critical to language proficiency development; (b) be aligned with the WIDA Consortium's language proficiency standards; (c) be sensitive to student growth and accurately reflect students' abilities in language areas; (d) lead to instructional opportunities that meet student needs; (e) provide reliable and valid results; (f) be non-biased and sensitive to cultural differences; (g) produce results that are helpful to teachers, parents, and administrators in making educational decisions; and (h) be time- and resource-efficient, as well as consistent with participating WIDA Consortium states' existing academic content alternate assessment systems.

A multi-part investigation using a multi-method, multi-source approach will be used to develop and evaluate the alternate assessment. Throughout all phases of the proposed alternate ELP assessment development and validation procedures, the involvement of WIDA Consortium states will increase the likelihood of participation by a culturally, geographically, and structurally diverse population of schools, which will in turn increase the generalizability of findings from the project. We anticipate collecting data from a minimum of 300 schools within the Consortium, including schools from each WIDA Consortium state.

Although a variety of academic content assessments are available for use with ELLs with significant disabilities, there currently exist no alternate ELP assessments for ELL students with significant disabilities. Consequently, this project will advance theory, knowledge, and practice in the fields of assessment and instructional programs for ELLs with significant disabilities. We anticipate that the development and use of alternate ELP assessments for ELLs in the beginning stages of English language acquisition will prove to be a valid, reliable, and equitable way to assess the English language proficiency of ELLs with significant disabilities.

Minnesota

Modifications for a Better Assessment of What Students with Disabilities
Know and Can Do

The state of Minnesota, in collaboration with the states of Ohio and Oregon and with the American Institutes for Research, proposes a research and design study to improve our planned Alternate Assessment of Modified Achievement Standards (AA-MAS). The AA-MAS targets persistently low-performing students with disabilities.

Our proposed project addresses Absolute Priority 1 by collaborating with the American Institutes for Research and university-based cognitive psychologists to improve the reliability and validity with which state assessments can measure the academic achievement of students with a variety of disabilities whose skills are not appropriately measured through the general education assessment or the alternate assessment based on alternate achievement standards (AA-AAS). Although an AA-MAS is permitted under ESEA for this population, it is not required, and these improvements will extend the reliability and validity of the tests for this population beyond the ESEA requirements.

Development of our AA-MAS builds on the idea that deficits in specific cognitive traits may impede student performance on assessment tasks. Under an existing General Supervision Enhancement Grant (GSEG), our consortium is currently designing modifications to remove specific impediments that arise from deficits in working memory, executive function (planning), and focused and sustained attention. Successful modifications will lower barriers imposed by these deficits while minimizing changes to the construct being measured, thereby enabling students with disabilities to more accurately show what they know and can do.

This proposal describes a field test designed as an experiment. Students in the experimental group and the control group will respond to assessments including both modified and unmodified test items. Qualified evaluators will also administer brief, validated screeners to measure the four cognitive traits, deficits in which are often associated with cognitive disabilities. Analysis of the data will evaluate whether the modifications render the assessments more accessible without simply making them easier.
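
As a purely illustrative sketch of the kind of analysis described above (not the consortium's actual analysis plan), one common check is a "differential boost" comparison: if the modifications improve access rather than simply making items easier, students whom the screener flags for the targeted deficit should gain more from the modified items than students who are not flagged. The function name and the simulated data below are hypothetical.

    import numpy as np
    from scipy import stats

    def differential_boost(modified_scores, unmodified_scores, flagged):
        """Compare the gain from item modifications between students flagged
        by the cognitive screener and students who are not flagged.

        modified_scores, unmodified_scores : per-student proportion correct
            on the modified and unmodified item sets
        flagged : boolean array, True if the screener indicates the targeted
            deficit (e.g., in working memory or attention)
        """
        boost = np.asarray(modified_scores) - np.asarray(unmodified_scores)
        flagged = np.asarray(flagged, dtype=bool)
        t_stat, p_value = stats.ttest_ind(boost[flagged], boost[~flagged],
                                          equal_var=False)   # Welch's t-test
        return boost[flagged].mean(), boost[~flagged].mean(), t_stat, p_value

    # Hypothetical usage with simulated data in which flagged students gain
    # more from the modifications than non-flagged students.
    rng = np.random.default_rng(1)
    n_students = 400
    flagged = rng.random(n_students) < 0.3
    unmodified = rng.normal(0.55, 0.15, n_students) - 0.10 * flagged
    modified = unmodified + 0.02 + 0.08 * flagged + rng.normal(0, 0.05, n_students)
    print(differential_boost(modified, unmodified, flagged))

A similar boost for both groups would suggest the modification changed item difficulty for everyone rather than removing a barrier for the targeted students.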

Our design directly addresses the Secretary's three competitive preference priorities by 1) promising significant advancement in our understanding of how to validly test students with disabilities through alternate assessment and accommodations; 2) collaborating in this effort with a three-state consortium; and 3) proposing a dissemination plan that will reach all state assessment programs.

Nevada

Integrating Simulation-Based Science Assessments
into Balanced State Science Assessment Systems

The Nevada State Department of Education will lead a collaboration of seven states (Connecticut, Massachusetts, Nevada, North Carolina, Utah, Washington, and Vermont) to study the feasibility of integrating computer simulation-based science assessments into balanced state science assessment systems. The collaboration will take place in partnership with WestEd, the Council of Chief State School Officers (CCSSO), and the Center for Research on Educational Standards and Student Testing (CRESST) at UCLA. The purpose of the project is to support the assessment of science knowledge and inquiry strategies not typically well measured in paper-based, large-scale science tests by implementing local technology-based formative, curriculum-embedded, and end-of-unit benchmark science assessments that can augment district and state science test evidence of progress on science standards. The goals of the project are to (1) study the technical qualities of the simulation-based science assessments; (2) examine the feasibility and utility of the assessments for formative, summative, and accountability purposes; (3) examine the effects of the simulation-based assessments for all students, English learners, and students with disabilities; and (4) propose alternative models for integrating simulation-based assessments into state science assessment systems.

Four of the states (NV, NC, UT, WA) will pilot test the assessments in three demographically diverse districts per state, involving 108 teachers and approximately 10,800 students. The project will study how the rich environments, multiple representations, flexible response formats, and accommodations in the technology-enhanced science assessments benefit the performance of disadvantaged students, English learners, and students with disabilities. CCSSO and Nevada will lead a State Science Assessment Design Panel of all seven states to monitor the pilot testing and develop specifications and models for integrating simulations into balanced state science assessment systems.

New Hampshire

Examining the Feasibility, Effect and Capacity to
Provide Universal Access through Computer-Based Testing

The project proposed here seeks to examine the feasibility, effect, and capacity to deliver state achievement tests using a computer-based test delivery system specifically designed to provide universal access to test content for students with disabilities or special needs. The proposed project is a direct outgrowth of prior work, supported by EAG funding and conducted by New Hampshire, Vermont, and Rhode Island, in which the feasibility of using computers to provide specific test accommodations was examined. Based on this prior work, members of the New Hampshire Department of Education Curriculum and Assessment program conducted a statewide pilot test in which the interface used for the prior EAG project was used to provide a read-aloud accommodation to students for its 2006 grade 10 mathematics test. This successful pilot led to a collaborative effort with Nimble Assessment Systems to develop a comprehensive test delivery system that employed principles of universal design to flexibly meet the accessibility and accommodation needs of individual students. The proposed project brings together 11 states to examine the feasibility and effect of using this comprehensive test delivery system to improve test validity for students with disabilities and special needs who are believed to benefit from one or more of the accessibility and accommodation tools built into the system. Specifically, members of this collaborative project include: New Hampshire, Vermont, Rhode Island, South Carolina, North Carolina, Georgia, Montana, Iowa, Connecticut, Maryland, and Florida. In addition, the proposed project includes partnerships with the National Center for Educational Outcomes, Nimble Assessment Systems, and the NECAP state contractor (currently Measured Progress).

The proposed project will undertake five major initiatives:

  • Conduct a set of three computer-based test accommodation efficacy studies.
  • Employ the UAS to deliver the operational Grade 11 Mathematics, Reading, and Writing tests, and the operational Grade 4, 8, and 11 Science tests.
  • Develop and validate a school computer-based test delivery capacity index.
  • Analyze school computer-based test delivery capacity for schools in participating states.
  • Conduct a cost analysis for preparing for and delivering a test using a computer-based test delivery system with embedded accommodations.

Given that three members of the collaborative project have jointly developed achievement tests that are used across their respective states, the project will focus specifically on the NECAP tests. This will enable efficient replication of the study findings across multiple states while allowing all participating states to refine research questions, analyses, and the development of the School Capacity Index. Collectively, the number and variety of studies undertaken through this project holds promise to rapidly advance assessment practices within each of the participating states, while also informing the practices in non-participating states.

Utah

Alternate Assessment Design—Mathematics (AAD-M)

The Utah State Office of Education is the applicant for the proposed Alternate Assessment Design—Mathematics project. With technical support from SRI International, the states of Utah, Idaho, and Florida will collaborate to achieve the following goals: (1) extend the conceptual framework of evidence-centered design (ECD) to alternate assessment based on alternate achievement standards (AA-AAS) using the Principled Assessment Design for Inquiry (PADI) model and (2) develop AA-AAS testing designs, blueprints, and assessment task specifications that address priority state academic standards in mathematics for students with significant cognitive disabilities.

The collaborating states have AA-AAS systems for students with significant disabilities (1%), and they have completed one peer review cycle (NCLB). They are now refining their assessments to improve technical quality. The states are seeking to improve quality by developing a system of structured or standardized performance tasks that are aligned with grade-level academic content, boosting expectations for student achievement, and increasing access to grade-level academic standards using ECD. The states are also seeking to increase the reliability of their alternate assessment systems.

Two obstacles hinder achieving these intentions: few states have documented the procedures or the rationale they used to set priorities for what is tested in alternate assessment (the content domains), and few have used a systematic process to examine the characteristics of the content to be tested to guide the design of the assessment and its tasks or items. The proposed project will address these obstacles by taking the next logical step to integrate practice with grounded measurement principles: emulating and extending an ECD approach to designing alternate assessments (AA-AAS).

The foundation for this project, ECD, is a practical theory-based approach to developing quality assessments that combines developments in cognitive psychology and advances in measurement theory and technology. The purpose of the project is to apply ECD, as it is manifested in the PADI model, to designing, delivering, and scoring assessments of the academic achievement of students with significant cognitive disabilities. Although Utah, Idaho, and Florida have unique needs, the PADI model is robust and suitable for addressing each state’s needs.

In addition to emulating the PADI model, the scope of the project encompasses (1) developing design patterns (frameworks or schemas used to design assessments), (2) producing assessment blueprints or templates that are based on the design patterns, (3) describing the conditions required to effectively present tasks and evaluate student performance, and (4) producing written descriptions of exemplars of assessment tasks and scoring systems. In conducting these activities for the alternate mathematics assessments, the collaborating states will gain the expertise needed to apply the principles of the ECD model to designing alternate assessments (AA-AAS) in other subject areas. The project will thereby produce procedural guidelines for designing future assessments. These guidelines, additional materials about the approach, and other project results will be disseminated to the field on a website and through multiple presentations at key conferences and meetings.

In addition to these important, concrete benefits to statewide assessment systems and the field of measurement, the proposed project offers an opportunity both to extend a contemporary approach to test design through a novel application and to evaluate this extension through pilot-testing of selected tasks across multiple states.

Performance

Legislation, Regulations and Guidance

Legislation

Elementary and Secondary Education Act of 1965, as amended


Regulations


Program-Specific Regulations

Awards

 

Awards made under the predecessor to the Competitive Grants for State Assessments program, the Enhanced Assessment Grants (EAG)

Awarded through a competition in 2016

Abstracts for 2016 Enhanced Assessment Grants Awards
MS Word (17KB) | Abstracts HTML

View the full applications funded in 2016

Lead State Award Amount
Maryland State Department of Education $3,843,805
Nebraska Department of Education $3,987,395

 

Awarded through a competition in 2015

Abstracts for 2015 Enhanced Assessment Grants Awards
MS Word (17KB) | Abstracts HTML

View the full applications funded in 2015

Lead State Award Amount
Arizona State Department of Education $1,977,086
California Department of Education For the State Board of Education $2,690,672
Kansas State Education Agency $5,816,159
Michigan Education Agency $4,341,835
Minnesota Education Agency $2,961,888

Awarded through a competition in 2013

View the abstracts for Enhanced Assessment Grants Awards made in 2013
MS Word (52K) | Abstracts HTML

 

View the full applications funded in 2013

Lead State Award Amount
Maryland State Department of Education $4,999,994
North Carolina Department of Public Instruction $6,131,422
Texas Education Agency $3,988,124

2011 Funds (awarded through competitions in 2012)

View the abstracts for FY2011 Enhanced Assessment Grants Awards: MS Word (52K) | Abstracts HTML

View the full applications funded with FY2011 Funds in 2012

Lead State Award Amount
Kansas State Department of Education $1,757,103
Maryland State Department of Education $1,926,577
Oregon Department of Education $6,273,320

 


2010 Funds (awarded through a competition in 2011)

View the abstract for FY2010 Enhanced Assessment Grants Awards: MS Word (52K) | Abstract HTML

View the full application funded with FY2010 Funds in 2011


Lead State Award Amount
Wisconsin Department of Public Instruction $10,486,195

 

View the abstracts for FY2009 Enhanced Assessment Grants Awards: MS Word (19K)

Lead State Award Amount
Arizona Department of Education $1,555,846
Arizona Department of Education $930,982
Illinois Department of Education $1,918,845
Kansas Department of Education $1,126,307
Minnesota State Office of Education $1,564,900
New Hampshire Department of Education $1,902,282
North Carolina Department of Public Instruction $1,674,928

 

View the abstracts for FY2008 Enhanced Assessment Grants Awards: MS Word (19K)

Lead State Award Amount
Idaho Department of Education $1,351,189
Minnesota Department of Education $1,272,071
Pennsylvania Department of Education $1,815,720
Utah State Office of Education $737,153
Virginia Department of Education $1,832,249
Washington Office of Superintendent of Public Instruction $1,644,098

 

View the abstracts for FY2007 Enhanced Assessment Grants Awards: MS Word (47K)

Lead State Award Amount
District of Columbia Office of the State Superintendent of Education $1,220,427
Minnesota Department of Education $1,523,907
Nevada State Department of Education $1,683,765
New Hampshire Department of Education $1,765,196
Utah State Office of Education $1,357,223

 

View the abstracts for FY2006 Enhanced Assessment Grants Awards: MS Word (47K)

Lead State Award Amount
Connecticut Department of Education $758,052
Illinois Department of Education $1,890,401
Iowa Department of Education $1,238,760
Montana Department of Education $1,765,196
Pennsylvania Department of Public Instruction $708,537
South Carolina Department of Education $1,119,620

 

View the abstracts for FY2005 Enhanced Assessment Grants Awards: MS Word (82K)

Lead State Award Amount
Delaware Department of Education $1,263,909
Georgia Department of Education $1,153,899
Hawaii Department of Education $1,500,866
Idaho Department of Education $1,535,349
North Carolina Department of Public Instruction $1,671,666
Oregon Department of Education $1,061,204
Rhode Island Department of Education $2,117,809
South Carolina Department of Education $1,325,076

 

  • No appropriation in fiscal year 2004

 

View the abstracts for FY2003 Enhanced Assessment Grants Awards: MS Word (42K)

Enhanced Assessment Grant 2003 Chart

Lead State Award Amount
New Hampshire Department of Education $1,058,243
Oklahoma Department of Education $835,887
Rhode Island Department of Education $723,009
South Carolina Department of Education $1,016,376
West Virginia Department of Education $818,985

 

For more information about a specific FY 2002 Enhanced Assessment Grant, click on the lead State.


Enhanced Assessment Grant 2002 Chart

 

Lead State Award Amount
Colorado Department of Education $1,746,023
Minnesota Department of Education $2,013,503
Nevada Department of Education $2,266,506
Oklahoma Department of Education $1,442,453
Pennsylvania Department of Education $1,810,567
Rhode Island Department of Education $1,788,356
South Carolina Department of Education $1,719,821
Utah Department of Education $1,842,893
Wisconsin Department of Education $2,338,169


 


Resources

Program Specific Resources

    • Monitoring Plans
      • Student Achievement and School Accountability Programs Monitoring Plan for the Enhanced Assessment Grants Program For Grants Funded through a 2013 Competition
        MS WORD (178K)
      • Student Achievement and School Accountability Programs Monitoring Plan for the Enhanced Assessment Grants Program For Grants Funded with FY2011 Funds
        MS WORD (209K)
      • Student Achievement and School Accountability Programs Monitoring Plan for the Enhanced Assessment Grants Program For Grants Funded with FY2010 Funds
        MS WORD (209K)
      • Student Achievement and School Accountability Programs Monitoring Plan for the Enhanced Assessment Grants Program For Grants Funded with FY2009 Funds
        MS WORD (104K)
      • Student Achievement and School Accountability Programs Monitoring Plan for the Enhanced Assessment Grants Program For Grants Funded with FY2008 Funds
        MS WORD (121K)
    • Enhanced Assessment Grant Program Performance Indicators under the Government Performance & Results Act (GPRA)
      •  MS WORD (96K) Applies to Cohorts in 2012 and Earlier
      •  MS WORD (24K) Applies to Cohort Awarded Through a 2013 Competition

Training Resources for Grant Administration

Funding Status

FY 2021 and 2022 Funds Awarded through a Competition in 2022

Estimated Available Funds: $29,200,000

Estimated Range of Awards: $1,000,000 to $3,000,000
Estimated Average Size of Awards $2,500,000
Number of Awards: 9-10

FY 2019 and 2020 Funds Awarded through a Competition in 2020

Estimated Available Funds: $12,300,000

Estimated Range of Awards: $750,000 to $3,000,000
Estimated Average Size of Awards $2,500,000
Number of Awards: 4-6

FY 2018 and 2019 Funds Awarded through a Competition in 2019

Estimated Available Funds: $17,622,000

Estimated Range of Awards: $1,000,000 to $4,000,000
Estimated Average Size of Awards $2,500,000
Number of Awards: 4-8

FY 2016 Funds Awarded through a Competition in 2016

Estimated Available Funds: $8,860,000

Estimated Range of Awards: $100,000 to $4,000,000
Estimated Average Size of Awards $2,500,000
Estimated Number of Awards: 3-6

Note: Funding below was awarded under the predecessor to the Competitive Grants for State Assessments program, the Enhanced Assessment Grants (EAG).

FY 2014 Funds Awarded through a Competition in 2015

Estimated Available Funds: $8,495,000 to $17,870,000

Estimated Range of Awards: $1,000,000 to $6,000,000
Estimated Average Size of Awards $2,500,000
Estimated Number of Awards: 3-6

Note: The Department is not bound by these estimates.

FY 2012 Funds Awarded through a Competition in 2013

Estimated Available Funds: $9,200,000

Estimated Range of Awards: $4,200,000 to $5,000,000
Estimated Average Size of Awards $4,600,000
Estimated Number of Awards: 2

Note: The Department is not bound by these estimates.

FY 2011 Funds Awarded through Competitions in 2012

Available Funds: $9,900,000

EAG English Language Proficiency (ELP) Competition
Size of Award: $6,273,320

Number of New Awards: 1

EAG Accessibility Competition
Range of Awards: $1,757,103 to $1,926,577
Number of New Awards: 2

FY 2010 Funds Awarded through a Competition in 2011

Size of Award $10,486,195

Number of New Awards: 1

FY 2009 Funds

Range of Awards: $930,982 to $1,918,845

Number of New Awards: 7

FY 2008 Funds

Available Funds: $8,732,480

Range of Awards: $789,179 to $1,961,563
Number of New Awards: 6

FY 2007 Funds

Available Funds: $7,563,200

Range of Awards: $1,220,427 to $1,729,088
Number of Awards: 5

FY 2006 Funds

Available Funds: $7,563,200

Range of Awards: $708,537 to $1,890,401
Number of Awards: 6

FY 2005 Funds

Available Funds: $11,680,000

Range of Awards: $1,061,204 to $2,117,809
Number of Awards: 8

FY 2004 Funds

No awards.

FY 2003 Funds

Available Funds: $4,484,000

Range of Awards: $723,009 to $1,058,243
Number of Awards: 5

FY 2002 Funds

Available Funds: $17,000,000

Range of Awards: $1,442,453 to $2,338,169
Number of Awards: 9


Eligibility

 

Who May Apply: (by category) State Education Agencies (SEAs)

Who May Apply: (specifically) A consortium of SEAs also may apply.

An application from a consortium of SEAs must designate one SEA as the fiscal agent.


Applicant Information

Federal Register Notices

Current Application

2022 Competitive Grants for State Assessments Application Package: WORD | PDF