Plans and blueprints

Blueprint or plan assessments

Answer important questions with performance data

An assessment outline or blueprint answers instructional questions. A good plan produces assessments that tell a story and provide feedback about student learning.

  • Answer instructional questions
  • Create an instructional story of student learning
  • Provide students with useful feedback

Establish the purpose and use of the assessment

What do we need to learn about our students?

How are we going to use the data or results?

Team or program assessments planned for a purpose improve instructional decisions and target learning opportunities, benefiting teachers and students.

How are we going to use the data to improve learning and instruction?

  • Predictions for high-stakes tests
  • Diagnostic data for instructional planning
  • Grades from tests that match classroom instruction
  • Data for program placement
  • Intervention decisions to enter or exit a program
  • Instructional feedback for the future
  • Formative plans to improve the teaching and learning environment

For example, a predictive test mirrors the style and standards of an important summative assessment. Assessments used for grading reflect current classroom instructional standards and include the type and style of content currently taught in the classroom.

Outline and define the assessment

Outlines or blueprints define assessments that answer instructional questions and focus performance data, benefiting students and teachers.

The blueprint is the first step in creating tests that tell a story about students.

Standards, skills, and content

  • Current instructional standards
  • Important skills within the standards
  • Key content knowledge
  • Previous standards relevant to current learning
  • Standards that complement a future learning cycle

Do any standards complement each other?

Complementary standards come from the same domain or strand. They work together to measure student learning and reduce item counts while offering reliable data. A few items covering specific knowledge or complementary skills will indicate student learning, just not at the psychometric level necessary for a high-stakes assessment.

Not all current standards or their skills/knowledge need to be measured. If classroom assignments provide an adequate view of student achievement, it is not necessary to assess those skills or content.

Get the big picture of learning from multiple sources of data and student work.

Questions guide the plan

  1. What skills or content knowledge within the standards need assessment?
  2. What level of difficulty and complexity is appropriate at this time?
  3. Are we taking the test offline or online?
  4. What items are available to use?
  5. How long are we going to give students to complete the test?
  6. How many questions do we need on this assessment?
  7. What types of passages, charts, illustrations, or graphs should we use?

Item counts per standard do not have to be high for classroom assessments. Quality items, along with the right standard mix, will provide enough information.
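As a rough illustration of keeping item counts modest but balanced, a blueprint can be tabulated to sanity-check total items and estimated testing time. The standard labels, counts, and minutes-per-item below are hypothetical placeholders, not recommendations:

```python
# Illustrative blueprint sketch: every value here is a made-up placeholder.
# Each entry maps a standard to its planned item count and a rough
# time estimate per item.
blueprint = {
    "Standard A": {"items": 4, "minutes_per_item": 1.5},
    "Standard B": {"items": 3, "minutes_per_item": 2.0},
    "Standard C": {"items": 3, "minutes_per_item": 3.0},
}

# Total up items and time to check the plan against the testing window.
total_items = sum(row["items"] for row in blueprint.values())
total_minutes = sum(row["items"] * row["minutes_per_item"]
                    for row in blueprint.values())

print(f"Total items: {total_items}")           # 10
print(f"Estimated time: {total_minutes} min")  # 21.0 min
```

A quick tally like this helps a team confirm the mix of standards fits the time students will actually have before items are written or selected.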

What item types should we use?

  • Multiple choice
  • Constructed response
  • Performance tasks
  • Writing prompt
  • Technology-enhanced

Online versions of an assessment allow for different item types. Online can be good or bad, depending on available technology and the quality of the online items. Data analyzed by a team or department should come from either an online or an offline administration, not a mixed environment. If each teacher is going to look at the results in isolation, the format matters less.

Modifications and participation decisions

  • Will the test environment and administration be consistent with all students?
  • What modifications will be allowed during the assessment?
  • Do specific students need additional support?

Team assessments typically expect administration for all students with little or no modifications. Expectations and reality do not always match. Without agreed guidelines, the test environment may look different from class to class. Inferences and decisions benefit from consistent administration guidelines.

Team or department meeting

Instructional team meetings

Team or department meetings have different:

  • teachers
  • skill sets
  • students
  • personalities

Planning together creates consistent instructional and learning experiences.

Simplify or murder momentum

Excessive planning kills momentum and enthusiasm.

A lack of execution is not always a lack of effort, but it could be over-planning or a lack of flexibility.

Focus simply

  1. What is the next instructional and learning outcome?
  2. What standards, skills, and knowledge support the focus?
  3. Do the decisions fit into the learning progression?
  4. Is this the best time to focus on the result?
  5. Why is this learning important now or in the future?

What diagnostic tool should we use to start the learning cycle?

Is there recent work product or performance data that can serve as the diagnostic?

An advantage of using recent student data is that it reduces the need for a separate diagnostic that would provide similar information.

What will we model for our students? Is every teacher expected to model the same instructional practice?

An agreement on modeling specific learning behaviors or skills builds consistent learning environments and provides a look at the effectiveness of instructional practice.

Leverage teacher expertise and experiences to help target professional development.

How will we guide students to independent practice?

Not every class has to be identical, but consistent learning environments help benchmark instructional impacts. Performance data tells part of the story while actual instruction fills any informational gaps.

Student independent practice decision

Independent student work shows individual learning growth, and can also be analyzed for instructional effectiveness.

Independent practice could be a formative assessment or any work students complete during instruction.

If the team is going to use any work to evaluate instructional effectiveness, agree on modifications or additional support that everyone can use with their students.

Materials and resources

What do we already have? What do we need?

Textbooks will be a crucial resource. If any special materials are used for instruction, it is best to give the whole team access to them. Teachers can then decide whether the materials will benefit their own classroom instruction.

Some teams don’t share or use the same resources. Note the differences to help gauge their effectiveness and potential use in the future.

What task or assessment will all students produce for the team?

  • Do we have an assessment ready?
  • Does everyone agree on using the evaluation with the same administration guidelines?
  • What type of modifications, and with which students?

A formative assessment works well for common team assessments. Students show learning through a shared experience. If you allow any modifications or additional support for specific students, agree on them before administration. Consistency creates better conditions for performance analysis.

How will we score and use the data?

  • Is there a rubric we can use for writing or performance tasks?
  • Can we rely on the data as a barometer for teaching or material effectiveness?
  • Did everyone follow the agreed administration guidelines?
  • Are we going to share the results outside of the team?

If you are using an assessment for grades or placement, administration guidelines should be consistent across all classrooms.

If some students were provided support outside of the initial agreements, it is something to consider during analysis.

Additional scaffolding or extra help can skew results for future planning purposes. The extra support is not wrong or bad, but for instructional inference you need to be honest internally about your results.
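One way to stay honest during analysis is to flag which results came from students who received support beyond the agreed guidelines, then compare averages with and without those scores. The records and values below are hypothetical, purely for illustration:

```python
# Hypothetical results: each record notes whether the student received
# support beyond the agreed administration guidelines.
results = [
    {"score": 82, "extra_support": False},
    {"score": 74, "extra_support": False},
    {"score": 90, "extra_support": True},
    {"score": 68, "extra_support": False},
    {"score": 88, "extra_support": True},
]

def mean(scores):
    """Average of a list of scores; None if the list is empty."""
    return sum(scores) / len(scores) if scores else None

all_scores = [r["score"] for r in results]
unmodified = [r["score"] for r in results if not r["extra_support"]]

print(f"All students:          {mean(all_scores):.1f}")
print(f"Without extra support: {mean(unmodified):.1f}")
```

Seeing both numbers side by side keeps the extra support visible in the analysis instead of letting it quietly inflate the picture of instructional effectiveness.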