Project management

Assessment project management

Project management organizes plans, concepts, and ideas so the project stays under control.

Project plan

  • Defines creation process
  • Focuses people
  • Identifies responsibility for tasks
  • Keeps project on time
  • Saves money and heartache

Scope creep

Keep the creep out of your district.

Learn to say no to ideas that do not match purpose and goals.

Scope creep is a common problem in schools; it grows projects without adding value.

Basics of an assessment project plan

  • Identify need for assessments
  • Detail data use requirements
  • Schedule administration timelines
  • Plot test windows
  • Gather resources or materials
  • Define the technology needed
  • Create blueprints for the assessments
  • Build the assessments
  • Design scoring procedures
  • Organize and draft data analysis process
  • Set reteaching or intervention expectations
  • Communicate the who, what, and where of how information is stored, shared, and archived

Assessment instructional meeting

Instructional team meetings

Team or department meetings bring together different:

  • teachers
  • skill sets
  • students
  • personalities

Planning together creates consistent instructional and learning experiences.

Simplify or murder momentum

Excessive planning kills momentum and enthusiasm.

A lack of execution is not always a lack of effort, but it could be over-planning or a lack of flexibility.

Focus simply

  1. What is the next instructional and learning outcome?
  2. What standards, skills, and knowledge support the focus?
  3. Do the decisions fit into the learning progression?
  4. Is this the best time to focus on the result?
  5. Why is this learning important now or in the future?


What diagnostic tool should we use to start the learning cycle?

Is there recent work product or performance data that can serve as the diagnostic?

An advantage of using recent student data is that it reduces the need for a separate diagnostic that provides similar data.


What will we model for our students? Is every teacher expected to model the same instructional practice?

An agreement on modeling specific learning behaviors or skills builds consistent learning environments and provides a look at the effectiveness of instructional practice.

Leverage teacher expertise and experiences to help target professional development.

How will we guide students to independent practice?

Not every class has to be identical, but consistent learning environments help benchmark instructional impacts. Performance data tells part of the story while actual instruction fills any informational gaps.

Student independent practice decision

Independent student work shows individual learning growth and can also be analyzed for instructional effectiveness.

Independent practice could be a formative assessment or any work students complete during instruction.

If the team is going to use any work to evaluate instructional effectiveness, agree on modifications or additional support that everyone can use with their students.

Materials and resources

What do we already have? What do we need?

Textbooks will be a crucial resource. If special materials are used for instruction, it is best to give the whole team access to them. Teachers can then decide whether those materials will benefit their own class instruction.

Some teams don’t share or use the same resources. Note the differences to help gauge their effectiveness and potential use in the future.

What task or assessment will all students produce for the team?

  • Do we have an assessment ready?
  • Does everyone agree on using the evaluation with the same administration guidelines?
  • What types of modifications will be allowed, and for which students?

A formative assessment works well as a common team assessment. Students show learning through a shared experience. If you allow modifications or additional support for specific students, agree on this before administration. Consistency creates a better environment for performance analysis.

How will we score and use the data?

  • Is there a rubric we can use for writing or performance tasks?
  • Can we rely on the data as a barometer for teaching or material effectiveness?
  • Did everyone follow the agreed administration guidelines?
  • Are we going to share the results outside of the team?

If you are using an assessment for grades or placement, administration guidelines should be consistent across all classrooms.

If some students were provided support outside of the initial agreements, it is something to consider during analysis.

Additional scaffolding or extra help can skew results for future planning purposes. The extra support is not wrong or bad, but for instructional inference, you need to be honest with your results internally.

Undermine a performance task

Undermine the performance task

  1. Practice and rehearse the task so it becomes information recall, not complex thinking.
  2. Provide information during the task to prompt acceptable responses, hindering students' thinking.
  3. Structure the task with directions and steps so specific that they produce an intended response. Students create a response that reflects the teacher's thinking, not their own.
  4. Use poor item formats, or items that target the wrong skills or content. Students do not have the opportunity to apply knowledge and skills in the right context.
  5. Write a task that lacks clear guidelines. Students don't know or understand what is expected.
  6. Over-structure hands-on tasks. Students demonstrate the ability to follow directions, not higher-order thinking.
  7. Accept only one way to answer and respond to the task.
  8. Fill the task with drilled responses or item types instead of higher-order thinking items.
  9. Structure the assessment so answers are bound to one potential response.
  10. Assess the easiest aspects of student performance instead of making students think through a response.

Reference: Wiggins, Grant. "Creating Tests Worth Taking." Educational Leadership, vol. 49, no. 8, May 1992, pp. 26-33.