Assessment Program

Assessment Creation

Classroom assessment

Create an assessment for a purpose.

Question the assessment

Is the test necessary? Does it provide value to teacher and student?

Purpose of the assessment

What data do I need? How will this help students?

Performance data for learning and instruction

  • Why do we need the student performance data?
  • What use do we have for the student results?
  • Will the results help instruction?
  • How will the data be used to help students improve learning?
  • Is the data going to be shared?
  • How will feedback from the assessment help our students?
  • Are all students expected to take the test?
  • What accommodations are acceptable?
  • What intervention will we use to improve future learning?

Focus each assessment on instructional goals and needs

  1. Outline or blueprint the assessment
  2. Select a test format that best suits your data needs and instructional focus in the classroom
  3. Add passages, items, and prompts that match the skills and concepts from your outline
  4. Balance the rigor and complexity of the assessment, including passages, graphs, and illustrations
  5. Avoid items that cue or answer other items on the test
  6. Create an assessment that can be completed in the expected time
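
The blueprint and timing steps above can be sketched as data. This is only an illustration, not part of the original guidance; the standard codes, formats, and per-item time estimates below are hypothetical placeholders.

```python
# A minimal sketch of a test blueprint as data. All standard codes and
# per-item minute estimates are illustrative assumptions, not real values.
MINUTES_PER_FORMAT = {"multiple_choice": 1.5, "short_answer": 4.0, "essay": 15.0}

blueprint = [
    # (standard, item format, item count) -- hypothetical entries
    ("RL.5.1", "multiple_choice", 6),
    ("RL.5.2", "multiple_choice", 4),
    ("RL.5.2", "short_answer", 2),
    ("W.5.1", "essay", 1),
]

def estimated_minutes(rows):
    """Total testing time implied by the blueprint."""
    return sum(MINUTES_PER_FORMAT[fmt] * count for _, fmt, count in rows)

def fits_in_period(rows, period_minutes=45):
    """Step 6: can the assessment be completed in the expected time?"""
    return estimated_minutes(rows) <= period_minutes

print(estimated_minutes(blueprint))  # 38.0
print(fits_in_period(blueprint))     # True
```

Laying the blueprint out this way makes it easy to see whether every outlined standard has items and whether the total time budget is realistic before any items are written.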

Review and question items

  • Do items assess standards in more than one way?
  • Does each standard or domain tested match instruction and the requirements of the standards?
  • Does each item test a standard fairly as taught in the classroom?
  • Is there an appropriate mix of item difficulty and complexity for the overall test and each concept?
  • Does every item measure a skill or content identified in the plan or outline?
  • Do the questions work together to create a complete picture of student learning?
  • Do any questions cue or help answer other items?

Question the test

  • Is the question mix assessing the standards in more than one way?
  • Is each standard or domain tested for the breadth required by the language of the standards?
  • Is there a good mix of difficulty and complexity for the overall test and each concept?
  • Do the content or knowledge items work together to create a complete picture of student learning?
  • Is overlap or cueing minimized?
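
The difficulty-and-complexity check above can be made mechanical if items are tagged by concept and difficulty. A toy sketch, assuming hypothetical concept and difficulty tags:

```python
# A toy sketch of checking difficulty mix per concept. The item tags
# below are illustrative assumptions, not real assessment data.
items = [
    # (concept, difficulty) -- hypothetical tags
    ("fractions", "easy"), ("fractions", "medium"), ("fractions", "hard"),
    ("decimals", "easy"), ("decimals", "easy"),
]

def difficulty_mix(rows):
    """Map each concept to the set of difficulty levels it is tested at."""
    mix = {}
    for concept, level in rows:
        mix.setdefault(concept, set()).add(level)
    return mix

def concepts_lacking_mix(rows, required=frozenset({"easy", "medium", "hard"})):
    """Concepts tested at fewer difficulty levels than required."""
    return sorted(c for c, levels in difficulty_mix(rows).items()
                  if not required <= levels)

print(concepts_lacking_mix(items))  # ['decimals']
```

Here "decimals" is flagged because it is tested only with easy items, so the overall test would not give a complete picture of student learning on that concept.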

Performance task

  • Is the task filling a gap that other assessments or classroom work do not address?
  • What work will students complete to show mastery?
  • Is this a real-world or authentic task?
  • Why is the performance needed for learning and instruction?
  • Is the task worth the time and effort from the students?

Check and review

Purpose and value

  • Provides valuable data and student information
  • Each question measures important content or skills
  • Matches instructional focus and intended learning

Content and skills

  • Each item accurately and appropriately measures content objectives and their associated skills
  • The vocabulary for each item matches instructional practice and content-area specifications
  • Overall test content parallels actual instruction during the learning cycle
  • Required background knowledge is fair
  • Questions are free from bias

Thinking

  • Requires students to demonstrate the thinking needed at this point in the year (critical thinking, creativity, problem solving, etc.)
  • Requires age- and grade-appropriate thinking

The value of performance data relies on each item assessing the right learning outcome

Content review

  • Does the item measure important learning?
  • Is the content measured worth long-term learning?
  • Does the item measure instructionally significant content or skills?
  • Does the student have to think to answer each question? Is it simple recall of information, or does the student have to apply critical thinking, problem solving, or creativity?
  • Is the item overly specific or abstract?
  • Does the question attempt to serve too many purposes? Can you tell from a student response that they understood the behavior or knowledge tested?
  • If a student incorrectly answers an item, will you be able to say what the student did wrong?
  • Is this item cueing an answer to another question?
  • Does this item measure facts, not opinions?
  • Is there anything tricky about the item? Will all students understand the problem?

Format and style review

  • Does the item use a format appropriate for the content and the age of the students?
  • Is the question so complicated that most students will not understand what is being asked?
  • Are the items formatted consistently? For example, are answer choices arranged vertically or horizontally throughout?
  • Is the vocabulary appropriate for the student population tested?
  • Does the item or passage require too much reading? Is it worth the student’s time?

Stem or the question

  • Does the stem have correct grammar, spelling, and punctuation?
  • Is the stem written so all students should understand the problem?
  • Are there any words or phrases that do not need to be part of the stem?
  • Is all information in the stem relevant to answering the item? Does the student have to weed out unnecessary information to respond to the item?
  • Is there a better way to phrase the stem?

Correct answers and distractors (incorrect answers)

  • Does the right answer match the key?
  • Are all distractors plausible and based on common student errors?
  • Is there too much repetition of phrases in the distractors? For example, repeating the same introductory term in every answer choice.
  • Does the correct answer item choice vary in position across the assessment?
  • For numerical answers, are the potential solutions in logical or numerical order? Be careful with systems that scramble answer order to create multiple versions of the assessment; scrambling numeric choices may produce unfair versions of the item.
  • Is the length of each choice similar?
  • Are there any clues that give away the correct answer, such as silly distractors?
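
Two of the checks above are mechanical enough to automate: whether the correct answer varies in position, and whether choice lengths are similar. A rough sketch, where the answer key and choice lists are hypothetical examples:

```python
# A rough sketch of two mechanical distractor checks. The answer key and
# choice text below are illustrative assumptions, not real test content.
from collections import Counter

answer_key = ["B", "D", "A", "C", "B", "A", "D", "C"]  # hypothetical key

def key_position_counts(key):
    """Does the correct answer vary in position across the assessment?"""
    return Counter(key)

def choice_lengths_similar(choices, ratio=2.0):
    """Flag an item whose longest choice dwarfs its shortest choice."""
    lengths = [len(c) for c in choices]
    return max(lengths) <= ratio * min(lengths)

print(key_position_counts(answer_key))
print(choice_lengths_similar(["to the store",
                              "to the park",
                              "to a very distant mountain village"]))
```

A heavily skewed key count, or a correct answer that is consistently the longest choice, is exactly the kind of clue that can give away answers without measuring learning.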

Source: Haladyna, T. (1997). Writing test items to evaluate higher order thinking. Boston: Allyn and Bacon.