Create/Select a Quality Assessment

Whether you are writing an assessment from scratch or selecting one that has already been written (e.g., from a publisher), there are many considerations you must keep in mind in order to render accurate data on student mastery of standards/concepts. This lesson walks you through important assessment design considerations. It functions best after you have followed the "Write/Select Quality Questions" lesson (as the lesson that precedes it) and are now ready to ask yourself additional questions about your assessment's balance, focus, etc. This lesson does not concern the evaluation and quality of individual questions (that is covered by the "Write/Select Quality Questions" lesson); rather, it concerns the evaluation and quality of how those questions are put together.

Where to Start

Have related resources handy. For example:

Have your questions available.

Terminology

You also might want to review question term definitions such as the following:

  • Answer Options = options a student may pick from in a closed-response (e.g., multiple choice) assessment; these include the correct answer and the distractors
  • Distractor = an incorrect answer option
  • Instructions = these may be directed at the student (e.g., "Read the passage below and answer the questions that follow") or at the test administrator (e.g., common for lower-elementary levels where the teacher reads questions to the class)
  • Item = question or other way of measuring the student's mastery level of a particular standard or concept
  • Keyed Response = another term for correct answer
  • Rationale = reason a student might have selected an answer option (e.g., if he selects "B. 10" for the question "What is 8 + 12?", he might have forgotten to carry the 1)
  • Stem = statement or question that precedes answer choices
  • Stimulus Material = material intended to be used to answer a question (e.g., graph, table, passage, map, picture, diagram, etc.)
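
To make these terms concrete, here is a minimal sketch in Python (not a representation of Illuminate's actual data model) showing how they fit together for a single closed-response item; the item, distractors, and rationales are hypothetical examples:

```python
from dataclasses import dataclass

@dataclass
class AnswerOption:
    text: str
    is_keyed_response: bool = False  # True only for the correct answer
    rationale: str = ""              # why a student might pick this distractor

@dataclass
class Item:
    stem: str       # the statement or question preceding the answer options
    options: list   # answer options = keyed response + distractors
    stimulus: str = ""  # e.g., reference to a graph, passage, map, etc.

item = Item(
    stem="What is 8 + 12?",
    options=[
        AnswerOption("20", is_keyed_response=True),
        AnswerOption("10", rationale="forgot to carry the 1"),
        AnswerOption("96", rationale="multiplied instead of adding"),
    ],
)
```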

Take the Test

That's right. Take the assessment just as a student would, preferably a week or so after you wrote or selected questions for it (or have others take the test who weren't involved in the question writing/selection). Note which questions and test sections are easiest, which are hardest, which could render unnecessary confusion for students, etc.

Even though careful thought already went into each question, you can still catch problems at this stage. Also, you want to get a feel for overall rigor and balance. Do you already spot changes that need to be made?
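
If several colleagues take the draft, even a quick tally can flag outliers. Here is a minimal sketch, assuming a hypothetical pilot.csv with one row per test-taker and one column per item (1 = correct, 0 = incorrect):

```python
import csv

with open("pilot.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Item difficulty = proportion of test-takers answering correctly.
for item in rows[0]:
    p = sum(int(r[item]) for r in rows) / len(rows)
    flag = "very easy" if p > 0.9 else ("very hard" if p < 0.3 else "")
    print(f"{item}: {p:.0%} correct {flag}")
```

The 0.9/0.3 cutoffs are illustrative; items flagged at either extreme deserve a second look before the assessment is finalized.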

Balance and the Big Picture

Remember, this evaluation applies to the test as a whole and its sections (not individual questions):

Breadth/Scope of Standard

Standards often require multiple things of students. Do the questions on your test appropriately cover the breadth of the standard, or are they limited to only one of its aspects? Even if the questions are well-crafted, you might need to replace some questions with others to thoroughly assess a standard.
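
One way to check breadth is to tag each question with the aspect(s) of the standard it assesses and then list any aspects the test never touches. A minimal sketch, with hypothetical aspect names and question tags:

```python
# Aspects the standard requires of students (hypothetical example)
standard_aspects = {"compare", "contrast", "cite evidence"}

# Aspect(s) each question actually assesses
question_tags = {
    "Q1": {"compare"},
    "Q2": {"compare"},
    "Q3": {"contrast"},
}

covered = set().union(*question_tags.values())
print("Uncovered aspects:", standard_aspects - covered)  # {'cite evidence'}
```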

Rigor

Does the collection of questions assessing a standard match the rigor the standard requires? For example, if the standard requires students to evaluate arguments contributing to the development of the Constitution, has the test successfully required students to evaluate? Consider the Bloom's Taxonomy level of the standard being assessed.

# of Questions per Standard

Consider the assessment as a whole, the pacing guide, and the assessment series as a whole. For example, does the pacing guide note this assessment should cover 5 standards, whereas your 20-question test contains 11 questions on a relatively simple standard, leaving just 9 questions to assess the remaining 4 standards? That would be a problem you'd want to remedy. A bare minimum of 3-4 questions is typically needed to accurately assess mastery of a standard, though this number can vary based on standard scope and complexity. If you are mirroring state blueprints or question allotments determined ahead of time, be sure to compare the assessment to these.
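
A blueprint check like this is easy to automate. A minimal sketch using the 20-question example above, with hypothetical standard codes and a 3-question floor:

```python
from collections import Counter

question_to_standard = {
    **{f"Q{i}": "STD-1" for i in range(1, 12)},   # 11 questions on one simple standard
    **{f"Q{i}": "STD-2" for i in range(12, 15)},  # 3 questions
    **{f"Q{i}": "STD-3" for i in range(15, 18)},  # 3 questions
    **{f"Q{i}": "STD-4" for i in range(18, 20)},  # 2 questions
    "Q20": "STD-5",                               # 1 question
}

MIN_PER_STANDARD = 3  # bare minimum; raise it for broad or complex standards
for standard, n in sorted(Counter(question_to_standard.values()).items()):
    print(f"{standard}: {n} question(s) - {'OK' if n >= MIN_PER_STANDARD else 'TOO FEW'}")
```

Here STD-4 and STD-5 fall below the floor, which is exactly the imbalance described above.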

Independence

While multiple choice questions may share the same stimulus material, all questions should function independently from one another. For example, answering a question correctly should not rely on having answered a previous question correctly, nor should it rely on (or be helped by) information revealed in another question (within the stem or answer options).

Format

Consider all of the following with regard to format:

Look

You might opt to mirror the look of state Released Test Questions in terms of how questions are numbered, how answer options are itemized, how many columns are used, how much white space is on a page, etc. If this assessment is one in a series, they should all maintain a cohesive look.

Instructions

If there are pre-test instructions for students and teachers, are they as clear and brief as possible?

Stimulus Materials

If stimulus material(s) are used to answer questions (e.g., graph, table, passage, map, picture, diagram, etc.), are the images of good quality, clear, etc.? Is their connection to the question(s) clear (i.e., will students know they have to use them to answer related questions)?

Remember that while there are numerous advantages to multiple choice tests (e.g., they make a good start to an assessment program, especially if your colleagues are resistant; they keep scoring objective; they facilitate instant feedback for students/parents/educators; they save educators time; they are cost effective; etc.), Illuminate also supports multiple measures.

For example, your assessment might feature a combination of assessment types, such as multiple choice questions alongside open-ended items. You don't have to mix assessment types like this; just know the Illuminate sheet design is open-ended to accommodate varied needs.

Implementation

Just like a new pacing guide, a new assessment can constitute a big change for teachers, and the way you handle its roll-out is crucial. There are free ways to survey entire grade levels or subject areas (e.g., Survey Monkey, Google Forms, etc.) to obtain feedback on your assessment draft(s). Note that free, electronic surveying options will require minimal time compared to hard-copy or email approaches.
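
Both Survey Monkey and Google Forms let you export responses to CSV, so tallying feedback takes little effort. A minimal sketch, assuming a hypothetical feedback.csv with "Overall rating" and "Suggested changes" columns:

```python
import csv
from collections import Counter

with open("feedback.csv", newline="") as f:
    responses = list(csv.DictReader(f))

# Count each rating, then list free-text suggestions worth reviewing.
print("Ratings:", dict(Counter(r["Overall rating"] for r in responses)))
for r in responses:
    if r["Suggested changes"].strip():
        print("-", r["Suggested changes"])
```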

When your assessment is finalized, add it to the Illuminate system and share it with everyone who will need access to it (don't forget administrators, Teachers on Special Assignment, instructional coaches, etc.). Also be sure the administration window, pacing guide integration, and related tasks (e.g., results analysis workshop) are all clear. Plan to repeat this information and provide reminders. Accompany it by predicting and answering related questions (e.g., who is photocopying test booklets for students? Are teachers sharing sets? Do students/teachers scan in the classroom or turn their tests in elsewhere? etc.).

Next Steps

You might be interested in the "Create an Assessment without GradeCam" lesson, which walks you through the steps of adding your assessment to the Illuminate system. You might also explore other lessons in this manual, or refer to the "Step 9. Evaluate Progress and Communicate Next Steps" lesson for details regarding next steps in the assessment's life cycle (e.g., planning for post-administration evaluation of the assessment).