Assessment Literacy FAQs

Contact Information

Educator Evaluation

OTES - (P) 614-369-3770
OPES - (P) 614-752-5073
SGM - (P) 614-466-9225
eTPES - (P) 614-466-9225

eTPES - Technical Support

(P) 1-877-314-1412

Regional Field Specialists

Northwest Region
Apryl Ealy

Southwest Region
Apryl Ealy
Cathryn Shaw

Southeast Region
Cathryn Shaw

Northeast Region
Tom Rounds

Central Region
Apryl Ealy
Cathryn Shaw

Assessment Literacy FAQs (updated 10/16/2015)

    My district would like Assessment Literacy training. Whom do I contact?

    For assistance, contact the Regional Field Specialist assigned to your region (see “Contact Information” above). These specialists offer free professional development, including a variety of tools to support the effective, meaningful design and implementation of classroom assessments.


    How do I select the standards to include on my assessment?

    When the purpose of an assessment is to determine what a student has learned after an extended interval of instruction, such as an end-of-course exam, emphasis should be placed on the standards identified as learning priorities. In such cases, teachers should narrow the focus of the assessment to the learning priorities that represent the most essential knowledge and skills students should master. Considerations when selecting these learning priorities include:


    • Longevity – does the intended learning address knowledge and skills that are important for the student to know this year and in years to come? For example, reading comprehension is a lifelong skill.
    • Leverage – does the intended learning address knowledge and skills that are important for other content areas? For example, the ability to interpret charts and graphs is important in many content areas.
    • Levels – does the intended learning address knowledge and skills that will be important for the student to know in the next school year? Think prerequisites.

    These are guidelines, and it is not necessary that all three criteria be met for a standard to be considered a priority. However, standards that do meet all three criteria should be priorities. It is also important to emphasize that the selection of learning priorities is best done collaboratively, both horizontally and vertically, with other educators.


    How do I determine which assessment method to use to assess a standard?

    No single assessment method is superior to any other, but the case can be made that some methods are stronger matches for certain learning targets. Selected Response, Constructed Written Response and Performance Assessments are all possible choices depending on the learning targets to be assessed and the purpose of the assessment. Working with colleagues to decide on the best match for each learning target is preferred.

    Selected Response items (Matching, True/False, Fill-in-the-Blank and Multiple Choice) are good matches when assessing recall or knowledge. Constructed Written Responses (Short Answer, Extended Response) are useful when assessing understanding or reasoning; create a written scoring guide or rubric in advance, and consider sharing rubrics with students ahead of time as well. Performance Assessments are useful choices when a product needs to be reviewed or a performance needs to be observed. A written rubric is also necessary for Performance Assessments.


    Is it appropriate to assess higher-level thinking with multiple choice items?

    The appropriateness of an assessment method (for example, multiple choice or other selected response methods) depends on the purpose and context of the assessment. When speaking of “higher-level thinking,” we usually mean the higher levels of Bloom’s Taxonomy or of Webb’s Depth of Knowledge (DOK). While it is not impossible to assess higher-level thinking (for example, DOK level 3) with multiple choice items, doing so may not serve the purpose well. For example, a student may be asked to evaluate a situation in which they must draw a conclusion based on evidence from a text. It is possible to structure an item so that the student must reason through the question, pull information together and mentally justify their reasoning in order to select the correct response(s). However, it is very difficult to write such items well, and higher-level thinking tasks often require an extended period of time to perform. It is also important to keep the purpose of the assessment in mind. If the purpose is for the student to demonstrate mastery of the knowledge and skills in a DOK 3 standard (which is usually what is desired), then constructed response or performance would be more suitable assessment methods, since they require the student to show their reasoning more directly.


    How can I improve the quality of my assessment items?

    Generally speaking, the quality of an assessment item begins with the alignment of the item to the standard or learning target being assessed and to the instruction given. First, ensure that the standard, instruction and assessment item are all aligned in both content AND rigor. Second, match the item type to the standard: consider which assessment method (e.g., selected response, constructed response or performance) will best allow a student to demonstrate learning of the standard. Contact your Regional Field Specialist for district support.


    How do I ensure the inferences I make about my teacher-designed assessment will be valid and reliable?

    It is certainly more challenging to determine whether a teacher-designed assessment is valid and reliable. Using the checklist provided by the Department in the Guidance on Selecting Assessments for Student Learning Objectives is a good first step. In addition, the considerations listed below will help to improve the validity and reliability of your locally-designed assessments.

    Below are some considerations for improving validity:

    • Ensure a representative distribution of assessment items.
    • Ensure assessment items are aligned to standards and course learning targets.
    • Ensure assessment items are assessing the standards at the appropriate cognitive complexity level.
    • Ensure that other content experts review the assessment.


    Below are some considerations for improving reliability:

    • Avoid ambiguous test questions.
    • Provide clear and consistent directions.
    • Develop a systematic administration procedure.
    • Ensure consistent use of rubrics.
    • Use multiple scorers (when possible) for items that are not selected response.


    How many questions should be on my SLO pre/post assessment?

    The number of questions on an assessment is related to the purpose of the assessment; there is no single set recommendation. Assessment length is related to the breadth and depth of content the assessment is designed to measure. More complex and high-priority standards will require more questions to determine student mastery than less complex or low-priority standards. Remember, each standard identified in the SLO must be assessed on the pre/post assessment. The writing or review team should balance coverage of each standard by multiple assessment items against realistic expectations for the overall length of the assessment.


    The type of assessment item used can also affect the number of items needed to assess a particular standard. For instance, a single constructed response item can often generate the same amount of information as several selected response items.  

    Additionally, it is recommended that the assessment be realistic in terms of the time required for administration. Therefore, educators should consider what is developmentally appropriate for their students when reviewing or creating assessments. Educators will need to make decisions about balancing larger data sets with a developmentally-appropriate assessment.


    Should my SLO pre and post-tests be identical?

    Using the same instrument as both the pre- and post-assessment is not ideal. In fact, using the same assessment multiple times within the same year may decrease the validity of results, since students will have seen the questions before. A well-written pre-assessment (used in conjunction with other forms of baseline data) can be a valuable source of data, because it should closely align with the post-assessment in order to measure growth. An assessment blueprint is a tool that can be used to ensure that the pre- and post-assessments measure the same general content at comparable levels of rigor. It is highly recommended that both assessments be reviewed by content experts for validity and reliability. Contact your Regional Field Specialist for free district support on blueprint design.


    Can I change the post-assessment after administering the pre-assessment for my SLO?

    It is not advisable to change your post-assessment after your pre-assessment has been administered. The pre- and post-assessments should be aligned: they should assess the same content at the same cognitive complexity level. If the post-assessment is more difficult than the pre-assessment, your pre-established growth targets may not be met. However, for students scoring in the upper range on the pre-assessment, you may need to include a capstone project in addition to the post-assessment to demonstrate growth. This capstone project would be included in the student growth target.

    Remember, this is a learning process.  The goal is to learn from the process in these early years.  Districts and schools should have clear expectations regarding locally-designed assessments to ensure quality pre- and post-assessments.


    Why is collaboration important when designing an assessment?

    It is strongly encouraged that colleagues work together when designing high-stakes assessments. Grade-level and/or subject-area colleagues (within or across districts) should collaborate when designing these assessments. Working collaboratively will help ensure district, building and grade-level consistency, assist with vertical alignment, and greatly enhance test validity, reliability and absence of bias. In instances where a team of teachers cannot create an assessment, the assessment should be developed in conjunction with an instructional coach, curriculum supervisor, special education teacher, English Language Learner teacher, administrator or other faculty member with assessment expertise.


    How can a test blueprint help me create an assessment?

    A test blueprint is a tool that can be used to design a high quality, aligned assessment. A blueprint requires the teacher to identify the intended learning to be measured in a given assessment and the level of rigor. A test blueprint also guides assessment item selection and development.

    A test blueprint can also be used to evaluate existing assessments. Blueprinting an existing assessment will help a teacher be certain that it measures what is intended and is aligned to the standards. An example blueprint can be viewed here. Contact your Regional Field Specialist for free district support on blueprint design.


    How do I include stretch in my assessment?

    To have sufficient stretch, an assessment must contain questions that vary in complexity. The assessment should contain both basic and advanced knowledge and skill questions so that both low-performing and high-performing students can demonstrate growth. When incorporating stretch on a particular standard, design questions at varying depths of knowledge. Karin Hess’s Cognitive Rigor Matrix can be especially helpful for creating assessment items with stretch. Here is an example:

    Ohio's Learning Standards, Mathematics.Content.6.EE.9: Use variables to represent two quantities in a real-world problem that change in relationship to one another; write an equation to express one quantity, thought of as the dependent variable, in terms of the other quantity, thought of as the independent variable. Analyze the relationship between the dependent and independent variables using graphs and tables, and relate these to the equation. For example, in a problem involving motion at constant speed, list and graph ordered pairs of distances and times, and write the equation d = 65t to represent the relationship between distance and time.

    DOK Level 1 – List and graph ordered pairs of distances and times

    DOK Level 2 – Analyze the relationship between distance and time using graphs and tables

    DOK Level 3 – Describe a situation involving a moving object that this graph could represent

    Because this Grade 6 Math standard is written at DOK Level 2, the assessment should include items that ensure students are mastering the material at that level. However, based on pretest or trend data, stretch could be achieved by including questions at DOK Levels 1 and 3 so that low- and high-performing students can demonstrate their learning.


    Why do I need to learn about Webb's Depth of Knowledge (DOK)?

    Depth of Knowledge (DOK) was created by Norman Webb for the purpose of aligning assessments and assessment items to the cognitive complexity level of the standards they are designed to assess. The DOK level is determined by the degree of mental processing required of the student to meet the objectives of a particular standard, assessment item or instructional activity; it focuses on how deeply a student needs to understand the content. Understanding the DOK level of a standard will help teachers create assessment items that accurately assess the standard at the expected level of rigor.

    For more information, contact your Regional Field Specialist.


Last Modified: 3/31/2019 9:50:54 PM