Choosing and Using Tests Effectively

    Liz Brooke asks the questions that assessments need to answer

    Right now, educators are contemplating several questions as they strive to prepare students for the fall. This is particularly true when it comes to assessing the strengths and weaknesses of their students.

As an educator you most likely face a tsunami of testing: federally required tests, state-mandated evaluations, district benchmarks, and interim assessments. Plus, there are the formative assessments you need for determining whether students have mastered a concept or skill. And oftentimes these tests have overlapping purposes. But you can reduce testing redundancy with one simple tip: always ask yourself, what question is this data going to help me answer?

    If you don’t know the answer to that question, don’t give the test. That’s because data that could help you may already be available elsewhere. If that’s the case, you could be spending that testing time on instruction.

Also, if you don’t know why you’re administering a certain assessment, you likely won’t be able to quickly connect the results to your instruction or other relevant decisions. So you’ll have spent instructional time on a test, but you’ll be no better off than you were before. (Asking the question up front should save you some of that instructional time in the future.)

No matter what type of test it is (e.g., a screener, progress monitoring, etc.), what’s important is the question it will help you answer and the quality (e.g., reliability and validity) of the test as an answer to that question.

Of course, you won’t have much say when it comes to government-mandated tests (e.g., high-stakes state assessments), but understanding the question a state assessment is trying to answer (i.e., have students learned what they needed to learn for their grade level?) will help you interpret the results and any decisions made with them. And even states are thinking about alternatives to traditional summative assessments.

In this article, however, let’s focus on formative assessments (tests that inform instruction throughout the year). Some formative assessments overlap in content with state assessments, so they can also help you gauge how likely students are to perform well on the high-stakes version.

Three Major Questions When Using Assessments, Especially in This Post-Remote-Learning Era

    To make effective use of the assessments that you do choose to employ, ask yourself an additional three major questions:

    1. Where are my students in relation to where they “should be”?

    You can think about this in terms of grade-level benchmarks or risk level. We know there’s been interrupted learning. You will be determining how and to what degree that has impacted individual students in relation to grade-level standards.

    2. If they are behind, what are their areas of need?

Once you know which students are behind, you’ll need to know why and what their specific learning gaps are (a student could be struggling in more than one area). The answer to question one helps determine how intense the instruction needs to be to close the gap, and the answer to question two helps determine what you focus on during that instruction. Together, they let you build a profile of each student’s strengths and weaknesses.

    3. What do we do about it?

Testing just to check a box and say you tested is a waste of time unless you do something with the data. And doing something with the data means determining how to close the learning gaps, whether that means adjusting the focus of your instruction, the intensity of instruction, or, at the school or district level, the resources dedicated to certain areas.

    Some questions that fall under the general umbrella question of what to do about learning gaps are the following:

    • Do students need more time with the teacher so that they can be explicitly taught a particular skill?

    • Is there a small group of students who all need support in the same area and can be grouped for instruction?

    • Can you leverage a program that incorporates technology to be able to personalize instruction and provide guided practice or corrective feedback in certain areas? And can you leverage the data from the program to help inform your instruction?

    The Final Questions

    A powerful way to answer the questions in the previous section is by having formative assessments embedded within digital curriculum or products. For example, Lexia Learning’s Lexia Core5 Reading and Lexia PowerUp Literacy programs provide high-quality assessment data without teachers having to stop instruction time for tests.

    Whatever formative assessment you use, the data should be simple to interpret and actionable to enable you to alter your instruction. You can take data-driven next steps to change a student’s trajectory.

Nowadays, there remains uncertainty about whether learning will be blended or in person, so you need to consider whether your assessments can be administered remotely. That was a major challenge with state assessments during the last school year, and many states canceled their assessments. Here are some final questions to consider in the era of remote learning:

    • Can your assessment be given remotely?

    • Can it be given to a large group of students remotely or do you have to do it one on one?

    • Can you get that data quickly and remotely?

Once you have the assessment data and begin the task of addressing the learning gaps that accumulated over the last year and a half, the programs you choose matter. To make a real difference, teachers need tools and resources that are grounded in the science of reading and backed by evidence of efficacy. The questions above will help point you in the right direction.

    Dr. Liz Brooke, CCC-SLP, is Lexia Learning’s chief learning officer and former director of interventions at the Florida Center for Reading Research.
