
Category: Assessment

District Support for Classroom Assessments

School district administrators who set clear guidelines for the use of classroom assessments and provide meaningful professional development in assessment literacy are also making the information that flows into data collection tools, such as data dashboards, more useful for educational decisions.

But expensive data dashboards are only as valid as the information that goes into them. So how can districts make sure their digital technology tools are collecting and reporting high-quality information, particularly with respect to classroom assessment data?

Assessment experts argue that districts need to invest time and money to support teachers as they design classroom assessments aligned to the curriculum and standards.[1] One obstacle is that real and perceived validity problems of large-scale assessments in the wake of No Child Left Behind have “created a backlash against the validity of testing in general.”[2] Anti-testing sentiment has been so high that some teachers have been tempted to scrap assessments and grades altogether. Such actions make data collection tools less useful for educational decision-making and remove an essential piece of the feedback loop that gives students what they need to improve.

Steven Turner, a psychology teacher at Albemarle High School (Virginia), argues that students deserve information about their progress and that whether we change traditional grading percentage structures, or call grades something else, “we still need an effective system to measure student growth and performance.”[3] In a follow-up email to his article, Mr. Turner elaborated on how he thinks districts can support teachers as they develop and use classroom assessments.[4] What follows is an annotated summary of that email.

Clear guidelines from the district. Turner maintains that unless a division superintendent takes measurement of student growth and achievement seriously, assessment in the classroom will be variable and ineffective. The Michigan Assessment Collaborative has developed detailed standards for assessment literacy that reinforce Mr. Turner’s point. According to the standards, district administrators should believe in the need for “uniformity in assessment expectations and practices across buildings.”[5]

Explain the purpose. When asked about potential conflicts regarding district policies on classroom assessment and teachers’ expectations of autonomy in their classrooms, Mr. Turner suggested that “teacher pushback against district mandates comes from specific policies or requirements without first understanding the purpose.” He went on to say that most teachers have to provide instruction within a set of curricular guidelines that leave room for individual autonomy. In his view, it’s not unreasonable to think that a division could set grading and assessment guidelines that serve the same purpose.

Assessment literacy professional development. Mr. Turner also observed that teachers often have very little training in assessment and grading and that professional development is frequently focused on improving instruction. The result is that educators “often fail to see that a strong understanding of assessment can improve instruction.” He suggests that central office folks who work in assessment, research, and/or accountability should work together with departments of instruction to find the best way to make sure that teacher knowledge about assessment—including the foundational concepts of validity and reliability—makes it into the classroom.
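
As a concrete illustration of one of those foundational concepts, the short Python sketch below estimates internal-consistency reliability (Cronbach's alpha) for a classroom quiz scored item by item. The scores, function name, and scenario are hypothetical examples of the kind of analysis assessment-literate educators might discuss; they are not drawn from Mr. Turner's email or the Michigan standards.

    import numpy as np

    def cronbach_alpha(item_scores: np.ndarray) -> float:
        """Internal-consistency reliability for a students-by-items score matrix."""
        n_items = item_scores.shape[1]
        item_var = item_scores.var(axis=0, ddof=1)        # variance of each item
        total_var = item_scores.sum(axis=1).var(ddof=1)   # variance of students' total scores
        return (n_items / (n_items - 1)) * (1 - item_var.sum() / total_var)

    # Hypothetical quiz: 6 students x 4 items, each scored 0-3 on a rubric
    scores = np.array([
        [3, 2, 3, 2],
        [1, 1, 2, 1],
        [2, 2, 2, 3],
        [0, 1, 1, 0],
        [3, 3, 2, 3],
        [2, 1, 2, 2],
    ])
    print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")

Values closer to 1 suggest the items hang together as a measure of the same construct; a low value is a signal to revisit item design before trusting the total score.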

[1] How to Plan for a Balanced Assessment System While Keeping Curriculum in Mind

[2] Unbalanced Literacy

[3] Grading? Why Bother

[4] Steven Turner, Albemarle High School (Virginia) teacher, email message to author, February 14, 2019.

[5] Assessment Literacy Standards: A National Imperative, p. 12.

Why Use Math Performance Tasks, with David Foster and SVMI

Performance task assessment as an educational tool is receiving fresh interest in the context of unhappiness with multiple-choice assessments.[1] For David Foster, Founder and Executive Director of the Silicon Valley Mathematics Initiative (SVMI), performance task assessment is much more than another passing educational fad. Since forming the Mathematics Assessment Collaborative (MAC) in the 1990s, he has been a champion of using mathematics performance task assessment to improve both student learning and teacher instructional practices.

The work of SVMI/MAC was used as an extended example in the 2013 publication, “Teacher Learning Through Assessment: How Student-Performance Assessments Can Support Teacher Learning,” by Linda Darling-Hammond and Beverly Falk.[2] The article describes how teachers in MAC member districts use research-based design principles to write performance task assessments aligned to the Common Core State Standards (pages 11–15). Students across all districts take common assessments, and teachers engage in the rubric-scoring process as professional development. Assessment reports include a reproduction of the tasks, scoring rubrics, and examples of real student work, all of which can inform classroom instruction.

To learn more about SVMI’s performance task assessment, Educational Data Systems staff took an opportunity to discuss the topic with David Foster and two other SVMI leaders, Cecilio Dimas, Partner and Director of Innovation & Strategy, and Tracy Sola, Assistant Director. Unless indented to indicate a direct quote, the text for this blog post is an edited but closely paraphrased version of our discussion.

EDS: Would you tell us about the history of why and how SVMI/MAC has supported performance task assessment for math education?

David Foster (DF): The story begins with the English-language arts (ELA) response to a 1975 Newsweek magazine article, “Why Johnny Can’t Write.” Our English teacher colleagues argued that the assessments at the time stressed grammar, vocabulary, and spelling. Writing was not being assessed, so it wasn’t being taught.

That was the birth of the Bay Area Writing Project to assess writing using a prompt, with humans using a rubric to do the scoring. It dramatically changed the way we teach writing in this country and also dramatically changed the way we did professional development. Professional development started to be about long-term ideas for developing good techniques for teaching writing and using writing assessments to review and inform teaching.

Teaching mathematics is about teaching problem solving. The parallel to ELA is to give kids good problems to solve, be able to look at their work, be able to score collectively, and be able to use it formatively to improve instruction.

In the mid-1990s, California was involved in “math wars,” and the governor was impressed with what was happening in Texas (see below for context information). The political battle ended in 1997 with CA adopting new standards in English and mathematics, and the state needed new tests to assess the new standards. Developing a new statewide assessment [from scratch] takes five years, so the state chose an existing off-the-shelf multiple-choice assessment.

I was concerned that a multiple-choice test would not provide robust information about students. We had started the Mathematics Assessment Collaborative (MAC) in 1996 and had partnered with the University of Nottingham Mathematics Assessment Resource Services (MARS) to promote research and design a performance assessment instrument. The idea was to use performance task assessment to be able to look at students’ thinking, see how they approached the problem, and see how they were communicating their understanding. Essentially the math performance tasks would parallel a good writing task.

Cecilio Dimas (CD): The original call to action for the 23 MAC member school districts was to discuss other assessment options above and beyond the multiple-choice test. Santa Clara and San Mateo counties [CA] were original MAC members, and although membership has grown in geographic scope and fluctuated through time, the goal remains the same.

DF and CD: SVMI/MAC members administered the first MARS-developed performance-based tests for grades 3, 5, 7, and 9 in the spring of 1998 and expanded to grades 3 through 10 the following year. In 2004, grade 2 performance tasks were added. Teachers in the SVMI/MAC member districts took over writing the tasks in 2012 and continued to use MARS task design principles, aligning the new tasks to Common Core State Standards. SVMI/MAC continues to expand; for example, we have added “Integrated” mathematics options for high school students.

Tracy Sola (TS): Teachers ask for K–1 performance tasks, but we know that it is not developmentally appropriate for those young students to sit down and take a long written test. We have developed K–1 tasks, but they are used differently in classrooms—for example, as whole-group lessons or in individual interviews.

EDS: Will you expand on why performance assessment is vitally important to teachers, students, and parents?

DF: We’d all be better at teaching and learning if we focused a whole lot more on student thinking and student work. Far too often we focus on just what we’re supposed to cover or teach.

Performance task assessment gives us detailed information about what students know and how to build on that to meet the learning goals. [This] understanding helps us be far more effective in addressing the learning needs of those students.

CD: Performance tasks create a space for a lot of student thinking to surface. Even if students are struggling, that’s also usable data.

DF: Typical reports from multiple-choice tests give you a score that tells you only that your kids aren’t very good at fractions (for example). Of course, we want correct answers, but that is a byproduct of what we want students to know and be able to do. What we really want to know is the process and the thinking that goes along with it. What is helpful is to understand what they do know about fractions and where there are misconceptions. Performance assessment provides that information.

For more information about the Silicon Valley Mathematics Initiative, please visit https://svmimac.org/.

FOOTNOTES

[1] There are better ways to assess students than with high-stakes standardized tests
    What Happens When States Un-Standardize Tests?
    Assessment Flexibility for States under ESSA: Highlights from New Hampshire’s Innovative Assessment Application
[2] Teacher Learning Through Assessment


David Foster

David Foster is the executive director of the Silicon Valley Mathematics Initiative (SVMI), which comprises over 160 member districts in the greater San Francisco Bay Area. Besides its intensive work in California, SVMI consults across the country, including in New York, Illinois, Massachusetts, Ohio, Tennessee, and Georgia. SVMI is affiliated with programs at the University of California, Berkeley; Stanford University; and San Jose State University. David established SVMI in 1996 while working as Mathematics Director for the Robert N. Noyce Foundation. SVMI developed most of the content (videos, POMs, MAC Toolkits, Coaching Materials, etc.) that is available on www.insidemathematics.org. Foster is the primary author of Interactive Mathematics: Activities and Investigations, published by Glencoe/McGraw-Hill in 1994. David was a Regional Director for the Middle Grade Mathematics Renaissance of the California State Systemic Initiative. He taught mathematics and computer science at middle school, high school, and community college for eighteen years.


Cecilio Dimas

Before joining SVMI as Partner and Director of Innovation & Strategy in 2016, Cecilio was the Director of the STEAM (Science, Technology, Engineering, Arts, & Mathematics) Initiative at the Santa Clara County Office of Education (SCCOE). Prior to leading the STEAM Initiative, he served as a mathematics coordinator at SCCOE, where he supported districts in their efforts to implement the Common Core State Standards for Mathematics. He has taught at both the elementary and secondary levels, including 2nd grade, 3rd grade, 5th grade, 7th grade math, and Algebra I. SVMI has played a vital role in his development as a learner, teacher, coach, and facilitator. He enjoys working with students and teachers and believes that fostering the development of critical-thinking, collaboration, and communication skills is essential for all students to thrive in our local, national, and global communities.



Tracy Sola

Tracy Sola has been the Assistant Director of the Silicon Valley Mathematics Initiative since 2015. Tracy has worked with SVMI to deliver professional development to teachers, coaches, and administrators since 2008, facilitating professional development experiences across the San Francisco Bay Area, Southern California, New York, Georgia, and Illinois. Tracy also directed the SVMI Lesson Study Project for many years. On behalf of SVMI, Tracy collaborates with Arizona State University; the University of California, Berkeley; and The Shell Center, University of Nottingham, UK, in the development of electronic versions of mathematics curriculum. Tracy coached K–8 mathematics and taught grades K–8 for 16 years.

2018 Assessment Resource Roundup

The end of the calendar year presents a natural time to reflect on resources that we’ve found to be interesting, informative, or thought provoking. This blog provides a list—in no particular order—of five websites related to K–12 assessment that caught our interest in 2018. We hope you will find these links helpful, and Happy New Year!

The Center for Assessment. The National Center for the Improvement of Educational Assessment, widely known as the Center for Assessment, recently celebrated its 20th anniversary. Its blog, “CenterLine,” provides helpful information on a broad range of assessment-related topics, from statewide accountability to how teachers use assessment data to improve instruction.

Retrieval Practice. “Retrieval practice” is a learning strategy that can involve quizzing or low-stakes assessment as a way to recall information. There are many examples of blogs devoted to retrieval practice and how it relates to classroom assessment. We’ve chosen to highlight the following post by Aidan Severs because it provides concrete tips for teachers as well as links to resources explaining the cognitive science behind retrieval practice. Although this particular post highlights non-quiz tips for teachers, it includes plenty of useful information related to classroom assessment.

Polls on the Use of Educational Data. The Data Quality Campaign released the results from two polls about the use of educational data—one poll of parents and one of teachers. Complete with infographics, this post summarizes the findings of both polls. While many parents value educational data, teachers report that they don’t have enough time in their school days to make effective use of data.

Using Test Data to Devise Instructional Plans. Speaking of teachers using assessment data, Brian Bushart, the Curriculum Coordinator for Elementary Mathematics in Round Rock ISD (TX), wrote a blog post with an excellent example of using test data to devise instructional plans. He argues that reviewing items and student responses will yield “…a truer picture of what’s challenging your students so that you can more accurately target with what and how to support them.”

Including Teachers in Item Writing. Teacher Development published an open-access article on making large-scale assessment more relevant to educational practice by making it a part of teacher professional development. Despite the usual limitations of self-reported data and selection bias, authors Corey Palermo and Margareta Maria Thomson provide some evidence that involving teachers in item writing and review can be part of an effective professional development program.

Local Assessment Administration Methods: Where Does Your School District Fit In?

In preparation for the 2018 California Educational Research Association (CERA) annual meeting, Educational Data Systems developed and administered a brief survey of local assessment administration practices in California public school districts. The conference theme was “Readiness Across the Ages: Sharing Ideas, Connecting Systems, and Creating Solutions.”

We wanted to investigate the theme of “readiness” with respect to local assessment. At a minimum, we think readiness in educational assessment means that schools, teachers, and students will be ready

  • for assessments to provide valid measures of what students know and can do;
  • for online, computer-based statewide assessments; and
  • for using assessment data to improve classroom learning and instruction.

In practical terms, how ready are students for any type of assessment format? How ready are district educators to prepare students for statewide assessments and to use local assessments to inform classroom instruction? To begin forming answers to these questions, we developed a short survey.

We sent an email survey invitation to 860 California educational assessment professionals who have made their way onto our email list in the past year. The survey consisted of only three main questions: the first regarding types of local assessments that are administered with paper-and-pencil (P&P); the second about types of local assessments given in a computer-based or online testing (CBT) format; and the third concerning plans to change assessment administration methods in the future.
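
To make the structure of those three questions concrete, here is a minimal sketch of how individual responses could be recorded and tallied. The records, field names, and response labels are hypothetical placeholders, not our actual survey instrument or data.

    from collections import Counter

    # Hypothetical response records mirroring the survey's three questions:
    # assessments given on paper (P&P), assessments given online (CBT), and future plans.
    responses = [
        {"pp": ["benchmark"], "cbt": ["interim"], "plan": "no change"},
        {"pp": ["benchmark", "diagnostic"], "cbt": [], "plan": "move some P&P to CBT"},
        {"pp": [], "cbt": ["benchmark", "interim"], "plan": "no change"},
        # ...one record per respondent
    ]

    n = len(responses)
    plan_counts = Counter(r["plan"] for r in responses)
    uses_both = sum(1 for r in responses if r["pp"] and r["cbt"])

    for plan, count in plan_counts.items():
        print(f"{plan}: {count}/{n} ({count / n:.0%})")
    print(f"Use both P&P and CBT: {uses_both}/{n} ({uses_both / n:.0%})")

Simple tallies of this kind are what produced the summary percentages reported below.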

Our survey received 102 responses, and we used the answers to inform our conference presentation, Ready for Any Format: Use Online and Paper Assessments Effectively and Efficiently. The topic drew an engaged audience of CERA conference participants to our session. Although the survey results should not be taken as scientific evidence of broad trends, for our respondents:

  • Benchmark and interim assessments were the most common types of local assessments
  • Most districts use both P&P and CBT administration methods for local assessments
  • About 60 percent reported no plans to change local assessment administration methods
  • Almost 40 percent reported that they plan to change at least some assessments from P&P to CBT
  • Practice for online statewide assessment was the most common reason given for switching from P&P to CBT

If you are wondering how local assessment practices in your district compare to those of our respondents, or if you would like to see more details of our survey and the results, you may access the presentation here.

Join Us at the California Educational Research Association (CERA) Conference

We are pleased to announce that Educational Data Systems will be participating in this year’s California Educational Research Association conference. Educational Data Systems has a long, rich history of contributing to CERA. This year’s conference will be held on November 12–14 at the Disneyland Hotel, and the theme is “Readiness at All Ages.”