The goal of these quarterly assessments is for students to practice taking a test like the state test they will take in May. They are also a place to look at data: the Scantron results get analyzed by administration all the way up to the superintendent. This process is new to us this year. It doesn't really fit with my Standards-Based Grading approach, but high-stakes tests are a reality that's front and center right now, because one school didn't meet Adequate Yearly Progress, which means the whole district is labeled "at risk." Point being, we have to give exams that look like state exams.

We started at the last meeting by discussing what we have studied so far this quarter and what we will get to by the end of it. There are three levels of geometry with different books and different teachers, and we all have different approaches. It's interesting to see what our common ground is and how to negotiate being in approximately the same place at the end of each quarter. Once we had a list of topics, we headed to ProblemAttic. This site is a great resource because it has problems pulled from our state exam (MCAS in Massachusetts) as well as other states' exams, organized by topic. Then it puts everything into a nice format for you at the end! (The formatting is the only thing missing from our state's own site.) We went through the relevant topics and added all of the problems that were acceptable. The last step at this meeting was to save the PDF of 82 problems to the shared folder we have on Dropbox. (If you don't have Dropbox, ask someone for an invite: it gets both of you extra free space!)

When I arrived at today's meeting, I found all 15 pages printed and taped to the dry erase board, labeled by topic. It was a beautiful sight; I wanted to photograph it but didn't want to be that nerdy. (I know, if I can't be nerdy in the math department something's wrong, but I was feeling shy.) The six of us then went through each section, identifying problems that were too easy, too hard, or redundant. We debated the relative merits of the problems and how they align with our courses.

"This one has students determine that a rectangle is a parallelogram, while this one gives them diagrams of the same shapes and asks what they have in common."

"I spent a lot of time on always/sometimes/never problems, so I'd rather the first one."

"Done!"

By the time we finished this process we were down to 48 problems (from 82). I was worried this was still far too many, because we have to give the same exam to everyone, including my kids on IEPs. Another teacher made a good point: many of the questions only require use of definitions, with no calculations needed. To see how many questions fell into that category, we grabbed a marker and put an orange smiley face next to any problem that shouldn't require kids to put pencil to paper (sketching examples is always a good idea, but on a summative exam most of these problems would be quick). Then, because we'd already started color coding, we put pink neutral faces next to one-step problems and blue sad faces next to multi-step problems. By the time we had finished I was okay with the number of problems, but still a bit worried about the stamina of my lower-level kids.

Another great idea: put the problems in order of difficulty, like the SAT. That way students would feel confident at the beginning, and anyone who tends to shut down once they see one problem they don't know how to do would have a greater probability of getting a good chunk of the test done. I'm almost happy, except for one lingering question: we only sorted the multiple-choice questions, and I want everyone to spend time on the open response. I'm still deciding whether to have them start with the open response, hand that section out after an hour, or give them the choice but require that they write something for each problem before handing it in. Something to discuss at the next meeting.

This process took a lot of time and isn't feasible for every assessment we give, but it was a great way to spend our professional development time. We got to talk about our teaching, what concepts we want to emphasize, how to assess them, and, implicitly, what we value. Everyone got a chance to analyze the questions, which will make the data we get back so much more meaningful. (That was the issue I had with the midterm data; there wasn't time to go through this process, so I didn't have any investment in the questions.) I'm interested to see how it plays out, especially whether the problems students find most difficult match the ones we expect to challenge them.
