
Modelpalooza 2014

Here in Ohio, we have a very robust and active community of modeling teachers. Each year, under the careful direction of Drs. Kathy Harper and Ted Clark, we offer three different workshops (physics, chemistry and advanced) in Columbus as well as three follow-up weekend events throughout the rest of the year. The final follow-up of the school year is Modelpalooza, a day that brings together recent workshop graduates with experienced modelers from around the state. Often, experts in the field, such as David Hestenes, will come and present a talk, and in the afternoon, the advanced workshop graduates present the modeling units that they worked together to develop.

This year’s Modelpalooza was yesterday, March 1st. It remains one of those amazing professional development opportunities where I continue to learn and be exposed to new ideas despite having used modeling instruction for the past six years. We began the day by sharing modeling success stories. This is important for the recent workshop graduates, as it not only gives them a chance to hear from experienced modelers but also lets them share and acknowledge the transformations they’ve seen in their classrooms over the past six months.

Rather than bring in a national speaker for this year’s Modelpalooza, we decided to take advantage of the wealth of experience right here in Ohio. Experienced modelers presented break-out sessions in the morning to share their own work. We had sessions on CASTLE Electricity by Holly McTernan, Adjusting Modeling for Different Student Abilities by Joe Griffith and ??? (apologies, mystery presenter), and Standards-Based Grading and Modeling by myself. I had really wanted to see Holly’s CASTLE session, as I have no direct experience with the CASTLE materials, but my own session went pretty well. The crowd had heard of SBG but was relatively new to it. The presentation was primarily a walk-through of my own adoption of SBG and what the transition was like in my modeling-based physics classes. A few folks asked for my presentation, so I’m putting a link to it below.

Standards Based Grading & Modeling Presentation – Modelpalooza 2014

The highlight of the day for me, though, was Dr. Harper’s presentation during the second breakout session. While Dr. Clark was presenting on PhET Simulations in the Chemistry Classroom, Dr. Harper shared a review of the literature on Expert-Novice Comparison in Problem Solving and Alternate Problem Types. I tweeted some of the research-based conclusions that most surprised me, and for those folks who were asking follow-up questions, I’ve included the literature review handout below. The conclusions that stood out to me were the following:

  1. Transferring math skills into physics is more difficult than transferring them into any other area. This is the Bassok & Holyoak paper below. Does this seem counterintuitive to anyone else? I would have assumed that physics (arguably the most mathematical of the sciences) would allow for a natural flow of knowledge and skill from one domain to the other. I haven’t had time to get hold of the paper yet, but I plan to.
  2. Students plug numbers into physics problems to free up memory slots. Cognitive research shows that most people have about seven working-memory slots in which to hold a piece of information for short-term retrieval. Now take a look at one of the introductory kinematics equations – \vec{x}= \vec{x}_{0}+\vec{v}_{0}\Delta t+\frac{1}{2}\vec{a}\Delta t^2. If we treat each variable as a distinct piece of information, that’s five symbols to interpret and keep track of. Your less mathematically adept students will carry even more as they work to recall that delta means “change in,” and so on. By substituting numerical values into the equation early, students lighten their cognitive load so that it’s available for other tasks (a short worked example follows this list). Dr. Harper suggests that we not push students to solve algebraically too early in their physics courses: show them the power of it, keep revisiting it, and work them toward this skill once they have a stronger conceptual understanding. If I recall correctly, this is the Sweller paper.
  3. If a problem takes a high school student more than 12 minutes to solve, they think it is impossible. I believe this is the Larkin paper. We’ve all had this experience with our students, but having a hard-and-fast number that I can tell my students to watch out for will make it easier for me to help them develop good problem-solving habits.
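
To make the memory-slot point concrete, here’s a quick worked example with numbers of my own invention (not from Dr. Harper’s talk). Working in one dimension so the vector arrows can be dropped, solving the kinematics equation for \Delta t symbolically means dragging all five symbols through the quadratic formula:

\Delta t = \frac{-v_{0}+\sqrt{v_{0}^{2}+2a(x-x_{0})}}{a}

Substituting x_{0}=0, v_{0}=4 m/s, a=2 m/s², and x=30 m up front instead collapses it to

30 = 4\Delta t+\Delta t^{2} \quad\Rightarrow\quad \Delta t^{2}+4\Delta t-30=0 \quad\Rightarrow\quad \Delta t \approx 3.8\ \text{s}

where \Delta t is the only symbol left in play. Both paths reach the same answer, but the second leaves working memory free for the physics.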

The afternoon sessions consisted of the advanced workshop participants presenting on their work from the previous summer. This year we had groups working on circuits for physical science, a really cool equilibrium model for chemistry, and a rates-of-change model using wine glasses. I am always impressed by the level of thought and work that the presenters have put into their units. We always recommend that modelers take the second-year workshop, as it helps them move beyond the initial modeling curriculum provided in the first year.

With Modelpalooza 2014 behind us, we move into prepping for the next round of summer workshops. You can find out more about the workshops over at the AMTA website.

Standards-Based Grading: NCGE Presentation

Tomorrow, my colleagues and I will be presenting a session at the National Conference on Girls’ Education. Our session is titled “Bet on the Yet! Promoting a Growth Mindset” and consists of the four of us each taking about 15 minutes to discuss methods we have developed within our classrooms to promote growth mindsets among our students. I’ll be discussing standards-based grading, which I feel has had a sizable effect on how my students view themselves as learners. This is a version of the presentation we gave at the 21st Century Athenas Symposium through the Center for Research on Girls at Laurel School.

Rather than printing numerous copies of the one-sheet to hand out, I’m posting the handout here. Hopefully, folks from the NCGE conference who want to start researching SBG will give it a look, as it contains links to a number of great resources that informed my adoption and use of standards-based grading. In time, I hope to add posts about working through the adoption of SBG and the versions I’ve gone through.

Multiple Choice

How exactly does one use multiple choice and SBG/SBAR together?

a. Make a problem worth either a 4 or a 1.

b. Make each answer to a question correspond to a score from 1 to 4.

c. You don’t. Multiple choice is evil.

I hadn’t even considered this until around November of this year, when my first AP physics test was approaching. I like to format classroom tests similarly to the actual AP exam so that my students have exposure to the style of test they can expect to encounter on that fateful Monday in May. So, I let the class know they could expect approximately 20 multiple choice questions on the test.

“Umm Mr. C…how are you going to grade those?”

“Uhhh…I hadn’t thought of that.”

How do you grade multiple choice in SBG? Traditionally, the problem is either right or wrong. That seems ridiculous. If I want to use these on formative assessments, a student should be able to use her choice of answer to inform herself about what she does and does not understand about the topic. Sure, a 4 for a correct answer lets her know she understood it, but a 1 for any of the other four choices on a question doesn’t fairly report the range of knowledge students display. She may have chosen the wrong answer with switched signs, while a classmate chose the answer that confused energy and momentum equations. Students need to know that difference from the feedback I give them. So, I can’t choose option a in my question above.

Prior to this year, I would have chosen c. Sure, when I first started teaching, I used multiple choice for efficiency. After about two years, I realized that I could learn much more about what my students knew by phrasing the same problem as a short-answer question. Requiring them to write a short description revealed which words they didn’t understand and which concepts were turned around in their heads. So, I swore off multiple choice for the next few years, only making an exception for AP for the reasons outlined above.

This year, though, I’m mentoring a new teacher in our school. She and I have had a few conversations about multiple choice questions and how, when properly written, they can do all of those things I wanted from my short answers. This requires time and planning on my part, rather than just trying to find four other choices to slap on the page. Inspired by our discussions, I decided to try it first on a short quiz prior to the test. Each of the four incorrect choices to a problem was written to reveal a common mistake or misunderstanding; the more severe the misunderstanding, the lower the score.
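
For anyone curious what the bookkeeping looks like, here’s a minimal sketch (the question, choices, scores, and feedback notes are all hypothetical, not pulled from my actual quiz):

```python
# Hypothetical rubric for one question: each choice carries an SBG score
# (1-4) plus a note naming the mistake that distractor was written to catch.
CHOICE_RUBRIC = {
    "q1": {
        "A": (4, "Correct."),
        "B": (3, "Right approach, but the signs were switched."),
        "C": (2, "Applied the energy equation where momentum was needed."),
        "D": (1, "Treated the centripetal force as pointing outward."),
        "E": (1, "No recognizable method behind this choice."),
    },
}

def grade(question: str, choice: str) -> tuple[int, str]:
    """Look up the SBG score and feedback for a student's answer."""
    return CHOICE_RUBRIC[question][choice.upper()]

score, feedback = grade("q1", "c")
print(score, feedback)  # 2 Applied the energy equation where momentum was needed.
```

The point is that the distractors, not just the key, carry information: two students who both miss the question can walk away with different scores and different feedback.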

I tried it out, and I feel the results are mixed. When we went over the quiz in class, the mistakes students made were those that I had anticipated. However, I began wondering whether this accomplishes what I set out to do. Without work being shown, a student may have just guessed the answer that corresponds to a 4. Additionally, I feel like I’m laying traps for my students to walk into by purposefully including tempting answers that look close to the right one. “Ha! I knew you thought centripetal forces pointed outward!” I’m not sure how that sits with me. At the same time, I think that seeing physics questions in this format helps them score better on the AP test. Ultimately, I may just set up these multiple choice quizzes as ungraded test prep/review. I’m lucky enough that my students will still take them seriously.

My SBG Start

After spending the end of last year and most of the summer reading about standards-based grading, I decided to jump on the SBG express. Like a travelin’ hobo with my bindle full of dreams of meaningful grades, I found a traveling companion (the other physics teacher in my department), and together we set off for the grading frontier. Our admin team was very supportive and gave us plenty of freedom to develop the assessments that we feel will benefit our students’ learning. During the summer, we met to share our standards for our first units and to discuss any problems we anticipated might crop up.

School started two weeks ago, and we’ve both given our first assessment under SBG. Things have gone well (though I have yet to hand back the quiz), but I want to share a couple of the pitfalls we have encountered and how we’re choosing to deal with them.

Multiple grades for a single standard on one assessment.

Often, a quiz will contain multiple questions that address a single standard. Should we grade them all as a single attempt, average the scores together, or use the score on the last (i.e., most recent) problem? Maybe I should just write smaller assessments, but physics is such a cumulative science that standards are going to pop up over and over. I’ve decided to use all of the attempts on a single assessment in determining the score I report to students. I think that gives them a more representative view of their current knowledge and avoids rewarding a coincidental correct answer.
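
For what it’s worth, here’s a toy sketch of the candidate rules (the scores are invented, and “averaged” is just my shorthand for one way of using every attempt):

```python
# Two ways to collapse several attempts at one standard on a single
# assessment into one reported score.
from statistics import mean

def most_recent(scores: list[int]) -> int:
    """Report only the last problem that touched the standard."""
    return scores[-1]

def averaged(scores: list[int]) -> int:
    """Fold in every attempt, rounded to a whole SBG score."""
    return round(mean(scores))

attempts = [2, 3, 4]          # three problems hitting the same standard
print(most_recent(attempts))  # 4 -- a single lucky answer dominates
print(averaged(attempts))     # 3 -- earlier attempts damp the fluke
```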

General problem solving (arithmetic, algebra, units, sig figs, etc.)

So, what do you do with all of the math background skills that a student is supposed to be proficient in when entering your physics class? Previously, I’d penalize a student half a point for every missing unit or careless algebra mistake. With SBG, I can’t in good conscience lower a student’s score on “Propagates error in sums/differences” when they forget to divide by 2π. Instead, I created a separate standard called “Practices good problem solving,” which is my catch-all for the math and technical skills needed to succeed in physics that I expect students to have when they enter the class. Additionally, if a student receives a low score here, they know exactly which area to focus their efforts on.

The above is certainly not exhaustive, and I imagine I’ll have more to share in the future. We’re still working out how standards scores translate to percent/letter grades and how semester exams affect grades, but thus far, I think we are both happy with where this is leading.