Guest Blog by Brenda Moore and Kent Rockwell
Kent and Brenda currently teach science at Alpha Secondary in Burnaby, BC. Kent teaches junior science and Chemistry. He is very interested in new possibilities for assessment and hands-on ways to show learning. Brenda teaches junior Science, Math, and Physics. She is passionate about integrating her engineering background into activities that allow students to explore design thinking and develop their skills.
Looking for an innovative way to assess the curricular competencies in your secondary science classroom? We were inspired by the Ministry of Education’s Framework for Classroom Assessment to create our own “Case Study Assessment” (CSA) for junior science (see image above).
Following the Sample Application for Grade 9 Science, we sought out major thematic pieces within Science 9 and 10 to use as a foundation for each CSA. For Science 10, we chose acid-base chemistry, and for Science 9, we chose Earth's spheres. Links to our final CSAs can be found here. We'll take this opportunity to walk you through the process behind the final product!
Taking our Science 10 CSA as an example, we were looking for ways to assess students' development of curricular competencies through the content pieces of pH and indicators. Our vision was for a place-based narrative to frame the CSA and provide meaningful context for the assessment. Through brainstorming we came up with the idea of contaminated run-off, and quickly linked this to the historical operations at Britannia Mine. When we started researching Britannia Mine and its environmental impact on Howe Sound, we discovered that this issue had been studied for many years. We constructed the CSA by providing detailed and relevant background information, then dove into context-specific questions aligned with the Criteria Categories above. Students were expected to work within the narrative, analyzing the raw data to form realistic conclusions about the impact of the mining activities on the natural environment.
When we implemented this in our classes, students worked collaboratively on the CSA in pairs and were assessed using the proficiency scale (see image below) in each of the curricular competencies.
We observed that students were engaged, focused, and, most importantly, actively communicating with one another to develop their responses. Their partner discussions were rich and lively as they sought to support their ideas with evidence and reasoning. Students seemed more relaxed than during a traditional test, focusing on critical thinking and communication rather than rote facts and repetition. After the CSA, students began posing questions related to the narrative and were eager to learn how to improve their responses.
Going forward, our vision is for each Big Idea in Science 8, 9, and 10 to be assessed using a CSA in place of a traditional unit test. The open-ended nature of this assessment shifts student focus away from content to thinking processes and reasoning, which better allows the educator to assess student development aligned with the competencies.
Other Case Study Data Resources:
The Spring Intake for the BCScTA Roots Grant is underway. We are currently accepting applications until May 31, 2019. Please go to the following link to apply.
Communities of science teachers are the roots of the BCScTA.
The BCScTA is offering another two Science Roots Grants to its current members for the 2019-2020 school year, supporting collaborative groups in quality science education initiatives.
In recognition of this, the BCScTA is supporting several collaborative groups, consisting of BCTF members, by providing funding for quality science education initiatives. These initiatives can take many forms, and we are open to unique and creative submissions.
Examples of such initiatives include: a collaborative teacher group developing assessment tools or science lab activities; developing methods of integrating more inquiry-based approaches; an exchange or articulation between elementary and secondary science teachers so that both levels vertically align; a book club to expand on a teaching approach or idea; or sound approaches to authentically integrate the First Peoples Principles of Learning.
For this intake, Science World is kindly supporting the Roots Grant recipients with admission for up to 90 students from their respective schools and $750 each in transportation reimbursement.
For more information, contact email@example.com
Presenter, Exhibitor, and Sponsor applications are now being accepted.
2019 Science Games Steering Committee
Engineers and Geoscientists BC is looking for two educators to volunteer on our 2019 Science Games Steering Committee. This group develops the activities for our annual Science Games event in March. At the Science Games, students in Grades 1-6 work in teams to complete various hands-on science challenges. Division 1 activities are designed for students in Grades 1-3, and Division 2 activities for students in Grades 4-6. We're looking for educators to volunteer on the committee, provide insight on the science curriculum for these grades, and advise on how we can tailor our challenges so they are appropriate for these age groups. Learn more about this volunteer opportunity or apply online.
If you have questions about this volunteer opportunity, please contact firstname.lastname@example.org.
TRIUMF & The BC Association of Physics Teachers
are pleased to announce
“Kindling your passion for physics teaching”
Are you going to be teaching physics next year?
Come and be inspired by award-winning physics researchers and educators while networking with colleagues, sharing practical resources, and learning about physics applications.
A Conference and Workshops for Secondary Science Teachers
Provincial Pro-D Day, Friday October 19th, 2018 at TRIUMF, Vancouver BC.
We are delighted to announce this year’s keynote speaker, Officer of the Order of Canada,
Dr. Jaymie Matthews – UBC Physics and Astronomy
SAVE THE DATE!
Detailed program and registration information to follow in September 2018.
Fix 15: Don’t leave students out of the grading process. Involve students; they can and should play key roles in assessment and grading and promote achievement.
Brilliant. Happy to end this journey on a positive note. In my last post I was questioning whether to leave my students more "in the dark," so to speak, about assessment, but I'm pleased to see that this isn't a helpful practice. It's nice to see my instincts and/or training are largely congruent with Ken's 15 Fixes.
Unfortunately, with some of my students, I don't feel that their involvement in assessment promotes achievement... yet. This is more of a long-term cultural shift for them, I think, and it just takes time and consistency. Hopefully another year at the same school with the same crew of kids brings out that ownership of their own learning.
On that note I’m going to cut it short and say thanks for tuning in. I had fun with this project, and I always enjoy looking critically at my own practice. I strive to find better ways to engage my students and help them feel a sense of pride and curiosity around my lessons and their school. I think assessment in general is a real driver of both positive and negative associations with school, and I’m hoping that these 15 Fixes “rekindle the fire” so to speak about your own forward thinking in your classroom. Have a great summer!
Fix 14: Don’t summarize evidence accumulated over time when learning is developmental and will grow with time and repeated opportunities; in those instances, emphasize more recent achievement
Have I mentioned that this book is amazing, and that the lessons summarized are both intuitive and overwhelming at the same time? I have dived into the deep end; I am on board, and I have bought into these best practices for assessment. That said, organizing something like this, in my head or on my computer, is a seemingly difficult task. Maybe I just picture some of the parent-teacher interviews I have had. I am a very organized person, but my system looks completely disorganized to a parent, especially a critical parent wondering why their child isn't doing so hot. I usually have student names going along the Y axis and learning outcomes going across the X axis, and have somehow (even though it is on a screen or on paper) added a Z axis for time. I keep paper-based records of my day-to-day observations, and I update marks "that count" every few weeks in our system. There is a number system to show levels of understanding (1, 2, or 3); I have tried colour coordinating, and I have a variety of shorthand letters and symbols that mean various things. I try to annotate any missing assignments as they arise (to me, an excused absence for the dentist is different than sleeping in or skipping). I think relationships are more important than record keeping, but I also don't want to get caught with my pants down if a parent or principal has questions. It's a delicate balance...
At the end of the day, learning takes time, and I really try to be clear with my students that their practice doesn't count; taking risks in class is not going to bring their marks down. The term "weight" is a little confusing for some of them; even some grade 10s and 11s still don't get that a 20-question test at the end of a unit will drive their grade disproportionately more than a 20-question assignment. I think students should know how they are being marked, and understand the reasons behind it. You can't hit a bullseye if you don't know where it is. Maybe that is my problem? I'm too transparent with my students, and they don't have the maturity, vocabulary, or training to understand it? Should I just explain how the course will work on the syllabus and on the first day of class, and then leave more mystery to it all? Something to consider, I guess...
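To make the "weight" idea concrete, here is a minimal sketch of the arithmetic. The 70/30 split and every score below are hypothetical placeholders, not the actual breakdown of any course described here:

```python
# Hedged sketch: why a heavily-weighted test moves the grade more than an
# equally long assignment. The 70/30 weights and all scores are invented.

def final_grade(test_pct, assignment_pct, test_weight=0.7, assignment_weight=0.3):
    """Weighted final grade from a test score and an assignment score (0-100)."""
    return test_weight * test_pct + assignment_weight * assignment_pct

baseline = final_grade(70, 70)
ten_more_on_test = final_grade(80, 70)        # grade rises about 7 points
ten_more_on_assignment = final_grade(70, 80)  # grade rises only about 3 points
```

Same ten-point improvement on each task, but the test's weight amplifies it; that asymmetry, not the number of questions, is what "weight" means on the syllabus.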
Getting back to Fix 14 before I totally go off the rails. Maybe the difficulty I have scaffolding this comes from focusing on learning goals that are too big or too long-term. Is the learning goal "I know the locations and charges of protons, neutrons, and electrons in an atom," or is it "chemical processes require energy change as atoms are rearranged"? One goal is 60 minutes tops; the other is 60 days. Keeping a running tally of where a student is at, so to speak, is much easier to document for one goal than for the other.
So as usual, my reflection has both reaffirmed my philosophy about science education and assessment and totally twisted everything around backwards and upside down at the same time. Thanks, as usual, even though I don't mention it specifically, for any comments or insight you can provide. As a related side note, please do keep in mind the Canadian Assessment for Learning Conference & Symposium – Location in Delta – May 1-3. I will be there with bells on.
Fix 13: Don’t use information from formative assessments and practice to determine grades; use only summative evidence.
I think I have mentioned this before, but I promise I am not looking ahead; I just mentioned this yesterday! If it were me learning something new, I would only want summative evidence to determine my grade. If I had a (hypothetical) six-month performance evaluation at work scheduled, I would want to practice a specific set of skills and have my final mark based only on the end of my learning journey. I think a student learning introductory Biology or Chemistry at their grade level should be no different. Why would you want your mark to be an average of all your mixed results along the way?
The students, however, seem to think differently; maybe it is an age thing? My own daughter is in kindergarten. She would never have a final test on writing her name at the end of June; her grade is based on evidence of an ongoing evolution of her skills. Do primary students have their reporting based entirely on formative assessment, while by Grades 11 and 12 students have their grade based mainly on summative assessment? Upper intermediate and junior high school seem to be bridging the gap, from what I have seen. So how do I make it clear to a 14- or 15-year-old that giving a final grade based on formative checkpoints along the way isn't really fair to them?
Here is how I have orchestrated the conversation to get my students to "buy into" having a mark based mostly on summative assessment. Most of my kids are starting to drive when I teach them. To get your driver's license in BC, you have to go through the following steps under the Graduated Licensing Program:

1. Pass a knowledge test to earn your Learner's (L) license, and drive only with a supervisor.
2. After at least a year of supervised practice, pass a road test to earn your Novice (N) license.
3. After roughly two more years, pass a second road test to earn your full license.
Students see the safety implication of this analogy, and agree that they wouldn't want their ability to drive to be based on all of their early attempts "weighing down the average." So, back to the skills and knowledge being measured in a science class. I think the biggest psychological obstacle to basing a student's mark entirely on summative assessment is what students picture a test to be; maybe it is a "test" thing? Fifty or more multiple-choice questions, no talking, no phone, no binder. This environment is known to cause anxiety for many, adults and children alike. And is it really an accurate reflection of learning? Maybe for some students, but certainly not for all.
I really like the direction education is moving, whereby summative assessment is a conversation or a presentation of learning in a means of your choosing. Summative assessment can include self-assessment, because let's face it, students are much harsher critics than we are. When was the last time you studied for a written or multiple-choice test as an adult? Our "tests" are based mainly on observations and conversations; sure, there may be a score at the end, but it is a much more authentic picture of where we are at in a given context. Think of a job interview or checking in on your quarterly sales goals. Don't our students deserve the same time and respect?
Fix 12: Don’t include zeros in grade determination when evidence is missing or as punishment; use alternatives, such as reassessing to determine real achievement or use “I” for Incomplete or Insufficient Evidence
Ok readers, here we go, the final few fixes. Ironically, I feel a little like my students do this time of year, in that I'm ready to accept the zero on my final three or four tasks because I'm burnt out. Almost there, though; close to the finish line and ready to finish strong.
Throughout the school year I try to give students progress reports every two or three weeks, with "missing" as a placeholder if they forgot to hand an assignment in, or chose not to do it for whatever reason. I have two very different students in mind who, by June 22nd when our report cards were due, had nearly half of the term's assignments missing. I did not put any zeros in; both students were made aware of the catch-up day on Monday, but I did not make it mandatory for either of them. Both students took a final unit exam: one student got 50% on the test, the other got 80%. I am pretty confident that this is a realistic evaluation of how much each student understands of the subject. I left the holes in my mark book as holes and posted their grades based solely on this final test.
This particular class is Science 10; my breakdown is 30% formative assessment (labs, assignments) and 70% summative (chapter quizzes and unit tests). I actually consulted the class, and we came up with this breakdown collaboratively. I described what formative and summative mean to them, and explained that I want to help them learn as much as possible during instruction, then measure it at the end; that, to me, is the point of assessment. Many students want some credit for "practice" assignments, and realise that daily work will (usually) bring both their mark and their level of understanding up. It's interesting to me that they don't see their mark as the same thing as their level of understanding. Additionally, many students feel anxiety around tests and don't feel it is fair to base the whole course on summative assessments. It was interesting for me to hear their opinions about assessment, and I try to be honest and responsive to them so we can come up with a fair plan for everyone.
I'm curious now to run the numbers on my 80% and 50% students, to see how much completing their missing assignments would affect their mark; I can only speculate how much it would affect their level of understanding. The potential disconnect between the two is something I will definitely keep in mind as I plan for next year...
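Out of curiosity, here is a rough sketch of what "running the numbers" might look like. The 30/70 split matches the class described above, but every individual assignment score below is an invented placeholder (`None` marks a missing assignment), so this illustrates the mechanism rather than either real student:

```python
# Rough sketch of a weighted course grade with missing assignments.
# The 30/70 weighting matches the class described above; all individual
# scores are invented placeholders, not real student data.

def course_grade(assignment_scores, test_pct,
                 formative_weight=0.3, summative_weight=0.7,
                 missing_as_zero=True):
    """Weighted course grade; missing assignments (None) either count
    as zeros or are dropped from the formative average."""
    if missing_as_zero:
        scores = [s if s is not None else 0 for s in assignment_scores]
    else:
        scores = [s for s in assignment_scores if s is not None]
    formative = sum(scores) / len(scores) if scores else test_pct
    return formative_weight * formative + summative_weight * test_pct

# Hypothetical student: half the assignments missing, 80% on the final test.
assignments = [75, None, 70, None, 80, None, 65, None]
with_zeros = course_grade(assignments, 80)
holes_left_as_holes = course_grade(assignments, 80, missing_as_zero=False)
```

Under these made-up numbers, counting the holes as zeros pulls the grade down into the 60s even though the test said 80%, while dropping them keeps the grade near the summative evidence: the disconnect between "mark" and "level of understanding" in a nutshell.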