Category Archives: Assessment

Flippity flash cards

I like tools that are flexible and easy to use. So while recently looking for a flash card tool to use with my students (there are many!), I found that lots of them required registration, were pretty but limited in what they did, or wouldn’t allow images or text formatting. Formatting is important – I teach science, so I use a lot of subscripts and superscripts.

I was pleased to find (somewhere on page three or four of my search) flippity.net. This simple tool uses a Google Sheet to generate flash cards. You grab a template, enter your information in two columns (the two sides of the flash card), along with colour formatting if so desired. Publish your spreadsheet, grab the link, paste it into the second page of the sheet, and voilà – instant flash card set. No sign-in required.

[flashcard image]

There are ads displayed prominently on the site, which may make it unsuitable for some. But as a simple, no-login flash card game/study aid that allows HTML formatting and embedded URLs for images and video, I think it has a lot of potential. I could see having students easily generate flash cards for themselves and each other as well.

The site also has templates for generating quizzes, a Jeopardy-style game, a name picker, and a progress indicator, so there is plenty more to explore.


More on my trip down the SBG road

I wrote earlier about my decision to go SBG, and my early observations of implementation. Well, at about the halfway mark of the year I compiled my thoughts about it and put them into a video. So if you have a few minutes, let me walk you through my experience so far:

The straw that fixed the camel’s back – Moving to SBG

I am always on the lookout for ways to improve my courses. Recent(ish) innovations include flipped learning, layered curriculum, modelling, SBG, and on and on. I like them all – or rather, I like most of most of them, and parts of all of them. But inevitably something doesn’t quite fit – whether with my subject, my teaching style, or the requirements of our Ontario curriculum, there always seems to be something.

But recently, while perusing SBG (Standards Based Grading) resources again, I re-read this post by Kelly O’Shea. This time, something clicked, and I realized how I could mesh SBG with the Ontario ministry requirements for assessment and evaluation, layer the content in a meaningful way, and have it all make sense. And it all works with how I like to do things, which is probably the most important thing.

So here’s what I’m doing:

I started by going through the list of ministry expectations for the course, and then through all of my tests and assignments, and figured out exactly what it is I want my students to know. The list came out at 82 things, which were further subdivided into categories of Knowledge, Inquiry, Communication and Application (it’s an Ontario thing…). I also identified which standards involved core knowledge and skills, and which were more advanced.

Every standard is graded on a 0–3 proficiency scale, and all standards are effectively weighted equally. The core skills, such as “I can draw and interpret d/t and v/t graphs in uniform motion” and “I can identify/determine whether forces are balanced”, will earn students a score up to B+ (we don’t officially have letter grades here, we have number levels, but they correlate: 1 is a D, 2 a C, 3 a B, 4 an A. You get the idea). Advanced skills add on top, bringing the mark up into A territory. Which means, technically, a student could get a B+ in the course without ever attempting an advanced skill (but hey, if they are ninjas with the core skills, why not?). I have a few additional rules – mostly to force conversations if a student earns a 0 or 1 on a core standard – but you probably get the gist.
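To make that concrete, here is a loose sketch in Python of how core and advanced proficiency might roll up into a final level. The specific arithmetic (averaging, and the size of the advanced bonus) is my own illustration for this post – it is not a literal transcription of my gradebook:

```python
# Hypothetical illustration of the scheme described above: core
# standards (0-3 each) earn up to a level 3 ("B+"), and advanced
# standards add on top, into level 4 ("A") territory. The exact
# arithmetic here is an assumption, not the real conversion.

LEVEL_TO_LETTER = {1: "D", 2: "C", 3: "B", 4: "A"}

def course_level(core_scores, advanced_scores):
    core_avg = sum(core_scores) / len(core_scores)       # 0..3
    level = core_avg                                     # caps at 3 ("B+")
    if advanced_scores:
        adv_avg = sum(advanced_scores) / len(advanced_scores)
        level += adv_avg / 3.0                           # up to one extra level
    return min(level, 4.0)

# A student who masters every core standard but attempts nothing
# advanced still earns a level 3 (roughly a B+).
course_level([3, 3, 3, 3], [])  # → 3.0
```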

On any given assessment, I will typically have three or so questions for each standard (sometimes multiple standards per question), and will generate an aggregate grade of 0–3 (whole numbers only) for each standard based on the results. The only way to get a 3 is to get 3’s on all questions addressing that standard; two 3’s and a 2 is a 2 (since they have not fully mastered that standard). Errors on things not addressed by a standard in a question are given feedback, but not penalized. There are no overall grades for tests and assignments – only grades on standards.
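In code terms, I read my own rule as taking the minimum of the question scores for a standard (the post above only pins down the rule for earning a 3; treating the whole aggregation as a minimum is my simplification here):

```python
def standard_score(question_scores):
    """Aggregate the 0-3 scores earned on all questions that
    address one standard. Since the only way to earn a 3 is to
    score 3 on every question, the aggregate is the minimum."""
    return min(question_scores)

standard_score([3, 3, 2])  # → 2: two 3's and a 2 is a 2
standard_score([3, 3, 3])  # → 3: full marks across the board
```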

Students will have regular opportunities to be re-assessed on standards.

I have only been using this method of assessment for a month now, and I have already noticed many advantages. Because all standards are weighted equally, it forces me to create assessments that cover a balance of topics, as well as a balance of core and advanced questions. Students and I know exactly where their strengths and weaknesses lie, and they can ask for specific assistance in order to achieve proficiency. And, frankly, as I start working on my first set of reports, it is ridiculously easy – at a glance I can see a student’s progress through each standard.

I have to say, so far so good!

Random thoughts on things to implement in my class

There are lots of strategies I would like to try in my classroom, but I’m not always sure how they would work. But here are a few of the ideas I have been tossing around, in no particular order:

  • Make MUCH more use of google tools – I picked up a lot of great ideas at the GAFE summit in April, and I’m dying to put them to practical use. Pages, shared resources, research tools built in, no losing documents.
  • 20% time – based on the Google model, where employees spend 20% of their time on a project of their choice.
  • On-the-fly response forms – using a generic response form and creating questions each day to go with it, and/or using it as an exit ticket
  • more portfolio, journaling, less testing – build emphasis on ongoing learning, break the dependency on cramming and memorization
  • “tests” as formative – Despite making practice tests available, I find students rarely make good use of them, and then doing poorly on a test comes as a complete surprise. I have considered giving tests, just as they are, as a means of providing feedback on what students still need to know before they complete their work on the unit – whatever that might be.
  • “streamed” course for layering/differentiation – allow students more choice in how they complete each unit. Offer perhaps three “pathways” through a unit, from traditional reading/lecture/worksheet, to grad-school-like fully independent research, with a kind of hybrid/PBL in between.
  • change the way I assess. I need to a) make students more independent and responsible for their own learning, b) make it more meaningful, and c) make it less onerous for me.
  • flipped classroom/blended learning – get more videos up, migrate my notes online, build the course in google sites as a sort of online textbook, complete with embedded docs for students to contribute like a wiki
  • Project/inquiry based learning. I really like the concept of the modelling method. The problem is that much of the material in grades 9 and 10 is purely factual, which leaves little room for inquiry.
  • introduce students to formal logic early. Hey, it’s science. Causation vs correlation is something science students really need to know.
  • make simple interactives. Flash, Construct 2, whatever. But something that can be embedded.
  • 3 before me – help to emphasize that I may be AN expert but not THE expert, and help break their dependence on me as the sole source of knowledge. They have to consult three other sources (classmates, textbook, internet, for example) before they ask me.
  • provide a road map of the course that students must fill out as they go, with links to their work – students often ask what we did last class, or what we are doing next class. If I provide them with a syllabus/sequence on Google Drive, they can make a copy and turn each heading into a link to their own work as we go along.
  • change the way I assess – Definitely.
  • “do I get it” self-assessment checkpoints
  • Incorporate Karplus learning cycle – important, particularly in science, but tricky to make relevant when the information is predominantly fact-based.
  • have students measure and graph everything they possibly can – It’s science. Measuring and graphing are what we do.
  • Maker Spaces – I love the idea of a maker space classroom. Making something is an incredible exercise in problem solving in the real world, and students don’t get nearly enough of it.
  • Change the way I assess. ‘Nuff said.

I don’t yet know how I can implement any of this properly, and implementing all of it is nigh impossible. But I know I have to make changes, and starting with a list of possibilities seems like as good a place as any.


Look beyond the rhetoric

I am sure by now most educators reading this blog will have seen this cartoon:

I don’t actually know the origin of this frame, but it has been passed around a lot. The message is clear: one-size-fits-all testing is flawed, and differentiation is important. While presumably intended as a critique of standardized testing, it has been spread wider, with resulting insinuations about classroom instruction as well. Today this cartoon was tagged onto the end of an email sent out to the staff, as I’m sure it has many times in many schools around the world. A colleague of mine – a veteran teacher who has a knack for waving aside smoke and mirrors – sent a simple reply:

Perhaps they should not have all been registered in the course on tree climbing to begin with.

That one statement opens up a slew of issues, all of which require their own conversation around fairness, equality, homogenization vs streaming vs differentiation, the difference between primary and secondary education, who bears responsibility for ensuring students are in an appropriate program, and even the use of rhetorical devices in complex discussions of education as a whole.

I will not be elaborating further on this topic here – I have too much work to do. But feel free to talk amongst yourselves.

The answer is not “D”

I am waging a war (well, battle. Okay, skirmish) against the notion that the purpose of education is to get an answer on the paper. Nowhere is this more evident than when a student tells me the answer to a question is “D”. In a matching or labelling or multiple choice question, “D” gives no useful information. It doesn’t answer the question, unless the question is what letter comes after “C”.

The Sorcerer’s Apprentice, or Never Use a Formula You Don’t Understand

In my grade 10 Science class I recently gave my students an introductory microscope lab, and in my haste I used a “canned” lab from a textbook. Although there are some good activities in this lab, students are presented with a number of equations for determining FOV and magnification, including:

These equations, at face value, are straightforward – in other words, students can plug in the numbers and get an answer. But there is something subtly insidious about them – they are just confusing enough that students are unable to apply these formulas correctly later. Why? Because they are overly scripted, making the calculations look more complicated than they are, and implying that without the formulas, students would not be able to achieve the “correct” answer. They build a reliance on formulas rather than concepts – and using formulas without knowing what they mean can lead to trouble, much like poor Mickey’s spell in The Sorcerer’s Apprentice.*

So after an abysmal assessment (which was in part a setup – I could see they were becoming formula dependent), I gave them the following question:

Both images represent the view through the same microscope, with exactly the same settings. How big is the object in the second image?

Their first question? “Which formula do we use?”

My response was a shrug.

I watched as they struggled – one or two figured it out pretty quickly, but others tried dividing the object width (~12mm) by 7 (and some by 8!), some multiplied by 7, some divided by 40 (the circle diameter), but it was clear they were searching for a magic formula. Some, after scowling for a good long time, finally asked “which units do we use? Millimeters or UM’s?” (Aaaagh! That’s not a U! That’s a µ!)

It was challenging to subtly hint at how to simply measure the object without “giving” them the answer, because I didn’t want them to revert to the mindset of me, the teacher, as the sole gatekeeper of knowledge. Eventually they worked it out. Some estimated, some marked off the length of the object on a pencil or sheet of paper and held it to the millimeter scale, and the cleverer ones borrowed a friend’s sheet and held them together in front of a light. (And those that just used someone else’s numbers, well, I had multiple versions of the sheet, so they invariably had to redo it anyway!)

The next question was a bit more involved. I said the view in the image above was through a microscope with a 10x ocular and a 2x objective. I then asked what the FOV would be using a 20x objective. Despite my earlier warning to steer clear of equations for this exercise, I saw many pulling out the equations from the previous lab. And that’s where they really got into trouble…

Numbers were thrown willy-nilly into the equations in the hope that somehow they were correct. Several students, despite correctly identifying the magnifications as 20x and 200x, wrote out

40 / 200 = 7mm / x

When I asked where the 40 came from, they said “low power on a microscope is 40x”.

“All microscopes?” I asked. That threw them.

Eventually I helped them work out that the higher magnification was ten times the lower magnification, so the view would be zoomed in ten times as close. The FOV should then be 10x as small (which is in itself a tricky concept – students are tempted to say 10 times the magnification means bigger, so the FOV is 10 times bigger). For most it eventually clicked that 10x the magnification means the field diameter is 10x smaller. Simple, and no formulas to memorize.
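The arithmetic, written out with the numbers from this exercise (10x ocular, 2x and 20x objectives, and a 7 mm field at low power), comes to this – just proportional reasoning, no formula sheet:

```python
ocular = 10            # eyepiece magnification
low_objective = 2      # objective used for the measured image
high_objective = 20    # objective asked about in the question

low_mag = ocular * low_objective     # 20x total magnification
high_mag = ocular * high_objective   # 200x total magnification

fov_low_mm = 7.0  # field diameter measured at low power, in mm

# Ten times the magnification means a field diameter ten times
# smaller, so the high-power FOV is 7 mm / 10 = 0.7 mm:
fov_high_mm = fov_low_mm * low_mag / high_mag  # → 0.7
```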

It was remarkable, in a way, that a simple set of four of these questions took them a full 80-minute period – but that was mainly because I wouldn’t let them get away with wrong answers. One could call it a waste of a period, but I would not. It was absolutely necessary.

This is exactly the kind of thing Eric Mazur talks about. I will definitely be doing more of these exercises in the future!


*I mean the Fantasia version. Though that scene is included in the recent Nicolas Cage film.

Socrative: web based response system for the classroom

I like the idea of “clickers”, when used judiciously, as a means of quickly checking rates of comprehension of a topic in a non-threatening (i.e. anonymous) way. But there are hardware requirements – both the clickers and the receiver – and with some systems the questions have to be established ahead of time, which doesn’t always work in a dynamic classroom where the focus shifts to meet the students’ needs (as opposed to the teacher’s agenda). So I had been looking for an online alternative to clickers for a while, even resorting to a Google Docs form that I had to reset after each question. I guess I was looking for something I could use to gauge understanding of any question, quickly, whenever I wanted. Not too much to ask, right?

Well, I recently discovered Socrative, a multi-platform web-based response system for classroom use, and it seems to meet all of my needs and then some. I would like to share my initial impressions after using it in a few of my classes.

Probably the best feature of Socrative is its simplicity. As a teacher, you connect to http://t.socrative.com, and register or sign in. When you register you designate a room number – this can include letters and numbers, so you can use a school name or your name as well as a room number. Once signed in, the screen looks like this:

Students connect using http://m.socrative.com/. No login is required, they just need your room number. They can log in using a computer or mobile device. Their screen looks like this:

Once students are connected, you can ask a question (T/F, MC, or short answer) – shout it out, write it on the board, pull it up on a PowerPoint – and simply click the question type on your screen. The answer options appear to the students, and as they respond the results show up on your screen. This can be used for pre-planned understanding checks, or spur-of-the-moment queries or polls.

You can also create and save quizzes, then activate them when you want. Quizzes can be automatic or teacher-paced, or they can be done as teams with the result showing up as a “space race”. There is even a selection for using Socrative as an exit ticket, using your own questions or the built-in ones. For the quizzes and exit tickets, on completion the results are emailed to the teacher directly as a colour-coded spreadsheet for later analysis.

As with just about everything, moderation seems to be the key. Using it judiciously seems to refocus and engage the students, while excessive use tends to be tiring. It is simple and fairly foolproof, works through our school firewall (not all web 2.0 sites do!), and works on any web-enabled device. Feedback from students is positive. One student in particular, who has a great deal of trouble remaining engaged in Science, reported that he was completely “sucked in” by it and found himself engaged almost despite himself.

Socrative is in beta at the moment, and all features are free. When it goes to full release, they report there will always be a free version, but advanced features – such as uploading quizzes as a spreadsheet – will require a subscription.


A Foray into “non-traditional instruction”

This year, with the Ecology unit in my Grade 9 Science classes, we focused heavily on invasive species. So after seven weeks of class (we have each class every other day), I think I gave a total of 4 traditional lessons. Instead, we researched invasive species in Ontario, hiked into the park adjacent to the school to locate, identify, and map out the extent of invasive plants such as buckthorn, dog-strangling vine, Norway maple and European reed. We did further research on why these things are a problem, and then (with, I’ll admit, just a bit of prompting) the students discovered that local garden centres are selling several plants that are on the official Ontario invasive species list.

So we decided to do something about it. The students have been writing letters, then peer-editing, and compiling and synthesizing the best bits into group letters, which I then went over with them in a serious way to ensure the message was clear, the tone was appropriate, and the information was factual. During this process several students asked, “We’re not really going to mail these, are we?” To which I replied that there is no point in writing them if we aren’t going to mail them. Knowing these were now “real” letters, and not just mock letters for my benefit, got most of them working to make sure they were of high quality.

They are composing letters to the city councillor, the mayor, the parks department, the provincial Ministry of Natural Resources, the Minister of Natural Resources, the premier, the local MPP, the local federal MP, the federal Ministry of the Environment, and the CEOs of the major garden centres. If we don’t get at least a couple of responses, I will be disappointed.

We have spent as much (if not more) time talking about the importance of a well-structured argument, the tone of a letter, conciseness, how to edit, and how to find out who to send letters to as we have on community interactions and nutrient cycles. And yet, this holistic approach hit as many of the curriculum expectations – though not as explicitly – as a series of lectures would.

For a unit assessment, I decided to go with two things: a poster where they can “brag” about the action they have taken to help with invasive species, and a portfolio of sorts. This is what I have asked of them to demonstrate what they have learned:

The portfolio must include an item (either work or notes you have produced, or items that you have found) and an explanation of how that item can be used as evidence for each of the following open-ended questions. The explanation for each should be at least a few paragraphs – you are, after all, trying to convince me that you learned something:

  1. Something you learned that you found interesting or surprising
  2. Something you were particularly proud of learning, producing, or creating, or something you found particularly challenging
  3. What you learned about the dynamic nature of ecosystems
  4. Something you learned about the impact of human activities on the sustainability of ecosystems

I don’t quite know how I will evaluate this yet – I think I need to have the students help me with that. We’ll see how it goes.

It was kind of strange teaching like this, I’ll have to admit, but rewarding for the students (well, the ones who have taken it seriously) and me. I liked doing something real, and having the kids see for themselves the extent of invasiveness. But the sad thing is now we will be switching topics, and I have no idea how I can do something like this for the basics of atomic structure and the periodic table.


Baby steps to SBG

The other day I posted about my frustration with my 12 Physics class. One of the approaches I decided to take was a standards-based format – or, at least, my interpretation of it. I re-read all the ministry curriculum expectations and mapped them to specific concepts. For each concept I listed all available resources, and which sample questions from the text were appropriately representative of what I expected of them. Then I gave the students the package and a checklist.

For each item, I have a few randomized questions that they must complete to demonstrate mastery. I gave students the option to work ahead, and made “lessons” (ie teacher instruction) optional.

The results? So far, tentatively positive. Many of them get the idea (but I suppose they weren’t the ones I was worried about), though some think it’s okay to just jump in and try a problem set without doing the legwork first. So in some respects completing the problem sets has become the focus, not the learning required to do so – but at least for the most part they are actually doing something constructive, rather than nothing at all, so it’s a start.

The exercise is encouraging enough that I may try something similar with the remaining units in my 9 Science and 11 Physics classes.