Category Archives: HDYMT

The straw that fixed the camel’s back – Moving to SBG

I am always on the lookout for ways to improve my courses. Recent(ish) innovations include flipped learning, layered curriculum, modelling, SBG, and on and on. I like them all – or rather, I like most of most of them, and parts of all of them. But inevitably something doesn’t quite fit, whether with my subject, my teaching style, or the requirements of our Ontario curriculum – there always seems to be something.

But recently, while once again perusing resources on SBG (Standards Based Grading), I re-read this post by Kelly O’Shea. This time, something clicked, and I realized how I could mesh SBG with the Ontario ministry requirements for assessment and evaluation, layer the content in a meaningful way, and have it all make sense. And it all works with how I like to do things, which is probably the most important thing.

So here’s what I’m doing:

I started by going through the list of ministry expectations for the course, and then through all of my tests and assignments, and figured out exactly what it is I want my students to know. The list came out at 82 things, which were further subdivided into categories of Knowledge, Inquiry, Communication and Application (it’s an Ontario thing…). I also identified which standards involved core knowledge and skills, and which were more advanced.

Every standard is graded on a 0-3 proficiency scale, and all standards are effectively weighted equally. The core skills, such as “I can draw and interpret d/t and v/t graphs in uniform motion” and “I can identify/determine whether forces are balanced”, will earn students a score up to B+ (we don’t officially have letter grades here, we have number levels, but they correlate: 1 is a D, 2 a C, 3 a B, 4 an A. You get the idea). Advanced skills add on top, bringing the mark up into A territory. Which means, technically, a student could get a B+ in the course without ever even attempting an advanced skill (but hey, if they are ninjas with the core skills, why not?). I have a few additional rules – mostly to force conversations if a student earns a 0 or 1 on a core standard – but you probably get the gist.
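Roughly, the banding works like this. Here is a hypothetical sketch in code – the 78/22 split and the formula itself are made up for illustration; the real marks involve professional judgment, not arithmetic:

```python
# Hypothetical sketch of the grading bands: full mastery of the core
# standards earns up to ~B+ (78% here), and advanced standards add
# the rest. The thresholds are invented for illustration only.

def course_mark(core_avg, adv_avg):
    """core_avg, adv_avg: mean proficiency (0-3) across each group of standards."""
    base = 78 * core_avg / 3    # core mastery alone tops out around B+
    bonus = 22 * adv_avg / 3    # advanced work pushes the mark into A territory
    return round(base + bonus)

print(course_mark(core_avg=3.0, adv_avg=0.0))  # 78 - a B+ from core skills alone
print(course_mark(core_avg=3.0, adv_avg=3.0))  # 100
```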

On any given assessment, I will typically have three or so questions for each standard (sometimes multiple standards per question), and will generate an aggregate grade of 0-3 (whole numbers only) for each standard based on the results. The only way to get a 3 is to get 3’s on all questions addressing that standard. Two 3’s and a 2 is a 2 (since they have not fully mastered that standard). Errors on things that are not addressed by a standard in a question are given feedback, but not penalized. There are no overall grades for tests and assignments, only grades for individual standards.
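That aggregation rule amounts to taking the weakest result. A minimal sketch of it in code (the minimum matches the examples above, though in practice there is sometimes a judgment call):

```python
# Aggregate a standard's 0-3 grade from the per-question scores.
# A 3 requires a 3 on every question addressing the standard, so the
# aggregate is capped by the weakest result - i.e., the minimum.

def aggregate(scores):
    """Whole-number 0-3 grade for one standard, from per-question scores."""
    if not scores:
        raise ValueError("no questions assessed this standard")
    return min(scores)

print(aggregate([3, 3, 2]))  # 2 - not yet fully mastered
print(aggregate([3, 3, 3]))  # 3
```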

Students will have regular opportunities to be re-assessed on standards.

I have only been using this method of assessment for a month now, and I have already noticed many advantages. Because all standards are weighted equally, it forces me to create assessments that cover a balance of topics, as well as a balance of core and advanced level questions. Students and I know exactly where their strengths and weaknesses lie, and they can ask for specific assistance in order to achieve proficiency. And, frankly, as I start working on my first set of reports, it is ridiculously easy, as at a glance I can see a student’s progress through each standard.

I have to say, so far so good!

A little Google docs win – almost

Today I played the Deer Game with my grade nines to illustrate population dynamics, limiting factors and carrying capacity. We usually iterate a dozen or so times, counting and recording the deer population on each turn, and then enter the results on a spreadsheet and graph it when we get back to the classroom.
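The curve the game produces looks roughly like discrete logistic growth. Here is a quick sketch, purely for comparison – the numbers are made up, and this is not the game’s actual rules:

```python
# Discrete logistic growth: a rough, made-up model of the boom-and-bust
# pattern around carrying capacity that the Deer Game produces.

def simulate(pop, capacity, growth, rounds):
    """Return the population at each round of a discrete logistic model."""
    history = [pop]
    for _ in range(rounds):
        pop = max(0, round(pop + growth * pop * (1 - pop / capacity)))
        history.append(pop)
    return history

history = simulate(pop=4, capacity=20, growth=1.8, rounds=12)
print(history)  # rises quickly, overshoots, then oscillates around 20
```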

Well, this year I did something a little different. I created a Google sheet, and created a chart based on a set of (presently empty) cells. I left this display up in the classroom when we went outside. During the game, I recorded the results each round on my phone, in the same spreadsheet. So when we wrapped up and went back to class the graph was already waiting for us on the screen. As a bonus, we ran this activity with two classes combined, and both classes could see the same data.

This was a great little timesaver, and I would rave about it, except for one little thing – scatter plots. The scatter plots in Google Sheets do not show connecting lines or curves, just the scatter points. For this activity a line graph is sufficient, because we are iterating at equal intervals, but uncovering a glaring hole in the capabilities of Google Sheets slightly tarnished my esteem for this set of tools.

How do you science?

Yesterday in class one of my students asked “am I sciencing right?”

After giving him a high five for cleverness, I got to thinking more seriously about that question. In courses like Drama, Art, Music, and Physical Education the students spend a minimal amount of time learning theory, and maximal time practising it. Doing it.

In Science, at least in the introductory courses, there seems to be so much emphasis on basic facts that there is little time left to do science – or, as my student would say, to science. Now, admittedly, there are several centuries’ worth of background to what goes on in our daily lives. Even up through much of my undergrad studies my courses were pumping me full of background knowledge, with little emphasis on doing science. It wasn’t until my undergraduate thesis and grad school that I got to actually science.

Science is often touted as a subject that requires inquiry, but most science courses actually don’t. Researching facts is an important aspect of science, but it is the preliminary legwork before the actual science begins. When we actually science, we are actively investigating and experimenting, troubleshooting, problem solving, analyzing, and synthesizing. Using our brains, designing, building, testing.

I realize there are some programs, such as the modelling method, that allow (or require!) students to science in high school, but availability and training for these types of programs is not (yet) widespread. I think it’s time to spread the science. If you implement a course, particularly an introductory science course at any level, that is built on students actively doing science, I would like to hear from you. How do you science?

 

Check Out Project Noah!

 

I’m not sure why it took me so long to discover Project Noah. It is a citizen science community that, in their words, is intended to be “a fun, location-based mobile application to encourage people to reconnect with nature and document local wildlife”. Essentially, you take pictures of animals and plants and upload them to the Project Noah website. But it is built around smartphones: with the app installed on your iPhone or Android, you can snap anything interesting (or mundane, too) and upload it. You have the option of identifying what you have uploaded, or requesting identification. Location information can optionally be attached to help learn more about geographic distribution. There is also a social network for chats and discussion, and even patches for accomplishments.

The images and locations are searchable online, so it can be used by amateurs and researchers alike, and as they say their “ultimate goal is to build the go-to platform for documenting all the world’s organisms and through doing this we hope to develop an effective way to measure Mother Nature’s pulse.”

I uploaded my first image today of a snail (me: “Oh! Gotta take a pic of this snail!” My wife: “Geek”), and I took the time to double check the identification and enter that information. I think there is potential for this to be used in the classroom in many ways – an image resource, a class project or hands-on biodiversity lesson. Having to take a few minutes to identify and classify what has been found is an extra layer of analysis and engagement which requires a bit of patience, but pays off.

Of course, now that I have my first upload, I’m hooked. And, as with my snail picture, I expect to be called a geek a lot more often…

 

The Sorcerer’s Apprentice, or Never Use a Formula You Don’t Understand

In my grade 10 Science class I recently gave my students an introductory microscope lab, and in my haste I used a “canned” lab from a textbook. Although there are some good activities in this lab, students are presented with a number of equations for determining FOV and magnification, including:

These equations, at face value, are straightforward – in other words, students can plug in the numbers and get an answer. But there is something subtly insidious about them – they are just confusing enough that students are unable to apply these formulas correctly later. Why? Because they are overly scripted, making the calculations look more complicated than they are, implying that without the formulas, students would not be able to achieve the “correct” answer. They build a reliance on formulas rather than concepts – and using formulas without knowing what they mean can lead to trouble, much like poor Mickey’s spell in The Sorcerer’s Apprentice.*

So after an abysmal assessment (which was in part a setup – I could see they were becoming formula dependent), I gave them the following question:

Both images represent the view through the same microscope, with exactly the same settings. How big is the object in the second image?

Their first question? “Which formula do we use?”

My response was a shrug.

I watched as they struggled – one or two figured it out pretty quickly, but others tried dividing the object width (~12mm) by 7 (and some by 8!), some multiplied by 7, some divided by 40 (the circle diameter), but it was clear they were searching for a magic formula. Some, after scowling for a good long time, finally asked “which units do we use? Millimeters or UM’s?” (Aaaagh! That’s not a U! That’s a µ!)

It was challenging to subtly hint at how to simply measure the object without “giving” them the answer, because I didn’t want them to revert to the mindset of me, the teacher, as the sole gatekeeper of knowledge. Eventually they worked it out. Some estimated, some marked off the length of the object on a pencil or sheet of paper and held it to the millimeter scale, and the cleverer ones borrowed a friend’s sheet and held them together in front of a light. (And those that just used someone else’s numbers, well, I had multiple versions of the sheet, so they invariably had to redo it anyway!)

The next question was a bit more involved. I said the view in the image above was through a microscope with 10x ocular and 2x objective. I then asked what the FOV would be using a 20x objective. Despite my earlier warning to steer clear of equations for this exercise, I saw many pulling out the equations from the previous lab. And that’s where they really got into trouble…

Numbers were thrown willy-nilly into the equations in the hope that somehow they were correct. Several students, despite correctly identifying the magnifications as 20x and 200x, wrote out

40 / 200 = 7mm / x

When I asked where the 40 came from, they said “low power on a microscope is 40x”.

“All microscopes?” I asked. That threw them.

Eventually I helped them work out that the higher magnification was ten times the lower magnification, so the view would be zoomed in ten times as close. The FOV should then be 10x as small (which is in itself a tricky concept: students are tempted to say 10 times the magnification means bigger, so the FOV is 10 times bigger). For most it eventually clicked that 10x the magnification means the field diameter is 10x smaller. Simple, and no formulas to memorize.
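In code, the whole relationship is one line. The numbers below are hypothetical (a 7 mm field at 20x), but the inverse scaling is the point:

```python
# Field of view scales inversely with total magnification:
#   new FOV = known FOV * (known magnification / new magnification)

def fov_at(fov_known_mm, mag_known, mag_new):
    """Field of view (in mm) at a new total magnification."""
    return fov_known_mm * mag_known / mag_new

# Hypothetical: a 7 mm field at 20x (10x ocular * 2x objective).
# At 200x (10x ocular * 20x objective), the field is ten times smaller:
print(fov_at(7, 20, 200))  # 0.7 (mm)
```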

It was remarkable, in a way, that a simple set of four of these questions took them a full 80 minute period – but that was mainly because I wouldn’t let them get away with wrong answers. One could call it a waste of a period, but I would not. It was absolutely necessary.

This is exactly the kind of thing Eric Mazur talks about. I will definitely be doing more of these exercises in the future!


*I mean the Fantasia version. Though that scene is included in the recent Nicolas Cage film.

Mitosis lab as the basis for a discussion of error

We were doing a lab in Biology the other day. AP Biology teachers will be familiar with it, but the basic idea is that students count the number of cells of each phase in the meristem of an onion root tip using a microscope. By using the count of each phase, they can estimate the percentage of time each phase takes. At 400x, there are typically about 250 or so cells in the field of view, and the phase of each cell is not always certain, so it is not a trivial task. With care and patience, however, results can be pretty good. My students were quite confident with their counts. And, in truth, their combined data provided values that were not at all bad.
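The arithmetic behind the estimate is simple: the fraction of cells caught in a given phase approximates the fraction of the cell cycle spent in that phase. A sketch with made-up counts:

```python
# Estimate the share of the cell cycle spent in each phase from a
# snapshot count of cells. The counts below are invented for illustration.

counts = {"interphase": 220, "prophase": 18, "metaphase": 6,
          "anaphase": 3, "telophase": 5}
total = sum(counts.values())

percentages = {phase: 100 * n / total for phase, n in counts.items()}
for phase, pct in percentages.items():
    print(f"{phase}: {pct:.1f}% of the cycle")
```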

But here’s the interesting part. I also gave them a sheet with a couple of these images (click to enlarge), and asked them to find and circle one example of each phase.

And, collectively, they did really poorly. Few of the students could correctly identify three or more phases. And yes, these are photos of the same microscope slides they were using.

I am still coming to grips with the ramifications of this discrepancy, but some of the conclusions are:

  1. Students’ confidence in their ability is not always a good measure of their ability.
  2. This mitosis activity is fairly immune to error.
  3. Sometimes, biology can be harder than it looks, and it is NOT about memorizing.
  4. Real life doesn’t look like the picture in the textbook.
  5. This is an excellent exercise to highlight and discuss points 1-4 above with the students.

One added observation – while the students were not all able to identify the phases of mitosis, they did notice instantly that they did not have the same images as the person next to them. Hmmm.

Dynamic video of the sun

Since this is TeachScience.net I think it is time to get back to the Science part, as I have been focusing on the Teaching part lately.

One of my interests is astronomy – I am an avid amateur astronomer, and one of my other websites is The Budget Astronomer. I have the opportunity to teach an Astronomy unit to my grade 9 Science class, and I like to incorporate as much real-life astronomy coolness as I can. There is lots of it, and frankly textbook diagrams just do not do justice to the universe.

The sun is the closest star to us, and one we have been studying for years, and yet it still holds all kinds of excitement and mystery. There are several satellites that do nothing but image the sun in different wavelengths, such as SOHO, STEREO, and SDO, and it is the latter that produces the data for the tool I want to talk about. The tool is called JHelioviewer, a freeware Java utility that downloads images from the SDO database and compiles them into a high-resolution video.

When started, it downloads a video from the latest set of data for the day, with frames at 30 minute intervals. You can view, zoom, play, step, and enjoy the coolness, and then you can download your own datasets, choosing dates, time range, and wavelength. This last is important, as different wavelengths emphasize different features. Prominences, for example, show up well at 304 Å, but not at all at 4500 Å. In addition, you can download multiple wavelengths and have them superimposed on each other, each with its own colour scheme (since most of these are UV wavelengths, they are all false-colour, but you can select the palette).

You can zoom in, and even track features over time, and then you can save the results as a video, like this:

Now show me a textbook diagram that can do that!

That video, by the way, is of the massive solar flare that took place on June 7 2011, so now anyone can go and make their own videos of the event in any wavelength, or even multiple wavelengths. Students could make their own videos, and track the movement of sunspots to estimate the period of rotation of the sun, and the evolution of sunspots and prominences.

Go play with it.

 

Using the Sloan SkyServer

In Grade 9 Science, we are currently doing the Astronomy unit. Today we were looking at the shapes of galaxies, and a variety of deep sky objects one can see with even a modest telescope. Instead of just rattling off a list of objects, I did a quick introduction to the different types of objects, and then divided the class up into groups and set them on a scavenger hunt using the Sloan Digital Sky Survey data (http://skyserver.sdss3.org/dr8/en/proj/basic/scavenger/). This gave students an opportunity to explore real imaging data for specific types of objects. In other words, they had to observe, and think, while hunting for really cool stuff.

The best part of an activity like this is when students find something really great, and the gasps and squeals of joy spread through the class. Like when one student found NGC4030, a bright face-on spiral:

And then another found NGC4437, a grand, edge-on spiral:

These students really felt the thrill of discovery, and that thrill can be contagious.

It was not a perfect class though. We had some technical issues that slowed us down – connection problems that prevented some students from accessing the site properly – and some students who were not engaged, despite the utter coolness of it. But that is another story for another entry; today’s post is about the awesomeness of hands-on learning with real data.

 

Getting to “I don’t know”

I like Isaac Asimov’s quote about Science:

The most exciting phrase to hear in science, the one that heralds new discoveries, is not ‘Eureka!’ (I found it!) but ‘That’s funny …’

I tell my students repeatedly that all the interesting stuff in science is the unexpected. When things don’t quite turn out how we thought, it doesn’t mean it’s wrong – just that there is more going on than the oversimplification of textbooks would lead us to believe. I love it when we encounter something odd – for me, the best thing that can happen in a Science class is when the students push the boundaries of what I can explain, so we have to learn together. I love getting to “I don’t know”.

Today, we were examining resonance in a tube open at both ends, when it resonates after being “bopped” on one end. Here is the spectrogram:

Frequency is on the vertical axis, time is horizontal, and intensity is the brightness. Hitting this tube clearly produces frequencies of 200 & 400 Hz. The fundamental frequency of a 40 cm tube open at both ends is about 400 Hz. So where does the 200 Hz sound come from?

Ah, well, hitting the tube on one end temporarily makes it closed at that end, so it should, at that moment, resonate at 200 Hz. Fine. But why does it continue to resonate at 200 Hz? The 200 Hz resonance clearly continues well past the peak of the 400 Hz resonance, and yet the tube should not be able to sustain it with both ends open. So why does that happen? I don’t know.
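For reference, the standard standing-wave formulas do give roughly the observed pair of frequencies (end corrections and temperature shift them a little):

```python
# Fundamental frequencies of a 40 cm tube:
#   open at both ends:  f = v / (2L)
#   closed at one end:  f = v / (4L)

v = 343.0  # m/s, speed of sound at roughly room temperature
L = 0.40   # m, tube length

f_open = v / (2 * L)    # ~429 Hz, near the observed 400 Hz
f_closed = v / (4 * L)  # ~214 Hz, near the observed 200 Hz
print(round(f_open), round(f_closed))  # 429 214
```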

Isn’t Science cool?

Great free tools for sound recording and analysis

With my grade 11 Physics class we are currently studying sound, and we have been using a variety of tools. Here are some of the great free tools that we have found useful:

Free Audio Editor

The title of this software pretty much says it all. It is a compact, easy to use and fairly comprehensive piece of software for recording and editing audio. Perhaps not as well known in educational circles as Audacity, but I find it slightly more intuitive to use. By capturing the full audio signal, the user can see the entire envelope of sound, or zoom in to see the actual waveform. I also use it to record voiceovers for videos or presentations. Could be used for podcasts too. The most recent version I downloaded installed a browser toolbar, but this is easily disabled if unwanted.

Free Audio Editor

 

Visual Analyzer

Developed by Alfredo Accattatis for academic purposes, and just for the love of it, VA is an oscilloscope and spectrum analyzer that uses nothing more than the sound card on your computer. It shows a live waveform on the oscilloscope, and a live frequency spectrum on the analyzer. It even has a “3d” function so you can see how the frequency changes with time. It can record short snapshots of both the waveform and spectrum for detailed analysis. A very powerful tool.

Visual Analyzer - oscilloscope on top, 3d frequency spectrum below
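For the curious, the heart of what a spectrum analyzer does can be sketched in a few lines: FFT a chunk of signal and report the dominant frequency. Here is a sketch using a synthetic tone (not VA’s actual code, of course):

```python
# The core of a spectrum analyzer: FFT a signal and find the
# dominant frequency. The signal here is a synthetic 440 Hz sine wave.
import numpy as np

rate = 44100                   # samples per second
t = np.arange(rate) / rate     # one second of sample times
signal = np.sin(2 * np.pi * 440 * t)

spectrum = np.abs(np.fft.rfft(signal))            # magnitude spectrum
freqs = np.fft.rfftfreq(len(signal), d=1 / rate)  # frequency of each bin
peak = freqs[np.argmax(spectrum)]
print(round(peak))  # 440
```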

Here’s a video showing how VA is used:

 

Raven Lite

Raven is produced by the Cornell Lab of Ornithology Bioacoustics Research Program. It is designed to record, play back, visualize and analyze sounds – whether musical notes, complex bird calls or whale songs. The Lite version is free for hobbyists and educators, though it must be “purchased” through the online store to receive an activation license. This software combines many of the features of both Free Audio Editor and Visual Analyzer, in that it records and displays the amplitude trace (waveform/oscilloscope) as well as a frequency spectrogram. This spectrogram is actually a 3-d graph, showing frequency over time, with intensity or colour representing the amplitude. This makes it more complicated for novices to interpret, but shows changes in frequency in a very visual way. It also aligns the frequency response to the waveform, so the two can be compared together.

Waveform (top) and spectrogram (bottom) of my voice.

Analysis of a Cardinal song (well, my impression of one...)

 

Since each of these fills a slightly different niche, it is not an either/or thing – I use each of these differently, and I do use all of them. And since they are free, the price is definitely right!