by John Beisner
Okay, before we begin the review of this curriculum, let me state the obvious: data and modeling is somewhat less than incredibly fun. I’d put it about two ticks below feral cat bathing on the ol’ fun-o-meter. But it is no less obvious (albeit less fun to observe) that data and modeling is important. It’s at least 15 ticks above cat bathing, and given that this curriculum was created in-house to help prepare our students to succeed on the CAASPP, and given that student performance on that mandatory exam has real implications for our success as an organization, I’d say it’s well worth an in-depth look.
So look: on the intranet under “ISP Academic Curriculum” you’ll find “CAASPP Prep” listed under “Academic Development” in the center column (towards the bottom). Clicking on that will lead you into a magical world of new curriculum designed to help our students boost their test scores. Have you ever been into this enchanted corner of the intranet? Few have, and yet behold the riches that await intrepid instructors! Come with me, bold persons, and click on the link called “here” (for the Data and Modeling page).
There they are! Three glittering new packets. Are they for Math credit? Science? Advanced English? What if I were to tell you that they were potentially available for all of the above? Feel free to gasp. I’m gasping so much I may just have the hiccups.
But wait, can you hear that voice echoing in the shadows? What’s it saying? Listen closely! It says
“It is not necessary for students to complete the units in any particular order, but please note that the third unit has a significantly higher GE than the first two units. Students will earn credit in general math and physical science for all three units, and they will also earn advanced English credit for the third unit.” Thanks, kind wizard! Now it’s clear: Units 1 and 2 require a TABE of 5.0, while Unit 3 requires a TABE of 8.0. Though I can hardly wait to review them all, diligent application of the scientific technique known as “eenie meenie miney mo” has clearly indicated that packet two, “Introduction to Experimental Design” will be our point of entry.
First thing to note: there are separate Form 1’s for the Math and Science credits. Since we don’t use Form 1’s anymore (!) it seems these documents are of little use. This is a shame, since they’re particularly beautiful Form 1’s and contain some vital information: this packet is worth HALF a credit in Science and half a credit in General Math. It might be important to tell a student up-front that, while they can receive credit in two subject areas, this is NOT a two-credit packet.
Another thing to note: at present, no teacher guide or answer key is available on the intranet. Would this be enough to keep some teachers from assigning it? Maybe so. But is the packet so difficult and dense that it would take an instructor an exorbitant amount of time to grade? I think not, but let’s look.
Page 1 has a nice picture. That’s actually an important attribute since we (teachers, students and humans in general) really do tend to judge books by their covers. (This is not an aphorism. By “books” I literally mean books.) This looks like a packet that students might want to open.
So let’s open it. I won’t go page by page. Rather, let’s turn to our trusty Curriculum Assessment Rubric. Is the packet…
1) Is this packet Relevant?
In the sense that it’s preparing our students to take and do well on the CAASPP, it’s arguably one of the most relevant packets we can offer. Designing experiments is also essential to the scientific process, so developing an understanding of how science works could go a long way toward helping students understand science curriculum later on, or even toward critically analyzing science-related discourse they might encounter in their day-to-day lives. So in that sense too, it’s relevant. But this is all teacher talk. Is the packet relevant in a way that students will recognize? As in, is it relevant the way the DMV packet is relevant? Is it practical and engaging because students will be able to immediately implement its lessons in their lives? The answer is probably definitely “no.” This isn’t necessarily a knock against the packet, though, since one could say the same thing about nearly all our packets, or high school curriculum in general. Still, this lack of relevance does have the potential to diminish student enthusiasm. So it goes.
2) Is it Engaging?
There are some minor things that could help the packet grab the reader. Some of the formatting and layout can seem a bit busy and potentially intimidating. Key vocabulary could be bolded, for example. While some words such as “randomized” are treated as new vocabulary, other words like “causation” and even “manipulate” are not. These might constitute intimidating or discouraging vocabulary on a GE 5.0 packet. Though students are told “if you are still unsure about any of the words above, consult a dictionary or ask your instructor,” this is something that students often find uncomfortable or tedious. I think it’s incumbent on the curriculum itself to make these words accessible, especially for lower-level packets.
Also, we’re told that, in a good survey question, “the researcher's opinion shouldn’t be included.” This echoes the opening anticipatory set, wherein students are asked to compare and contrast subjective versus objective statements, yet the connection isn’t made explicit in this instance. Is an opinion by definition subjective? That seems too epistemologically complex for page 13. Perhaps an exercise in making leading questions neutral, or neutral questions leading, would help illustrate the essential point about good survey questions. I think this criticism could be applied to the packet in general: asking students to develop and demonstrate their understanding of what makes for “good” or “bad” practices regarding discrete aspects of experimental design might be more engaging, challenging, and illustrative than asking students larger, summary-style questions such as “what is a research question” or “identify two characteristics [of a designed experimental study].” These are little things that might make the packet a bit more engaging.
3) Is it Accessible and Learner-Oriented?
The above observations could apply to Accessibility as well. Now let’s look at whether it’s Learner-Oriented. In section 3 on page 31, we’re finally invited to look at an example experiment and criticize it using the vocabulary and criteria we’ve accumulated so far. The sample experiment is followed by several pages of sample data. The learner is then asked to evaluate the experiment and the data in a series of questions. Yet missing from the final questions are those larger, more challenging questions that invite the learner to think critically and become cognizant of their own understanding or lack thereof. While those questions seemed a little too much to swallow earlier in the packet, at this later stage they could constitute a critical, summative step before the final task in the packet. That task is where students are asked to design their own experiments from start to finish, much as they would be asked to do on the CAASPP. Perhaps a little more guided practice would help the student move on to this big final task with a little more confidence. As it is, the packet seems oriented slightly more towards developing students’ skills, rather than prioritizing their learning experience. What does that mean? I guess we’ll have to ask Dr. Geigenhopfer.
4) Is the packet Informative?
I’d say yes. It contains information and it shares it. It could contain more information, but then it’d be a longer packet. On the other hand, it could contain less information, but then it wouldn’t be as informative. So I’d say it’s struck just about the perfect balance in order to become the packet that it is. If it had done otherwise, it would be different. Some might say that this has already happened in a parallel universe, and though there’s abundant evidence to suggest that this is factually accurate according to particle physics and the tenets of string theory, I say that’s a bunch of poppycock. Leave it to alternative bloggers to critique the alternative packets. I’m critiquing this one, and I say it has exactly as much information as it does and no more. So there.
Okay, that’s it! So is the packet good? I can definitely say that it isn’t not good. Is it REALI good? That’s harder to say precisely, since we sort of conflated the A and the L. So let’s just say it’s TRULI good, and you, dear reader, can decide what the letters of that acronym stand for. At the very least, we can conclude that it’s worth a look. Find it on the intranet. Give it to a student and work on it together! See what they think and share their feedback with the rest of us in the comments section of this article! Or just check it out for yourself and tell me all the ways that I’ve got it wrong/right. Right? Thanks!
In the meantime, keep up the goodness!