Bruno: What Do "Critical Thinking" Assessments Look Like?
I wrote an essay for EdSource last month arguing that California - or any state adopting the Next Generation Science Standards (NGSS) - would be wise to add specific factual knowledge to the new standards during the implementation process.
The NGSS are disappointingly lacking in scientific content, which they de-emphasize in favor of more general scientific thinking skills and "practices".
One point that's worth elaborating on is that it's not entirely clear what good science assessments look like when science standards are very vague on the details of what factual knowledge students should acquire in school.
There's an unfortunate tendency among many science educators to assume that specific knowledge isn't all that important, and that we should really be aiming to assess "higher-order" scientific thinking skills anyway.
In practice, however, that's easier said than done.
To see why, it's helpful to look at recent attempts to assess "technology and engineering literacy" by the National Assessment of Educational Progress. The purpose of the Technology and Engineering Literacy (TEL) assessment is to measure whether students are able to apply engineering principles - as opposed to specific knowledge - to real-life situations.
A few months back the TEL team released a sample assessment item meant to show how such a test might work.
If you're interested in assessment, I encourage you to check it out. While it ostensibly represents the promise of "authentic" assessment of "21st century critical thinking skills", it's not obvious - at least to me - that it's assessing much of interest.
The test item is an elaborate, interactive, animated scenario challenging you to help fix a water pump that a small village depends on for its livelihood.
Crucially, the test deliberately does not assume or require that you know much of anything about pumps or water. In a nod to the importance of such knowledge, the animation instead goes on at some length explaining how pumps work and how to diagnose the particular problem troubling this particular pump.
But because the test assumes little specific knowledge, the result for the test-taker is a lot of reading and very little thinking. The task can be successfully completed by following the on-screen directions, which is tantamount to reading a straightforward troubleshooting guide from the defective pump's user manual.
Arguably, the ability to follow simple, step-by-step directions represents a sort of "technological literacy", but once the required factual knowledge is stripped away that literacy seems a great deal less impressive.
And so it is likely to be with assessments targeting "scientific thinking". In the absence of clearly-defined, required factual knowledge, such tests are likely to seem rigorous in theory but banal in practice.
Of course, the Next Generation Science Standards are not entirely devoid of factual content. They are, however, often sufficiently vague that we should be worried about what the yet-to-be-designed tests will look like. - PB (@MrPABruno)