I just love the word 'obfuscate'. It means (in my words) to take something that is perfectly clear, and render it incomprehensible. As in "Using the word 'obfuscate' in a sentence will obfuscate its meaning".
I say this because I've just been reading an article on which (clearly) a statistician has been let loose – thus rendering an otherwise wonderful article incomprehensible – at least in places. Now, I'm not saying that the statistical analyses aren't necessary, rather that in my opinion the article would be much more readable if most of the statistics were presented neatly in an appendix, rather than being liberally splatted across the second half of the article. I mean, seriously, how many of us actually know what the "Kaiser-Meyer-Olkin measure of sampling adequacy" involves, what "Bartlett's test of sphericity" is (something FIFA use to assess the roundness of a football, maybe?), or what the "Mahalanobis distance statistic" measures? Or am I just dumb in this regard?
That aside, I think the article is a little gem (though it's not often such a blatant apostrophe error finds its way into an article title, especially in a pedagogy journal):
C.D. Smith, K. Worsfold, L. Davies, R. Fisher & R. McPhail (2013). Assessment literacy and student learning: the case for explicitly developing students [sic] 'assessment literacy'. Assessment & Evaluation in Higher Education, 38(1), 44-60. http://dx.doi.org/10.1080/02602938.2011.598636
Here the authors talk about the need to educate students in what assessments are for and how to interpret them. The over-arching message is clear (that is, unobfuscated). With reference to Francis (2008)*, they say:
…first-year students in particular are likely to over-rate their understanding of the assessment process and … there is a disjuncture between what they think they are being assessed on and what the marking criteria and achievement standards require of them
The authors go on to describe a simple intervention – a workshop in which students discuss, with their peers, examples of submitted work and how those examples shape up against the marking criteria. Such an intervention results, I believe (the paper is rather obfuscated here), in a good improvement in the quality of students' submissions in a similar assignment. In particular, two areas show marked improvement.
First, students develop the ability to judge for themselves what makes a good response to an assignment. By implication, then, it means they develop the ability to judge the quality of their own work. That is a skill required by any professional. Imagine you have an electrician do some work in your house and she's unable to say for herself whether she's done a good job of it. A frightening prospect!
Secondly, students develop the idea of 'assessment for learning' (as opposed to assessment of learning); that is, they can see that they are able to learn while doing the assignment. Moreover, they begin to grasp that assignments can be set with the very purpose of developing student learning, as opposed to simply providing a summative measure – in other words, their lecturers are using the assessment process in a carefully considered manner with the primary purpose of achieving student learning.
Also increased, though not by as much, was student understanding of the actual assessment at hand, and their desire to put effort into the assessment.
All this has me thinking about what we commonly ask on physics assignments, tests and exams, and whether the students really know what we mean. The answer, I am sure, is 'no'. We use words and phrases such as "show that…", "evaluate…", "discuss…" and "from first principles…" These might be clear to experienced physicists, but I suspect that, to students, they are thoroughly obfuscating.
Time for a bit of research.
*R. A. Francis (2008). An investigation into the receptivity of undergraduate students to assessment empowerment. Assessment & Evaluation in Higher Education 34(4), 481-489.
P.S. Thank you to Dorothy Spiller of our Teaching Development Unit for drawing my attention to the Smith article.