Higher Education Research and Development Society of Australasia
Process assessments are increasingly proposed in response to concerns about academic integrity and generative AI (GenAI) technologies. Understanding how learners learn is arguably the most important aspect of education. However, any approach to assessment, be it process, programmatic, or authentic, is challenging to implement effectively. Each of these approaches is an important part of our repertoire as we reform assessment, but requires a clear understanding of goals, contexts, learners, disciplinary requirements, level of study, and study conditions.
How can we assess the learning process?
In assessment contexts, “process” often refers to the steps taken to complete a task. For example, mathematics students might show their workings, while art students might provide sketches and reflective accounts. Learning probably happens while following these steps, and through reflecting on them, but the steps, and even the reflective accounts, are not “the learning”. And, of course, students may not have actually followed these steps, or may have followed them simply for the benefit of the process assessment. In any case, these elements of process are actually products: sanitised packages of evidence produced according to perceived expectations of assessors. Further, even with honest intentions, retrospective accounts inevitably transform complex, situated action into simplified and more coherent explanations.
To avoid this post-hoc transformation, we might directly observe learners doing tasks or solving problems. In healthcare education, for instance, Objective Structured Clinical Examinations (OSCEs) allow assessors to observe students performing clinical tasks. These are often assessed via a mark sheet containing a list of steps that should be taken. While relatively straightforward for short, procedural tasks, this approach can be problematic for more complex activities, as observation tends to be fragmented and focused on overt, performative behaviour. Observation-based assessments of process actually assess performances.
Alternatively, process assessments may rely on digital traces left behind by students, such as version histories or tracked changes. While this may seem more objective than the retrospective accounts of students, it narrows what can be captured (self-evaluation is removed, for example) and is more invasive (students have little control over what staff see of their activity).
This raises several questions for those considering process assessments:
Are we implementing process assessments for the right reasons?
I see a significant difference between assessing processes through detached analysis and assessing from within the action. Is it not an educator’s responsibility to humanise the learning process and guide students in developing and articulating their values? It seems strange, for example, to address concerns about students using GenAI by relying on a datafied understanding of learning, produced through digital traces of activity in which the teacher may play no part whatsoever.
In education, the journey matters as much as the destination. How teachers teach influences learners’ ethical understandings and we want teachers to be involved in learning processes, particularly as we navigate the emergence of widely-available GenAI technologies. Therefore, the most important assessment of process may be that done by students—supported by teachers who are part of that process. We want students to be able to make decisions, long after graduation, about how they go about learning, doing and making, and to be capable of considering the potential short- and long-term implications of these decisions.
I like process assessments, even as a way of, potentially, promoting academic integrity. However, I see their primary value as providing opportunities for teachers to become more involved in helping students to develop processes that suit them, and to develop agency and evaluative judgement around their practices, that is, their situated and contextually attuned ways of learning, doing and making.
Banner image source: https://pixabay.com/illustrations/ai-generated-gears-mechanism-8896684/ (used with permission)
The HERDSA Connect Blog offers comment and discussion on higher education issues; provides information about relevant publications, programs and research and celebrates the achievements of our HERDSA members.
HERDSA Connect links members of the HERDSA community in Australasia and beyond by sharing branch activities, member perspectives and achievements, book reviews, comments on contemporary issues in higher education, and conference reflections.
Members are encouraged to respond to articles and engage in ongoing discussion relevant to higher education and aligned to HERDSA’s values and mission. Contact Daniel Andrews (Daniel.Andrews@herdsa.org.au) to propose a blog post for the HERDSA Connect blog.