ACODE 60 Learning Analytics Workshop
ACODE 60's workshop was held today and focused on Learning Analytics. Although MOOCs seem to have overtaken analytics as the issue of greatest interest in e-learning, there is a case to be made that analytics pose a more significant challenge to institutions and their staff. Outside the US at least, the public higher education sector in Western countries is experiencing a sustained focus on quality. Quality in this case means that measured by performance indicators, as scrutinised by government funding agencies and consequently as seen in funding bestowed or withdrawn. These indicators are increasingly dominated by student outcomes such as qualification completion, but also include measures of retention and success within a course of study.
This focus on measures is nothing new, of course; the UK was well acquainted with the results of such regimes a decade ago:
"Appearing frequently are state-established blockages involving efforts to steer all universities in a system by enforced performance budgeting and other top-down oversight in which no good deed goes unpunished: incentives turn into punishments for three out of four institutions. Miss the targets that Ministers an tick off from performance 'agreements', and warnings will be made and funds cut, a sure way to depress initiative. To be entrepenurial, public universities need first of all a very light touch by the state that, among operational advantages, signifies increasing rather than decreasing trust." (Shattock 2003 as summarised in Clark, 2004 p173)
Today's meeting was opened by the Murdoch University PVC for Quality and Standards, Professor Bev Thiele, who reflected on the recent experience of complying with the Australian TEQSA quality process, which is now seen as very much a matter of regulation rather than assurance. The range of information required by this process is driving much of the engagement of Australian institutions with management information systems and with institutional research in its various forms.
The morning session that followed was a series of presentations from a variety of folk who have started exploring Learning Analytics. An important distinction was drawn between Academic Analytics and Learning Analytics: the former reflects the organisation, the latter the pedagogical experience of students, and today's meeting focused very much on the latter. A couple of useful resources were identified. The first was the SNAPP tool, which can be used to visualise activity in discussion forums and analyse the interactions between participants. This looks very useful, as it can be applied retrospectively to any Blackboard forum to help analyse the impact of different strategies or diagnose issues a course might be experiencing.
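I haven't looked inside SNAPP itself, but the kind of social network analysis it performs is easy to sketch. The snippet below is a minimal illustration of the idea rather than SNAPP's actual implementation: the reply data and participant names are invented, and it assumes the networkx library is available.

```python
# A minimal sketch of the kind of interaction analysis a tool like SNAPP
# performs on a forum: build a network of who replied to whom, then look
# for isolated students and dominant hubs. The reply data is hypothetical.
import networkx as nx

# (author, replied_to) pairs, as they might be extracted from a forum export
replies = [
    ("alice", "bob"), ("carol", "alice"), ("bob", "alice"),
    ("dave", "alice"), ("carol", "bob"),
]

graph = nx.DiGraph()
graph.add_edges_from(replies)

# Degree centrality highlights participants who dominate the conversation;
# students missing from the graph entirely never posted or were never
# replied to, which is itself a diagnostic signal.
for student, centrality in sorted(
    nx.degree_centrality(graph).items(), key=lambda kv: -kv[1]
):
    print(f"{student}: {centrality:.2f}")
```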
The second resource is the newly formed Society for Learning Analytics Research, which is attempting to sustain a scholarly community examining the emerging field of Learning Analytics. They have already established a journal, have a very eminent group of academics engaged in the community, and are running events internationally. It is definitely worth participating in, or at the very least watching for developments.
The afternoon's conversation, I think, was where things really got interesting. We started with a presentation from the Murdoch Ethics Manager, Erich von Dietz, who explored the ethical challenges associated with Learning Analytics. The presentation was followed by a panel and a very spirited conversation traversing the issues of ethics, legality, pastoral care, privacy, the distinction between research and institutional quality management, and the intent of Learning Analytics work done both by individuals and as an organisational activity.
Potentially, institutions and individual researchers have access to a vast body of data and rich contextual information on student activities that can be used to learn a variety of useful things about student learning. We saw examples today of research examining the different ways students use technologies like discussion forums and lecture videos, and the ways that information can be used to identify patterns of student engagement which may predict student success or failure.
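To make the predictive idea concrete, the sketch below fits a simple classifier to engagement measures and flags students whose predicted probability of success is low. This is a hedged illustration of the general approach discussed, not any presenter's actual model: the features, data, and threshold are all invented, and it assumes numpy and scikit-learn.

```python
# A toy version of engagement-based prediction: per-student engagement
# counts feed a logistic regression that estimates probability of success.
# All numbers here are fabricated for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-student features: [logins, forum_posts, videos_watched]
X = np.array([
    [25, 10, 8], [30, 12, 9], [5, 0, 1], [8, 1, 2],
    [22, 7, 6], [3, 0, 0], [28, 9, 7], [6, 2, 1],
])
# 1 = passed the unit, 0 = failed (again, invented outcomes)
y = np.array([1, 1, 0, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# Early in semester, score current students and flag anyone whose predicted
# probability of success falls below an arbitrary illustrative threshold.
current = np.array([[4, 1, 0], [26, 8, 7]])
for features, p in zip(current, model.predict_proba(current)[:, 1]):
    flag = "at risk" if p < 0.5 else "on track"
    print(f"engagement={features.tolist()} -> P(success)={p:.2f} ({flag})")
```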
There are, of course, deep complexities to this data, and we certainly saw and appreciated the complexity that simple visualisations obscure, as well as the need for analyses that reflect an understanding of pedagogical variety. We did, however, start to see the real potential for Learning Analytics to open up the possibility of proactive responses to students' learning. The idea that we might be able to help students address learning issues before they start failing assessments and courses is very attractive, not only for institutions concerned about performance but also for staff, who are generally very committed to student success.
This is where it gets tricky, however. If we have the potential to do this type of analysis, and if there is evidence suggesting that it has an impact on student success (both of which are increasingly likely to be true), do we have an obligation to act? Does our duty of care and responsibility as ethical educators require us to use these tools, and (most prickly of all) are we exposed to the risk of litigation if we fail to do so for particular, ultimately unsuccessful, students?
Many analogies can be drawn to the experience of medical researchers, who have long had to consider similar situations. In my own research career I had to deal with the ethical obligations that arose from testing human blood for HIV and Hepatitis B: tests that were done to ensure my safety, but which generated data on individuals so significant that we were ethically obligated to ensure they could be informed of their infection status and appropriately supported through the consequent issues. All this despite the fact that I would never know their identity or see anything other than a small blood sample collected for an entirely different purpose.
A real question is the intent of the activity being undertaken. Clearly there is a difference between the ethical situation of an institution trying to improve students' experience and educational outcomes, and that of a researcher exploring the newly emerging field of Learning Analytics for the benefit of their intellectual curiosity and academic career. But these can be flipped as well: the institution can and does use analytics to monitor student (and staff) internet use for 'inappropriate content', and the researcher is often also a teacher researching their own practice in order to be a more effective teacher for their students. There is also the question of the ways in which analytics measure and monitor staff, and the ethical consequences that can arise from systems explicitly or implicitly identifying individuals through their associations with particular courses.
'Who owns the data?' also came up, a question that cuts across issues of ethics, legality, privacy and intellectual property. Legal precedent implies that institutions may 'own' the data, but whether they can use it is another question. Ethically, universities would claim to hold themselves to a higher standard than commercial organisations; certainly we are perceived as more ethical and trustworthy than business people, lawyers and politicians. The problem is that it is very unclear that we can ethically use the data in a research sense, that is, actually publish real data as opposed to fake data simulating reality. Students can't easily consent to their data being used, because an institution making such a request is automatically in a relationship with a severe power imbalance; the potential for perceived coercion is just too great. We can talk about ideas like the 'greater good', but those have been dismissed in the medical context and almost certainly cannot be justified in the educational space either.
It is possible to consider analyses and responses that avoid individual ethical and privacy concerns by aggregating data and responding in universal ways to student learning issues, but this seems somehow a lessening of the potential of Learning Analytics.
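For what it's worth, that aggregate-only approach is straightforward to picture. The sketch below collapses invented per-student records into unit-level statistics before anything is reported, so no individual can be identified; the data, column names and units are all hypothetical, and it assumes pandas.

```python
# A small illustration of the aggregate-only approach: per-student records
# are reduced to cohort-level statistics before anyone sees them, so the
# analysis can inform universal responses without identifying individuals.
import pandas as pd

# Fabricated records; in practice these would come from LMS exports.
records = pd.DataFrame({
    "unit":        ["BIO101"] * 4 + ["HIS200"] * 4,
    "forum_posts": [0, 2, 9, 12, 1, 1, 0, 3],
    "passed":      [0, 0, 1, 1, 1, 0, 0, 1],
})

# Report only per-unit aggregates, suppressing anything per-student.
summary = records.groupby("unit").agg(
    students=("passed", "size"),
    pass_rate=("passed", "mean"),
    median_posts=("forum_posts", "median"),
)
print(summary)
```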
Ultimately I'm left feeling that this is a space where we are going to have to accept that there are things we can do that we choose not to do as ethical professionals. Mostly that just means we can't publish real data; we may have to accept the need to use simulated examples to demonstrate the utility of systems, and public, aggregated data to show the impact on cohorts. That doesn't mean we don't work internally as teachers with individual students; it just means that as researchers we can't get the fullest personal benefit. If that means we can continue to distinguish ourselves as more ethical practitioners and institutions than the commercial exploiters operating in the educational space online, then I think we can be proud of that distinction.