May 08 2015
Image of surgical student activity data presented by Paula Smith at the Learning Analytics Event

Today I am at the UK Learning Analytics Network organised by the University of Edinburgh in Association with Jisc. Read more about this on the Jisc Analytics blog. Update: you can also now read a lovely concise summary of the day by Niall Sclater, over on the Jisc Analytics blog.

As this is a live blog there may be spelling errors, typos etc. so corrections, comments, additions, etc. are welcome. 

Introduction – Paul Bailey

I’m Paul Bailey, Jisc lead on the Learning Analytics programme at the moment. I just want to say a little bit about the network. We have various bits of project activities, and the network was set up as a means for us to share and disseminate the work we have been doing, but also so that you can network and share your experience working in Learning Analytics.

Housekeeping – Wilma Alexander, University of Edinburgh & Niall Sclater, Jisc

Wilma: I am from the University of Edinburgh and I must say I am delighted to see so many people who have traveled to be here today! And I think for today we shouldn’t mention the election!

Niall: I’m afraid I will mention the election… I’ve heard that Nicola Sturgeon and Alex Salmond have demanded that Tunnock’s Teacakes and Caramel Wafers must be served at Westminster! [this gets a big laugh as we’ve all been enjoying caramel wafers with our coffee this morning!]

I’ll just quickly go through the programme for the day here. We have some really interesting speakers today, and we will also be announcing the suppliers in our learning analytics procurement process later on this afternoon. But we kick off first with Dragan.

Doing learning analytics in higher education: Critical issues for adoption and implementation – Professor Dragan Gašević, Chair in Learning Analytics and Informatics, University of Edinburgh

I wanted to start with a brief introduction on why we use learning analytics. The use of learning analytics has become something of a necessity because of the growing needs of education – the growth in the number of students and the diversity of students, with MOOCs being a big part of that realisation that many people want to learn who do not fit our standard idea of what a student is. The other aspect of MOOCs is their scale: as we grow the number of students it becomes difficult to track progress, and the feedback loops between students and instructors are lost or weakened.

In learning analytics we depend on two types of major information systems… Universities have had student information systems for a long time (originally paper, computerised 50-60 years ago), but they also use learning environments – the majority of universities have some online coverage of this kind for 80-90% of their programmes. But we also don’t want to exclude other platforms, including communications and social media tools. And no matter what we do with these technologies we leave a digital trace, and that is not a reversible process at this point.

So, we have all this data but what is the point of learning analytics? It is about using machine learning, computer science, etc. approaches in order to inform education. We defined learning analytics as being “measurement, collection, analysis, and reporting” of education, but actually that “how” matters less than the “why”. It should be about understanding and optimising learning and the environments in which learning occurs. And it is important not to forget that learning analytics are there to understand learning, and what learning is about.

Some case studies include Course Signals at Purdue. They use Blackboard as their learning management system. They wanted to predict which students would successfully complete their courses, and to identify those at risk. They wanted to segment their students into high risk, at risk, or not at risk at all. Having done that they used a traffic light system to reflect that, and that traffic light for each student was shown both to staff and students. When they trialled that (Arnold and Pistilli 2012) with a cohort of students, they saw greater retention and success. But if we look back at how I framed this, we need to think about this in terms of whether this changes teaching…
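
To make the traffic light idea concrete, here is a minimal sketch of how a predicted risk score might be banded into the kind of signal described above. This is not Purdue’s actual algorithm; the thresholds and function name are illustrative assumptions only.

```python
# A minimal sketch (not Purdue's actual algorithm) of banding a predicted
# risk score into a traffic-light signal. Thresholds are assumptions.

def traffic_light(risk_probability: float) -> str:
    """Map a 0-1 risk-of-non-completion estimate onto a signal colour."""
    if risk_probability >= 0.7:
        return "red"      # high risk: prompt an intervention
    elif risk_probability >= 0.4:
        return "amber"    # at risk: monitor and nudge
    return "green"        # not at risk

# Example: a student with an estimated 0.55 risk would be shown amber.
print(traffic_light(0.55))
```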

So, also at Purdue, they undertook a project analysing the email content sent by instructors to students. They found that instructors did not give more detailed formative feedback; they just increased the summative feedback. So this really indicates that learning analytics has to feed into changes in teaching practices in our institutions, and we need our learning analytics to provide signalling and guidance that enables teaching staff to improve their practice, and give more constructive feedback. (see Tanes, Arnold, King and Remnet 2011).

The University of Michigan looked at “gateway courses” as a way to understand performance in science courses (see Wright, McKay, Hershock, Miller and Tritz 2014). They defined a measure for their courses, which was “better than expected”. There were two inputs for this: previous GPA, and the goals set by students for the current course. They then used predictive models for how students could be successful, and ways to help students to perform better than expected. They have also been using technology designed for behavioural change, which they put to use here… Based on that work they generated personalised messages to every student, based on a rationale for those students, and also providing predicted performance for particular students. For instance an example here showed that a student could perform well beyond their own goals, which might have been influenced by the science course not being their major. The motivator for students here was productive feedback… They interviewed successful students from previous years, used that to identify behaviours etc. that led to success, and presented that as feedback from peers (rather than instructors). And I think this is a great way to show how we can move away from very quantitative measures, towards qualitative feedback.
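
A hedged sketch of the “better than expected” idea: compare an observed course grade against a baseline built from previous GPA and the goal the student set. The field names and the simple max() baseline are illustrative assumptions, not the University of Michigan model.

```python
# Hypothetical "better than expected" calculation; not the actual Michigan model.

def better_than_expected(prev_gpa: float, goal_grade: float, actual_grade: float) -> float:
    """Return how far (in grade points) the student exceeded expectations."""
    expected = max(prev_gpa, goal_grade)   # a deliberately simple baseline
    return actual_grade - expected

# Example: previous GPA 3.0, goal 3.2, achieved 3.6 -> +0.4 better than expected.
print(better_than_expected(3.0, 3.2, 3.6))
```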

So, to what extent are institutions adopting these approaches? Well, there are very few institutions with institution-wide examples of adoption. For instance the University of Michigan only used this approach on first year science courses. They are quite a distributed university – like Edinburgh – which may be part of the explanation. Purdue also only used this on some courses.

Siemens, Dawson and Lynch (2014) surveyed the use of learning analytics in the HE sector, asking about the level and type of adoption, ranking these from “Awareness” to “Experimentation” to “Organisation/Students/Faculty”, “Organisational Transformation” and “Sector Transformation”. Siemens et al found that the majority of HE is at the Awareness and Experimentation phases. Similarly Goldstein and Katz (2005) found 70% of institutions at phase 1. That figure is higher now, but bear in mind that the remaining 30% were not necessarily much further along the process. There is still much to do.

So, what is necessary to move forward? What are the next steps? What do we need to embrace in this process? Well, let’s talk a bit about direction… Metaphors from business analytics can be useful; we can borrow lessons from that process. McKinsey offered a really interesting business model of Data – Model – Transform (see Barton and Court 2012). That can be a really informative process for us in higher education.

Starting with Data – traditionally when we choose to measure something in HE we turn to surveys, particularly student satisfaction surveys. But these do not have a huge return rate in all countries, and more importantly surveys are not the most accurate instrument. We also have progress statistics – these sit in our learning systems as data, but are they useful? We can also derive social networks from these systems, from interactions and from course registration systems – and knowing who students hang out with can predict how they perform. So we can get this data, but then how do we process and understand that data? I know some institutions find a lack of IT support can be a significant barrier to the use of learning analytics.

Moving onto Model… Everyone talks about predictive modelling, but the question has to be about the value of a predictive model. Often organisations just see this as an outsourced thing – relying on some outside organisation and data model that provides solutions, but does not do so within the context of understanding what the questions are. And the questions are critical.

And this is, again, where we can find ourselves forgetting that learning analytics is about learning. So there are two things we have to know about, and think about, to ensure we understand what analytics mean:

(1) Instructional conditions – different courses in the same school, or even in the same programme, will have a different set of instructional conditions – different approaches, different technologies, different structures. We did some research at a university through their Moodle presence and we found some data that was common to 20-25% of courses, but we also identified some data you could capture that was totally useless (e.g. time online). And we found some variables that explained 80% of variance, for example extensive use of Turnitin – not just for plagiarism but also by students for gathering additional feedback. One of our courses defied all trends… they had a Moodle presence but when we followed up on this, we found that most of their work was actually in social media, so data from Moodle was quite misleading and certainly a partial picture. (see Gasevic, Dawson, Rogers, Gasevic, 2015)

(2) Learner agency – this changes all of the time. We undertook work on the agency of learners, based on log data from a particular course. We explored 6 clusters using cluster matching algorithms… We found that there was a big myth that more time on task would lead to better performance… One of our clusters spent a great deal of time online, another was well below average. When we compared clusters we found that the top students were the group spending the least time online, while the cluster spending the most time online performed about average. This shows that this is a complex question. Learning styles aren’t the issue; learning profiles are what matter here. In this course, one profile works well, in another a different profile might work much better. (see Kovanovic, Gasevic, Jok… 201?)
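
As an illustration of the kind of cluster analysis described above, here is a minimal sketch: group students by log-derived features and compare cluster-level performance afterwards. The feature names, the toy data, the use of k-means and the choice of k=3 (the talk mentioned 6 clusters) are all assumptions for illustration, not the study’s method.

```python
# Sketch: cluster students on log-derived activity features, then compare
# performance per cluster afterwards. Data and features are invented.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# rows = students, columns = [minutes_online, sessions, forum_posts]
X = np.array([
    [1200, 40, 25],
    [300, 10, 2],
    [900, 35, 30],
    [150, 8, 1],
    [600, 20, 12],
    [1100, 15, 3],
])

X_scaled = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_scaled)
print(labels)  # cluster membership; grades would then be compared per cluster
```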

And a conclusion for this section is that our analytics and analysis cannot be generalised.

Moving finally to Transform, we need to ensure participatory design of analytics tools – we have to engage our staff and students in these processes early on; we won’t get institutional transformation by relying only on the needs of statisticians. Indeed visualisations can be harmful (Corrin and de Barba 2014). The University of Melbourne looked at the use of dashboards and similar systems and reported that students who were high achieving, with high GPAs and high aspirations, actually under-performed when they saw that they were doing better than average, or better than their goals. And for those doing less well we can just reinforce issues in their self-efficacy. So these tools can be harmful if not designed in a critical way.

So, what are the realities of adoption? Where are the challenges? In Australia I am part of a study commissioned by the Australian Government, based in South Australia, which engages with the entire Australian tertiary sector. We interviewed every VC and the managers responsible for learning analytics. Most institutions are in phase 1 or 2… Their major goal was to enable personalised learning – one of the late phases… They seemed to think that magically they would move from experimentation to personalised learning; they don’t seem to understand the process to get there…

We also saw some software-driven approaches: institutions buy an analytics package and perceive the job is done.

We also see a study showing that there is a lack of a data-informed decision making culture, and/or data not being suitable for informing those types of decisions. (Macfadyen and Dawson 2012).

We also have an issue here that researchers are not focused on scalability here… Lots of experimentation but… I may design beautiful scaffolding based on learning analytics, but I have to think about how that can be scaled up to people who may not be the instructors for instance.

The main thing I want to share here is that we must embrace the complexity of educational systems. Learning analytics can be very valuable for understanding learning but they are not a silver bullet. For institutional or sectoral transformation we need to embrace that complexity.

We have suggested the idea of the Rapid Outcome Mapping Approach (ROMA) (Macfadyen, Dawson, Pardo, Gasevic 2014) in which, once we have understood the objectives of learning analytics, we also have to understand the political landscape in which they sit and the financial contexts of our organisations. We have to identify stakeholders, and to identify the desired behaviour changes we want from those stakeholders. We also have to develop an engagement strategy – we cannot mandate a single approach; a solution has to provide incentives for why someone should (or should not) adopt learning analytics. We have to analyse our internal capacity to effect change – especially in the context of analytics tools and taking any value from them. And we finally have to evaluate and monitor change. This is about capacity development, and capacity development across multiple teams.

We need to learn from successful examples – and we have some to draw upon. The Open University adopted an organisational strategy inspired by the ROMA approach (see Tynan and Buckingham Shum 2013). They developed the model of adoption that is right for them – other institutions will want to develop their own, aligned to their institutional needs. We also need cross-institutional experience sharing and collaboration (e.g. SoLAR, the Society for Learning Analytics Research). This meeting today is part of that. And whilst there may be some competition between institutions, this process of sharing is extremely valuable. There are various projects here, some open source, to enable different types of solution and the sharing of experience.

Finally we need to talk about ethical and privacy considerations. There is a tension here… Some institutions hold data, and think students need to be aware of the data held… But what if students do not benefit from seeing that data? How do we prepare students to engage with that data, to understand it? The Open University is at the leading edge here and has a clear policy on the ethical use of student data. Jisc also has a code of practice for learning analytics, which I welcome and think will be very useful for institutions looking to adopt learning analytics.

I also think we need to develop an analytics culture. I like to use the analogy of, say, Moneyball, where analytics make a big difference… but analytics can be misleading. Predictive models have their flaws, their false positives etc. So a contrasting example would be the Trouble with the Curve – where analytics mask underlying knowledge of an issue. We should never reject our tacit knowledge as we look at adopting learning analytics.


Q – Niall): I was struck by your comments about asking the questions… But doesn’t that jar with the idea that you want to look at the data and explore questions out of that data?

A – Dragan): A great question… As a computer scientist I would love to just explore the data, but I hang out with too many educational researchers… You can start from data and make sense of that. It is valid. However, whenever you have certain results you have to ask certain questions – does this make sense in the context of what is taking place, does this make sense within the context of our institutional needs, and does this make sense in the context of the instructional approach? That questioning is essential no matter what the approach.

Q – Wilma) How do you accommodate the different teaching styles and varying ways that courses are delivered?

A – Dragan) The most important part here is the development of capabilities – at all levels and in all roles, including students. So in this Australian study we identified trends, found these clusters… But some of these courses are quite traditional and linear, others are more ambitious… They have a brilliant multi-faceted approach. Learning analytics would augment this… But when we aggregate this information… when you have more ambitious goals, the more there is to do. Time is required to adopt learning analytics with sophistication. But we also need to develop tools for the needs and tasks of stakeholders, so that stakeholders are capable of working with them – without expecting too much of their users. There aren’t that many data scientists, so perhaps we shouldn’t use visualisations at all, maybe just prompts triggered by the data… And we also want to see more qualitative insights into our students… their discussion… when they are taking notes… That then really gives an insight… Social interactions are so beneficial and important to student learning.

Q – Wilbert) You mentioned that work in Australia about Turnitin… What was the set-up there that led to that… Or was it just the plagiarism detection use?

A – Dragan) It turned out to be the feedback being received through Turnitin, not the plagiarism side. Primarily it was on the learner side, not so much the instructors. There is an ethical dilemma there if you do expose that to instructors… if students are using the system to get feedback… Those were year one students, and many were international students from Asia and China, where cultural expectations around reproducing knowledge are different… So that is also important.

Q) Talking about the Purdue email study, and staff giving formative feedback to students at risk – how did that work?

A) They did analysis of these messages, and the content of them, and found staff mainly giving motivational messages. I think that was mainly because the traffic light system indicated the at-risk nature but not why that was the case… you need that information too.

Q) Was interested in rhetoric of personalised learning by Vice Chancellors, but most institutions being at stage 1 or 2… What are the institutional blockers? How can they be removed?

A) I wish I had an answer there! But senior leaders are sometimes forced to make decisions based on financial needs, not just driven by data or unaware of data. Many Australian institutions are small organisations, with limited funding… and the continued existence of the institution is part of what they have to face, quite aside from adoption of learning analytics. But also the University of Melbourne is a complex institution; a leading researcher there cannot roll out the same solution across very different schools and courses…

Niall: And with that we shall have to end the Q&A and hand over to Sheila, who will talk about some of those blockers…

Learning Analytics: Implementation Issues – Sheila MacNeill, Glasgow Caledonian University

I was based at CETIS and involved in learning analytics for a lot of that time… But for the last year and a half I have been based at Glasgow Caledonian University. And today I am going to talk about my experience of moving from that overview position to being in an institution and actually trying to do it… I’m looking for a bit of sympathy and support, but hoping also to contextualise some of what Dragan talked about.

Glasgow Caledonian University has about 17,000 students, mostly campus based although we are looking at online learning. We are also committed to blended learning. We provide central support for the university, working with learning technologies across the institution. So I will share my journey… joys and frustrations!

One of the first things I wanted to do was to get my head around what kind of systems we had around the University… We had a VLE (Blackboard) but I wanted to know what else people were using… This proved very difficult. I spoke to our IS department but finding the right people was challenging – a practical issue to work around. So I decided to look a bit more broadly with a mapping of what we do… looking across our whole technology position. I identified the areas and what fitted into those areas:

  • (e) Assessment and feedback – Turnitin (we see a lot of interest in rubrics and in marking and feedback processes, which seem to be having a big impact on student success; plagiarism detection actually isn’t its main usefulness the more you use it), Grade Centre, wikis/blogs/journals, peer/self assessment, (e)feedback.
  • (e) Portfolios – wikis/blogs/journals, video/audio – doing trials with nursing students of a mobile app in this space.
  • Collaboration – discussion boards, online chat, video conferencing etc.
  • Content – lectures, PDFs, etc….

I’ve been quite interested in Mark’s (?) idea of a “core VLE”. Our main systems group around the SRS (student records system – newly renamed from its former name, ISIS), GCU Learn, the Library, and 3rd party services. When I did hear from our IS team I found such a huge range of tools that our institution has been using – it seems like every tool under the sun has been used at some point.

In terms of data… we can get data from our VLE, from Turnitin, from wikis etc. But it needs a lot of cleaning up. We started looking at our data, trying it on November data from 2012 and 2013 (seemed like a typical month). And we found some data we would expect, changes/increases of use over time. But we don’t have data on a module level, or programme level, etc. Hard to view in detail or aggregate up (yet). We haven’t got data from all of our systems yet. I would say we are still at the “Housekeeping” stage… We are just seeing what we have, finding a baseline… There is an awful lot of housekeeping that needs to be done, a lot of people to talk to…
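
As a flavour of this “housekeeping” stage, here is a hedged sketch: pull raw VLE activity exports for two comparable months and compare usage. The file name, column names and CSV layout are assumptions; real Blackboard/Turnitin exports differ and need far more cleaning than this.

```python
# Sketch of a baseline comparison over November 2012 vs November 2013.
# Assumed columns: timestamp, user_id, tool.

import pandas as pd

logs = pd.read_csv("vle_activity.csv", parse_dates=["timestamp"])
nov = logs[logs["timestamp"].dt.month == 11]   # keep only the November data
summary = (
    nov.assign(year=nov["timestamp"].dt.year)
       .groupby(["year", "tool"])              # e.g. tool = discussion board, Turnitin
       .agg(events=("user_id", "size"),
            active_users=("user_id", "nunique"))
)
print(summary)
```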

But as I was beginning this process I realised we had quite a number of business analysts at GCU who were happy to talk. We have been drawing out data. We can make dashboards easily, but USEFUL dashboards are proving more tricky! We have meanwhile been talking to Blackboard about their data analytics platform. It is interesting thinking about that… given the state we are in with learning analytics, and finding a baseline, we are looking at investing some money to see what data we can get from Blackboard that might enable us to start asking some questions. There are some things I’d like to see from, for example, combining on-campus library card data with VLE data. And also thinking about engagement and what that means… Frustratingly for me, it is quite hard to get data out of Blackboard… I’m keen that the next licence we sign actually has a clause about the data we want, in the format we want, when we want it… No idea if that will happen but I’d like to see that.

Mark Stubbs (MMU) has this idea of a tube map of learning… This made me think of the Glasgow underground map – going in circles a bit, not all joining up. We really aren’t quite there yet; we are having conversations about what we could, and what we should, do. In terms of senior management interest in learning analytics… there is interest. But when we sent out the data we had looked at, we did get some interesting responses. Our data showed a huge increase in mobile use – we didn’t need a bring-your-own-device policy, students were already doing it! We just need everything mobile ready. Our senior staff are focused on NSS and student survey data, that’s a major focus. I would like to take that forward to understand what is happening, in a more structured way…

And I want to finish by talking about some of the issues that I have encountered. I came in fairly naively to this process. I have learned that…

Leadership and understanding is crucial – we have a new IS director, which should make a great difference. You need both carrots and sticks, and that takes a real drive from the top to make things actually start.

Data is obviously important. Our own colleagues have issues accessing data from across the institution. People don’t want to share, or they don’t know if they are allowed to share. There is a cultural thing that needs investigating – and that relates back to leadership. There are also challenges that are easy to fix, such as server space. But those bigger issues of access/sharing/ownership all really matter.

Practice can be a challenge. Sharing of experience and engagement with staff, having enough people understanding systems, is all important for enabling learning analytics here. The culture of talking together more, having a better relationship within an institution, matters.

Specialist staff time matters – as Dragan highlighted in his talk. This work has to be prioritised – a project focusing on learning analytics would give the remit for that, give us a clear picture, establish what needs to be done. To not just buy in technology but truly assess needs before doing that, and in adopting technology. There is potential but learning analytics has to be a priority if it is to be adopted properly.

Institutional amnesia – people can forget what they have done, why, and why they did not do it before… more basic housekeeping again really. Understanding, and having tangible evidence of, what has been done and why is also important more broadly when looking at how we use technologies in our institutions.

Niall: Thanks for such an honest appraisal of a real experience there. We need that in this community, not just explaining the benefits of learning analytics. The Open University may be ahead now, but it also faced some of those challenges initially for instance. Now, over to Wilma.

Student data and Analytics work at the University of Edinburgh – Wilma Alexander, University of Edinburgh

Some really interesting talks already today – I’ll whiz through some sections, in fact, as I don’t need to retread some of this. I am based in Information Services. We are a very, very large, very old University, and it is very general. We have a four year degree. All of that background makes what we do with student data something it is hard to generalise about.

So, the drivers for the project I will focus on came out of the understanding we already have about the scale and diversity of this institution. Indeed we are increasingly encouraging students to make imaginative crossovers between schools and programmes, which adds to this. Another part of the background is that we have been seriously working in online education; in addition to a ground-breaking digital education masters delivered online, we also have a number of other online masters. And a further part of the background is that we have a long-term set of processes that encourages students to contribute to the discussions within the university, as owners and shapers of their own learning.

So, we have an interest in learning analytics, and in understanding what students are doing online. We got all excited by the data and probably made the primary error of thinking about how we could visualise that data in pretty pictures… but we calmed down quite quickly. As we turned this into a proper project we framed it much more in the context of empowering students around their activities, using data we already have about our students. We have two centrally supported VLEs at Edinburgh (and others!). Blackboard Learn is our main and largest system – virtually all on-campus programmes use that VLE in some way – but for online distance programmes we took the opportunity to try out Moodle, which is used largely by online programmes, largely created as online distance masters programmes. So, already there is a big distance between how these tools are used in the university, never mind how they are adopted.

There’s a video which shows the idea of building an airplane whilst in the air… this project’s first phase, in 2014, has felt a bit like that at times! We wanted to see what might be possible, but we also started by thinking about what might be displayed to students. Both Learn and Moodle give you some data about what students do in your courses… but that is for staff, not visible to students. When we came to look at the existing options… none of what Learn offers quite did what we wanted, as none of the reports were easily made student-facing (currently Learn does BIRT reports, course reports, stats columns in the grade center, etc). We also looked at Moodle and there was more there – it is open source and developed by the community, so we looked at the available options there…

We were also aware that there were things taking place in Edinburgh elsewhere. We are support not research in our role, but we were aware that colleagues were undertaking research. So, for instance my colleague Paula Smith was using a tool to return data as visualisations to students.

What we did as a starting point was to go out and collect user stories. We asked both staff and students, in terms of the information available in the VLE(s), what sort of things would be of interest. We framed this as a student, as a member of staff, as a tutor… as “As a… I want to… So that I can…”. We had 92 stories from 18 staff and 32 students. What was interesting here was that much of what was wanted was already available. For staff, much of the data they wanted was already available to them – they really just had to be shown it and supported to find it. Some of the stuff that came in as “not in scope” was not within the very tight boundaries we had set for the project. But there were a number of things of interest, requests for information, that we passed on to appropriate colleagues – one area for this was reading lists, and we have a tool that helps with that, so we passed that request on to library colleagues.

We also pulled together some staff concerns… and this illustrates what both Dragan and Sheila have said about the need to improve the literacy of staff and students using this kind of information, and the need to contextualise it… e.g. “As a teacher/personal tutor I want to have measures of activity of the students so that I can be alerted to who is falling by the wayside” – there is a huge gap between activity data and that sort of indicator.

Student concerns were very thoughtful. They wanted to understand how they compare, to track progress, they also wanted information on timetables of submissions, assignment criteria/weighting etc. We were very impressed by the responses we had and these are proving valuable beyond the scope of this project…

So, we explored possibilities, and then moved on to see what we could build. And this is where the difference between Learn and Moodle really kicked in. We initially thought we could just install some of the Moodle plugins, and allow programmes to activate them if they wanted to… But that fell at the first hurdle, as we couldn’t find enough staff willing to be that experimental with a busy online MSc programme. The only team up for some of that experimentation were the MSc in Digital Education team, and the work was done as part of a teaching module in some strands of the masters. This was small-scale, hand-cranked work using some of these tools. One of the issues with pretty much all of these tools is that they are staff-facing and therefore not anonymous. So we had to do that hand-cranking to make the data anonymous.

We had lots of anecdotal and qualitative information through focus groups and this module, but we hope to pin a bit more down on that. Moodle is of interest as it serves online distance students… there is some evidence that communication, discussion etc. activity is a reasonable proxy for performance here, as those students have to start with the VLE.

Learn is a pretty different beast as it is on campus. Blended may not have permeated as strongly on campus. So, for Learn what we do have this little element that produces a little click map of sorts (engagements, discussion, etc)… For courses that only use the VLE for lecture notes, that may not be useful at all, but for others it should give some idea of what is taking place. We also looked at providing guidebook data – mapping use of different week’s sections/resources/quizzes to performance.

We punted those ideas out. The activity information didn’t excite folk as much (32% thought it was useful). The grade information was deemed much more useful (97% thought it was useful)… But do we want our students hooked on that sort of data? Could it have negative effects, as Dragan talked about? And how useful is that overview?

When it came to changes in learning behaviour we had some really interesting and thoughtful responses here. Of the three types of information (discussion boards, grade, activity) it was certainly clear though that grade was where the student interest was.

We have been looking at what courses use in terms of tools… Taking a very broad-brush view of 2013/14 courses, we can see what they use and turn on. For social/peer network functionality – where we think there really is huge value – the percentage of courses actively using those tools on campus is way below the percentage using the VLE for the other functions of content + submission/assessment and discussion boards.

So context really is all – reflecting Dragan again here. It has to work for individuals on a course level. We have been mapping our territory here – the university as a whole is hugely engaged in online and digital education in general, and very committed to this area, but there is work to do to join it all up. When we did information gathering we found people coming out of the woodwork to show their interest. The steering group from this project has a representative from our student systems team, and we are talking about where student data lives, privacy and data protection, ethics, and of course also technical issues quite apart from all that… So we also have the Records Management people involved. And because Jisc has these initiatives, and there is an EU initiative, we are tightly engaging with the ethical guidance being produced by both of these.

So, we have taken a slight veer away from doing something for everyone in the VLEs in the next year. The tool will be available to all, but what we hope to do is to work very closely with a small number of courses, course organisers, and students, to really unpick, at a course level, how the data in the VLE gets built into the rest of the course activity. So that goes back to the idea of having different models, and applying the model for that course, and for those students. It has been a journey, and it will continue…

Using learning analytics to identify ‘at-risk’ students within eight weeks of starting university: problems and opportunities – Avril Dewar, University of Edinburgh

This work I will be presenting has been undertaken with my colleagues at the Centre for Medical Education, as well as colleagues in the School of Veterinary Medicine and also Maths.

There is good evidence that performance in first year will map quite closely to performance in a programme as a whole. So, with that in mind, we wanted to develop an early warning system to identify student difficulties and disengagement before they reach assessment. Generally the model we developed worked well. About 80% of at-risk students were identified. And there were large differences between the most and least at-risk students – between the lowest and the highest risk scores – which suggests this was a useful measure.

The measures we used included:

  • Engagement with routine tasks
  • Completion of formative assessment – including voluntary formative assessment
  • Tutorial attendance (and punctuality where available) – but this proved least useful.
  • Attendance at voluntary events/activities
  • Virtual Learning Environment (VLE) exports (some)
    • Time until first contact proved to be the most useful of these

We found that the measures sometimes failed because the data exports were not always in a useful or appropriate form (e.g. VLE tables of 5,000 columns). Patterns of usage were hard to investigate, as raw data on, for example, the time of day of accesses was not properly usable, though we think that would be useful. Similarly there is no way to know whether a long session means a student has logged in, then Googled or left their machine, then returned – or whether it indicates genuine engagement.
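
To show the shape of this kind of cumulative model, here is a minimal sketch built from the measures listed above. The weights, field names and threshold idea are assumptions for illustration, not the Centre for Medical Education’s actual model.

```python
# Sketch of a cumulative early-warning score; weights are invented.

WEIGHTS = {
    "missed_routine_tasks": 2.0,
    "missed_formative_assessments": 3.0,
    "missed_tutorials": 1.0,          # reported as the least useful measure
    "missed_voluntary_events": 1.5,
    "days_to_first_vle_contact": 0.5, # reported as the most useful VLE export
}

def risk_score(student: dict) -> float:
    """Sum weighted indicators; higher scores suggest higher risk."""
    return sum(WEIGHTS[k] * student.get(k, 0) for k in WEIGHTS)

student = {"missed_routine_tasks": 3, "missed_formative_assessments": 1,
           "missed_tutorials": 0, "missed_voluntary_events": 2,
           "days_to_first_vle_contact": 14}
print(risk_score(student))  # students above a chosen threshold would be flagged
```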

To make learning analytics useful, we think we need the measures, and the data supporting them, to be: simple; comprehensible; accessible; comparable to data from other systems (e.g. we could have used library data alongside our VLE data); easy to scale (e.g. common characteristics between schools); not replicating existing measures; able to discriminate between students (some of the most useful things, like time to first contact, do this); and stored centrally.

We also found there were things that we could access but didn’t use, some for ethical and some for practical reasons. Using IP addresses for location was an ethical issue for us, and we had similar concerns about discussion boards – we didn’t want students to be put off participating in discussions. Or the time taken to answer individual questions. Theoretical issues that could be raised include: evidence that a student has been searching essay-buying websites; a student being absent from class and claiming to be ill while their IP address shows another location; etc.

There were also some concerns about the teacher-student relationship. Knowing too much can create a tension in the student-teacher relationship. And the data one could gather about a student could become a very detailed tracking and monitoring system… for that reason we always aim to be conservative, rather than exhaustive in our data acquisition.

We have developed training materials and we are making these open source so that we can partner with other schools, internationally. Whilst each school will have its own systems and data, we are keen to share practice and approaches. Please do get in touch if you would like access to the data, or would like to work with us.


Q – Paula) Do you think there is a risk of institutions sleepwalking into student dissatisfaction? We are taking a staged approach… but I see less effort going into intervention, into the staff side of what could be done… I take it that email was automated… Scalability is good for that, but I am concerned students won’t respond to that as it isn’t really personalised at all. And how were students in your project, Avril, notified?

A – Avril) We did introduce peer led workshops… We are not sure if that worked yet, still waiting for results of those. We emailed to inform our students if they wanted to be part of this and if they wanted to be notified of a problem. Later years were less concerned, saw the value. First year students were very concerned, so we phrased our email very carefully. When a student was at risk emails were sent individually by their personal tutors. We were a bit wary of telling students of what had flagged them up – it was a cumulative model… were concerned that they might then engage just with those things and then not be picked up by the model.

Niall: Thank you for that fascinating talk. Have you written it up anywhere yet?

Avril: Soon!

Niall: And now to Wilbert…

The feedback hub; where qualitative learning support meets learning analytics – Wilbert Kraan, Cetis

Before I start I have heard about some students gaming some of the simpler dashboards so I was really interested in that.

So, I will be short and snappy here. The Feedback Hub work has just started… these are musings and questions at this stage. This work is part of the larger Jisc Electronic Management of Assessment (EMA) project. And we are looking at how we might present feedback and learning analytics side by side.

The EMA project is a partnership between Jisc, UCISA and HeLF. It builds on earlier Jisc Assessment and Feedback work, and it is a co-design project that identifies priorities and solution areas… and we are now working on solutions. So one part of this is about EMA requirements and workflows, particularly the integration of data (something Sheila touched upon). There is also work taking place on an EMA toolkit that people can pick up and look at. And then there is the Feedback Hub, which I’m working on.

So, there is a whole assessment and feedback lifecycle (borrowed, with their permission, from a model developed by Manchester Metropolitan). This goes from specifying, to setting, supporting, submitting, marking and production of feedback, recording of grades, etc… and those latter stages are where the Feedback Hub sits.

So, what is a feedback hub really? It is a system that provides a degree-programme or life-wide view of assignments and feedback. The idea is that it moves beyond the current module that you are doing, to look across modules and across years. There will be feedback that is common across areas, which gives a holistic view of what has already been done. So this is a new kind of thing… When I looked at the nearest existing tools I found VLE features – a database view of all assignments for a particular student, for learner and tutor to see: a simple clickable list that is easy to do and does help. Another type is a tutoring or assignment management system – capturing timetables of assignments, tutorials etc. These are from the tutor perspective; some show feedback as well. And then we have assignment services – including Turnitin – about plagiarism, but also about managing the logistics of assignments, feedback etc.

So, using those kinds of tools you can see feedback as just another thing that gets put in the learning records store pot, in some ways. But feedback can be quite messy; it is hard to disentangle in-line feedback from the document itself. Teachers approach feedback differently… though pedagogically the qualitative formative feedback that appears in these messy ways can be hugely valuable. Also, these online assessment management tools can be helpful for mapping and developing learning outcomes and rubrics – connecting those to the assignment, you can gain some really interesting data… There is also the potential for Computer Aided Assessment feedback – sophisticated automated data on tests and assignments, which works well in some subjects. And possibly one of the most interesting pieces of learning analytics data is on engagement with feedback. A concern from academic staff is that you can give rich feedback, but if the students don’t use it how useful is it really? So capturing that could be useful…

So, having identified those sources, how do we present such a holistic view? One tool presents this as an activity stream – like Twitter and Facebook – with feedback part of that chronological list of assignments… We know that that could help. Also an expanding learning outcomes rubric – click it to see the feedback connected to it; would that be helpful? We could also do text extraction, something like Wordle, but would that help? Another thing we might see is clickable grades – to understand what a grade means… And finally, should we combine the feedback hub with analytics data visualisations?
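
As a small sketch of the “text extraction, something like Wordle” idea: count the most frequent substantive words across a student’s accumulated feedback so recurring themes surface. The stop-word list and sample feedback are illustrative only.

```python
# Sketch: surface recurring themes in accumulated feedback text.

from collections import Counter
import re

feedback_items = [
    "Good structure, but referencing needs more care.",
    "Argument is clear; referencing style still inconsistent.",
    "Strong analysis, check referencing and proofreading.",
]

STOP = {"the", "and", "is", "but", "more", "still", "a", "of"}
words = [w for text in feedback_items
         for w in re.findall(r"[a-z']+", text.lower())
         if w not in STOP]

print(Counter(words).most_common(5))  # e.g. 'referencing' emerges as a theme
```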

Both learning analytics and feedback track learning progress over time, and try to predict the future. Feedback related data can be a useful learning analytics data source.


Q – Me) Adoption and issues of different courses doing different things? Student expectations and added feedback?

A) This is an emerging area… IET in London/University of London have been trialing this stuff… they have opened that box… Academic practice can make people very cautious…

Comment) Might also address the perennial student question of wanting greater quality feedback… Might address deficit of student satisfaction

A) Having a coordinated approach to feedback… From a pedagogical point of view that would help. But another issue there is that of formative feedback, people use these tools in formative ways as well. There are points of feedback before a submission that could be very valuable, but the workload is quite spectacular as well. So balancing that could be quite an interesting thing.

Jisc work on Analytics – update on progress to date – Paul Bailey, Jisc and Niall Sclater

Paul: We are going to give you a bit of an update on where we are on the Learning Analytics project, and then after that we’ll have some short talks and then will break out into smaller groups to digest what we’ve talked about today.

The priorities we have for this project are: (1) basic learning analytics solution, an interventions tool and a student tool; (2) code of practice for learning analytics; and (3) learning analytics support and network.

We are a two year project, with the clock ticking from May 2015. We have started by identifying suppliers to initiate contracts and develop products; then institutions will be invited to participate in the discovery stage or pilots (June-Sept 2015). Year 1 in Sept 2015-2016 we will run that discovery stage (10-20 institutions), pilots (10+ institutions); institutions move from discovery to pilot. Year 2 will be about learning from and embedding that work. And for those of you that have worked with us in the past, the model is a bit different: rather than funding you then learning from that, we will be providing you with support and some consultancy and learning from this as you go (rather than funding).

Michael Webb: So… we have a diagram of the process here… We have procured a learning records warehouse (the preferred supplier there is HT2). The idea is that VLEs, student information systems and library systems feed into that. There was talk today of Blackboard being hard to get data out of, but we do have Blackboard on board.

Diagram of the Jisc Basic Learning Analytics Solution presented by Paul Bailey and Michael Webb

Paul: Tribal are one of the solutions – pretty much off-the-shelf stuff, with various components – and we hope to roll it out to about 15 institutions in the first year. The second option will be the open solution, which is partly developed but needs further work. So the option will be to engage with either one of those solutions, or perhaps to engage with both.

The learning analytics processors will feed the staff dashboards, into a student consent service, and both of those will connect to the alert and intervention system. And there will be a Student App as well.

Michael: The idea is that all of the components are independent so you can buy one, or all of them, or the relevant parts of the service for you.

Paul: The student consent service is something we will develop in order to provide some sort of service to allow students to say what kinds of information can or cannot be shared (of available data from those systems that hold data on them). The alert and intervention system is an area that should grow quite a bit…

So, the main components are the learning records warehouse, the learning analytics processor – for procurement purposes the staff dashboard is part of that – and the student app. And once that learning records warehouse is there, you could build onto it, use your own system, use Tableau, etc.
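
Learning records warehouses of this kind commonly store xAPI-style activity statements. The talk did not specify the exact Jisc data profile, so the endpoint, identifiers and verb below are hypothetical; this is only meant to show the kind of record a VLE might push into such a warehouse.

```python
# Hypothetical xAPI-style statement; IDs, URLs and profile are assumptions.

import json

statement = {
    "actor": {"mbox": "mailto:student@example.ac.uk", "name": "A. Student"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
             "display": {"en-GB": "completed"}},
    "object": {"id": "https://vle.example.ac.uk/course/123/quiz/7",
               "definition": {"name": {"en-GB": "Week 3 quiz"}}},
    "timestamp": "2015-05-08T10:30:00Z",
}

print(json.dumps(statement, indent=2))  # would be POSTed to the warehouse API
```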

Just to talk about the Discovery Phase: we hope to start that quite soon. The invitation will come out through the Jisc Analytics email list – so if you want to be involved, join that list. We are also setting up a questionnaire to collect readiness information and for institutions to express interest. Then in the discovery process (June/July onward) institutions will select a preferred approach for the discovery phase. This will be open to around 20 institutions. We have three organisations involved here: Blackboard; a company called DTP Solution Path (as used by Nottingham Trent); and UniCom. For the pilot (September(ish) onward) institutions will select a solution preference (in Year 1, 15 proprietary (Tribal) and 15 open).

Niall: The code of practice is now a document of just over two pages covering complex legal and ethical issues. These can be blockers to moving forward… so this is an attempt to have an overview document to help institutions overcome those issues. We have a number of institutions who will be trialling this. It is at draft stage right now, and with the advisory group to suggest revisions. It is likely to be launched by Jisc in June. Any additional issues are being reflected in a related set of online guidance documents.

Effective Learning Analytics project can be found:

There is another network meeting on 24th June at Nottingham Trent University. At that meeting we are looking to fund some small research-type projects – there is an Ideascale page for that, with about five ideas in the mix at the moment. Do add ideas (between now and Christmas) and do vote on those. There will be pitches there, for ones to take forward. And if you want funding to go to you as a sole trader rather than to a large institution, that can also happen.


Q) Will the open solution be shared on something like GitHub so that people can join in?

A) Yes.

Comment – Michael: Earlier today people talked about data that is already available; that’s covered in the discovery phase, when people will be on site for a day or up to a week in some cases. Also earlier on there was talk about data tracking, IP addresses etc., and the student consent system we have included is to get student buy-in for that process, so that you are legally covered for what you do as well. And there is a lot of focus on flagging issues, and intervention. The intervention tool is a really important part of this process, as you’ll have seen from our diagram.

For more information on the project see:

Open Forum – input from participants, 15 min lightning talks.

Assessment and Learning Analytics – Prof Blazenka Divjak, University of Zagreb (currently visiting University of Edinburgh)

I have a background in work with a student body of 80,000 students, and use of learning analytics. And the main challenge I have found has been the management and cleansing of data. If you want to make decisions, learning analytics are not always suitable/in an appropriate state for this sort of use.

But I wanted to talk today about assessment. What underpins effective teaching? Well, this relates to the subject, the teaching methods, the way in which students develop and learn (Calderhead, 1996), and awareness of the relationship between teaching and learning. Assessment is part of understanding that.

So I will talk to two case studies across courses using the same blended approach with open source tools (Moodle and connected tools).

One of these examples is Discrete Mathematics with Graph Theory, a course on the Master of Informatics programme with around 120 students and 3 teachers. This uses authentic problem posing and problem solving. We have assessment criteria and weighted rubrics (using the AHP method). Here, learning analytics are used for identification of performance based on criteria. We also look at differences between groups (gender, previous study, etc.), and at the correlation of authentic problem solving with other elements of assessment – hugely important for future professional careers, but not always what happens.
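
As a hedged sketch of deriving rubric weights with the AHP method mentioned above: build a pairwise comparison matrix for the assessment criteria and approximate the priority vector by normalising columns and averaging rows. The criteria names and judgements are invented for illustration and are not from the Zagreb courses.

```python
# Sketch of AHP-style weight derivation from a pairwise comparison matrix.

import numpy as np

criteria = ["problem posing", "problem solving", "presentation"]
# A[i, j] = how much more important criterion i is judged to be than criterion j
A = np.array([
    [1.0, 1/2, 3.0],
    [2.0, 1.0, 4.0],
    [1/3, 1/4, 1.0],
])

col_normalised = A / A.sum(axis=0)
weights = col_normalised.mean(axis=1)   # approximates the principal eigenvector
print(dict(zip(criteria, weights.round(3))))
```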

The other programme, Project Management for the Master of Entrepreneurship programme, has 60 students and 4 teachers. In this case project teams work on authentic tasks. Assessment criteria + weighted rubrics – integrated feedback. The course uses self-assessment, peer-assessment, and teacher assessment. Here the learning analytics are being used to assess consistency, validity, reliability of peer-assessment. Metrics here can include the geometry of learning analytics perhaps.

Looking at a graphic analysis of one of these courses shows how students are performing against criteria – for instance they are better at solving problems than posing problems. Students can also benchmark themselves against the group, and compare how they are doing.

The impact of student dashboards – Paula Smith, UoE

I’m going to talk to you about an online surgery course – the theory not the practical side of surgery I might add. The MSc in Surgical Sciences has been running since 2007 and is the largest of the medical distance learning programmes.

The concept of learning analytics may be relatively new but we have been interested in student engagement and participation, and how that can be tracked and acknowledged for a long time as it is part of what motivates students to engage. So I am going to be talking about how we use learning analytics to make an intervention but also to talk about action analytics – to make changes as well as interventions.

The process before the project I will talk about had students being tracked via an MCQ system – students would see a progress bar but staff could see more details. At the end of every year we would gather that data, and present a comparative picture so that students could see how they were performing compared to peers.

Our programmes all use bespoke platforms and that meant we could work with the developers to design measures of student engagement – for example the number of posts. A crude way to motivate students. And that team also created activity patterns so we could understand the busier times – and it is a 24/7 programme. All of our students work full time in surgical teams, so this course is an add-on to that. We never felt a need to make this view available to students… this is a measure of activity, but how does that relate to learning? We need more tangible metrics.

So, in March last year I started a one day a week secondment for a while with Wilma Alexander and Mark Wetton at IS. That secondment has the objectives of creating a student “dashboard” which would allow students to monitor their progress in relation to peers; to use the dashboard to identify at-risk students for early interventions; and then evaluate what (if any) impact that intervention had.

So, we did see a correlation between in-course assessment and examination marks. The exam is 75-80% of the mark (was 80, now 75) in the first year. It is a heavily weighted component. You can do well in the exam, and get a distinction, with no in-course work during the year. The in-course work is not compulsory, but we want students to see the advantage of in-course assessments. For the predictive modelling, regression analysis revealed that only two components had any bearing on end-of-year marks: discussion board ratings and exam performance (year 1), or exam performance alone (year 2). So, with that in mind, we moved away from predictive models and decided to build a dashboard for students, presenting a snapshot of their progress against others’. And we wanted this to be simple to understand…

So, here it is… we are using Tableau to generate this. The individual student can see their own performance in yellow/orange and compare it to the wider group (blue). The average is used to give a marker… If the average is good (in this example an essay has an average mark of 65%) that’s fine; if the average is poor (the discussion boards, which are low weighted, have an average of under 40%, which is a fail at MSc level) that may be more problematic. So that data is provided with caveats.
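
A minimal sketch of that snapshot idea: plot one student’s marks against the cohort average per assessed component. The numbers, component names and matplotlib rendering are assumptions; the actual dashboard was built in Tableau from the programme’s own data.

```python
# Sketch of a student-vs-cohort comparison chart; data is invented.

import matplotlib.pyplot as plt
import numpy as np

components = ["Essay", "Discussion board", "MCQs"]
cohort_avg = [65, 38, 72]     # e.g. the discussion board average sits below a pass
student = [70, 45, 68]

x = np.arange(len(components))
plt.bar(x - 0.2, cohort_avg, width=0.4, label="Cohort average")
plt.bar(x + 0.2, student, width=0.4, label="You")
plt.xticks(x, components)
plt.ylabel("Mark (%)")
plt.legend()
plt.show()
```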

Paula Smith shows visualisations created using Tableau

This interface has been released – although my intervention is just an email which points to the dashboard and comments on performance. We have started evaluating it: the majority think it is helpful (either somewhat, or a lot). But worryingly a few have commented “no, unhelpful”, and we don’t know the reasons for that. But we have had positive comments on the whole. We asked about extra material for one part of the course. And we asked students how the data makes them feel… although the majority answered ‘interested’, ‘encouraged’, and ‘motivated’, one commented that they were apathetic about it – actually we only had a 15% response rate for this survey which suggests that apathy is widely felt.

Most students felt the dashboard provided feedback, which was useful. And the majority of students felt they would use the dashboard – mainly monthly or thereabouts.

I will be looking further at the data on student achievement and evaluating it over this summer, and should be written up at the end of the year. But I wanted to close with a quote from Li Yuan, at CETIS: “data, by itself, does not mean anything and it depends on human interpretation and intervention“.

Learning Analytics – Daley Davis, Altis Consulting (London) 

We are a consulting company, well established in Australia, so I thought it would be relevant to talk about what we do there on learning analytics. Australia is ahead on learning analytics, and that may well be because they brought in changes to funding and fees in 2006, so they view students differently. They are particularly focused on retention. I will talk about work we did with UNE (University of New England), a university with mainly online students and around 20,000 students in total. They wanted to reduce student attrition. So we worked with them to set up a student early alert system for identifying students at risk of disengaging. It used triggers of student interaction as predictors. And this work cut attrition from 18% to 12%, saving time and money for the organisation.

The way this worked was that there was an automated “wellness” engine for students, with data aggregated at school and head-of-school levels. And what happened was that staff were ringing students every day – finding out about problems with internet connections, issues at home, etc. Some of these were easily fixed or understood.

The system picked up data from their student record system, their student portal, and they also have a system called “e-motion” which asks students to indicate how they are feeling every day – four ratings and also a free text box (that we also mined).

Data was mined with weightings: a student having previously failed a course, and a student being very unhappy, were both weighted much more heavily, as was a student not engaging for 40 days or more (other levels of disengagement were weighted more lightly).
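
Here is a sketch of that kind of weighted trigger: previous failure and a very unhappy “e-motion” rating count heavily, and long disengagement counts more than shorter gaps. The weights and thresholds are invented, not UNE’s actual values (which were shown on the slide).

```python
# Sketch of weighted early-alert triggers; weights and thresholds are assumptions.

def alert_score(prev_fail: bool, mood: int, days_inactive: int) -> float:
    """mood: 1 (very unhappy) to 4 (happy), as in a four-point daily rating."""
    score = 0.0
    if prev_fail:
        score += 5.0              # previous course failure weighted heavily
    if mood == 1:
        score += 5.0              # very unhappy rating weighted heavily
    if days_inactive >= 40:
        score += 4.0
    elif days_inactive >= 14:
        score += 1.0              # shorter gaps weighted more lightly
    return score

print(alert_score(prev_fail=True, mood=1, days_inactive=42))  # 14.0
```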

Daley Davis shows the weightings used in a Student Early Alert System at UNE

Universities are looking at what they already have, and coming up with a technical roadmap. But they need to start with the questions they want to answer… What do your students want? What are your KPIs? And how can you measure those KPIs? So, if you are embarking on this process I would start with a plan for 3 years, toward your ideal situation, so you can then make your 1-year or shorter-term plans in the direction of making that happen…

Niall: What I want you to do just now is to discuss the burning issues… and come up with a top three…

And [after coffee and KitKats] we are back to share our burning issues from all groups…

Group 1:

  • Making sure we start with questions first – don’t start with framework
  • Data protection and when you should seek consent
  • When to intervene – triage

Group 2:

  • How to decide what questions to ask – and what questions and data are important anyway?
  • Implementing analytics – institutional versus course level analytics? Both have strengths, both have risks/issues
  • And what metrics do you use, what are reliable…

Group 3:

  • Institutional readiness for making use of data
  • Staff readiness for making use of data
  • Making meaning from analytics… and how do we support and improve learning without always working on the basis of a deficit model.

Group 4:

  • Different issues for different cohorts – humanities versus medics in terms of aspirations and what they consider appropriate, e.g. for peer reviews. And undergrads/younger students versus say online distance postgrads in their careers already
  • Social media – ethics of using Facebook etc. in learning analytics, and issue of other discussions beyond institution
  • We can’t avoid interpreting data just because there’s an issue we don’t want to deal with.

Group 5:

  • Using learning analytics at either end of the lifecycle
  • Ethics is a big problem – institutions might use analytics to recruit likely-successful people, or to stream students/incentivise them into certain courses (both already happening in the US)
  • Lack of sponsorship from senior management
  • Essex found through last three student surveys that students do want analytics.

That issue of recruitment is a real ethical issue. This is something that arises in the Open University as they have an open access policy so to deny entrance because of likely drop out or likely performance would be an issue there… How did you resolve that?

Kevin, OU) We haven’t exactly cracked it. We are mainly using learning analytics to channel students into the right path for them – which may be about helping select the first courses to take, or whether to start with one of our open courses on Future Learn, etc.

Niall: Most universities already have entrance qualifications… A-Level or Higher or whatever… ethically how does that work?

Kevin, OU) I understand that a lot of learning analytics is being applied in UCAS processes… they can assess the markers of success etc..

Comment, Wilma) I think the thing about learning analytics is that predictive models can’t ethically be applied to an individual…

Comment, Avril) But then there is also quite a lot of evidence that entry grades don’t necessarily predict performance.

Conclusions from the day and timetable for future events – Niall Sclater

Our next meeting will be in June in Nottingham and I hope we’ll see you then. We’ll have a speaker, via Skype, who works on learning analytics for Blackboard.

And with that, we are done with a really interesting day.

Apr 01 2011

Today I will be liveblogging the eLearning@Ed 2011 conference which is taking place at the National eScience Centre at Edinburgh University today. The usual rules of liveblogging apply of course – posts will be updated through the day and there will be typos, errors, etc. that I will be correcting as I spot them or when I clean up this blog post at the end of the day.

Welcome by Tim O’Shea, Vice-Principal of UoE

This is the 8th eLearning@Ed conference. He emphasised that Informatics is one of the top 5 departments in the world, a leader in ISG etc. All 3 colleges and ISG are represented here today. We are a really huge university. We could have just had elearning in Moray House or the Vet School, but a fundamental of the approach that Jeff Hayward and I have taken is to make sure there isn’t just one unit where all elearning is based, but a community across the university. The university’s mission is to ensure we have valuable, non-trivial use of elearning across the university. We are doing something new here; we are creating new spaces for students to learn in. At events like this we can think about taking best advantage of the unique aspects of elearning. It is not about economics but about the student experience.
