May 13 2015

Today I am attending Holyrood Connect’s Learning Through Technology event in Glasgow. This is Day Two of the event and I plan to liveblog talks etc. that I attend today.

Welcome and introduction by the Chair – Mark Stephen, Journalist and Broadcaster

Session 1: Planning and leading the digitisation of learning and teaching

University Digital Education Comes of Age – Professor Sir Timothy O’Shea, Principal & Vice-Chancellor, University of Edinburgh

I want to start with an iconic image for us at the University of Edinburgh – an image from the Masters that we offer in Digital Education, and this is a student graduating. It is an online masters, in how to teach online. The students who graduate from that programme can either come along in person to McEwan Hall, or they can graduate virtually in real time – graduating electronically. Last year in the graduation season something very interesting happened – a student graduated in person with his iPad so that he graduated in person and electronically… So those online could see him graduate twice. If you have a serious interest in this area do look at our Online Masters in Digital Education or the MOOC that derives from it…

It is always good to remind ourselves of the history here. Computers really came about in the 1940s as part of code breaking. Vannevar Bush wrote the essay “As We May Think”, which is really the first essay to pose how we might use computing. We see Crowder’s branching theory in the 1950s (which still powers modern tools like Scholar), and Pask’s Conversation Theory work in the 1950s. Then in the 1960s Smallwood wrote the first self-improving teaching programs; Papert looked at self-expression, and the visual language Scratch very much came out of that – and is very much going strong, in fact we have a MOOC on Scratch at Edinburgh University, and worked on the first Spanish version of that MOOC; and Alan Kay came up with the idea of the Dynabook – effectively the netbook/tablet idea – at Xerox PARC. Then in the 1970s Kimball and I worked on computer based learning and the Open University came up with CAL. The 1980s saw home computing coming into the Open University; the 1990s brought collaborative learning and indeed mobile and “speckled computing” – wearables, internet of things type technologies. Open Educational Resources came about in 2000, and indeed MIT used OER to make courses freely available… that didn’t seem to go anywhere, but in 2012 those resources became MOOCs and that really has changed things. I would also point out that, if you have an interest in educational computing, go to Uruguay. For a long time Nicholas Negroponte tried the One Laptop Per Child programme in various places, but in Uruguay it really took off (see Plan Ceibal) – and that’s part of why the University of Edinburgh is working with Scratch and MOOCs in Spanish. And recently Arizona State University has announced a discount on the first year of conventional undergraduate degrees for those completing their MOOCs…

So… We are seeing a move from Blackboard/Learn etc. to those sorts of systems sitting alongside other software, including search, social networks, blogs, video content – a rich world of content that the university does not necessarily build/support but which benefits and sits alongside central University resources and tools. There is no single technology platform anymore.

At Edinburgh our MOOCs cover a range of topics – from Andy Warhol – collaborating with the National Galleries – to chickens! Our most popular course has been philosophy – leading to new masters programmes, books, all sorts of things. And we see many pre-entry students taking that MOOC to find out what philosophy is all about.

We have 24 MOOCs built, 7 under construction, 12 under consideration; 4 platforms (mostly Coursera and FutureLearn); over 1.7m enrolments. And we had the first ever real time MOOC last year, on the Scottish Referendum – it changed every day in response to the polls and developments. So, why do we do that? Well it’s about reputation – we are early adopters of educational technology. MOOCs allow us to explore a new pedagogical space to inform practice. And we wish to reach as widely as we can with our courses. We also run 64 online masters programmes so it is not unhelpful that some of our MOOCs give some taste of those areas of teaching.

Our MOOC students particularly come from the US and UK, with China very much underrepresented. Lots of age ranges – including some very motivated under-18 year olds. Few are motivated by certificates. And in terms of prior academic study we have a highly educated population – these are Edinburgh figures but this is seen across the board in MOOCs – many learners in these spaces have a degree (or several) already.

There are some real competing models of MOOCs… The xMOOC and the cMOOC model. Our #edcmooc kind of breaks these models – with open platforms and collaboration on cMOOC model, but also xMOOC characteristics. Of course MOOCs offer some possibilities for scaling… One thing you really can’t scale is one to one interaction, although you do see a lot of peer learning in MOOCs. And we are also experimenting with automated teaching in these spaces [see my notes on Sian Bayne’s talk].

So, where is the University of Edinburgh going? Well we have more and more online masters… Perhaps our most surprising is an award by the Queen to run an advanced surgery course as an online masters. This is a massively successful course but to take it you need to be a practising surgeon, you need to be based at a surgical unit, and you need to attend a two week assessment in Edinburgh – but we see online masters takers getting better results than some of those taking similar courses on campus.

So what does all this mean for our mainstream business? Well it is not one or the other for us… on campus and online is hybrid, it’s about what percentage is on campus, what percentage online – which may be courses or resources. Right now we expect to have, by about 2020, about 40,000 students, all with at least one fully online course, we see open studies extended (and expect around 17,000 learners enrolled), and 10,000 fully online/remote students, 100,000s of MOOC learners and 100s of OERs. When we look at that fully online percentage of students by the way, we expect to surpass that estimate I think.

I want to quickly thank some key folk around University of Edinburgh including Jeff Hayward, Sian Bayne, Amy Woodgate, etc. all of whom have been hugely influential in our online learning work.

So, my conclusions? Well, elearning is not new; elearning is now mature. Hybrid will be the new normal. Leading university brands dominate. Better to borrow than to do badly – don’t build your own platform for the sake of it. Learning at scale is real – a successful MOOC is 100,000-200,000 enrolments, with maybe 30k completing those courses. And the biggest contribution of MOOCs for us has been access – reaching out to schools we never would have been able to reach, for philosophy courses (for instance), coming to us for that. And reaching new communities.

And, with that Tim O’Shea is done and, pausing only for an excellent unsavoury equine nutrition joke from our chair, we are moving onto Paul Saunders… 

The changing role of IT leaders – Paul Saunders, Chief Technology Officer and Director of Information Technology, University of Dundee

Any of you who have been to Dundee lately will know that it is undergoing huge change. Back in the 1980s Dundee was quite depressed but now the city is thriving, becoming one of the best cities in the UK. [and here we have a nice quote from Stephen Fry about the perfection of Dundee]. And the University of Dundee is also undergoing change, transforming from a College to School based system, we aim to be the best University in Scotland – and we have tough competition – and want to take this opportunity to transform ourselves and how we support our users.

We are quite a small university but even we have silos, so over the last few years we have been trying to join up what we do. This is not the same as centralisation, it’s about us all working together to deliver on our transformation agenda. We want to have a fundamentally different approach to the way we deliver services, conduct our business and function as a University. But universities don’t like change – I’ve only been working in the sector a few years but I’ve learned that! I used to work at Yahoo! when it was the market leader, before Google’s IPO, and I would say that in terms of change, education shares characteristics with many industries: change can be hard.

In terms of IT, we need to work out what we provide, what we support. That doesn’t mean other things will not be used, it means that we focus on what we directly provide. Dr Eddie Obeng said in a recent TED talk that “we spend our time responding rationally to a world that we understand, and recognise, but which no longer exists”. That applies to Dundee as a city I think, and to IT as a sector.

I worked in a group with Jisc and Educause to look at the changing role of an IT leader. What defines the skills and abilities to be an IT leader – where are the gaps? We also looked at what skills and abilities would be needed in the future (5 years ish). We worked together on a paper which is now available from Jisc and Educause.

We came up with the idea of an IT wheel as a model for IT leadership. We thought it was essential that you, as a human, were part of this. So, at the core of this model is a strategist… It is surrounded with Information Technology, but at Jisc Digifest we had some debate about whether that is an essential set of skills (my own background is in IT, but before that in performance art!). Surrounding the strategist there are roles and skills as Trusted Advisor, as Visionary, and as Relationship Builder. You need to have that vision, but you also have to deliver on that, otherwise you will have no credibility. There are too many competing products/solutions/providers for IT services to not deliver to expectations. In the outer ring of our model we have Change Driver; Promoter/Persuader; Master Communicator – not always a set of skills we, as IT professionals, have; Team Builder – we really have to be great team builders, you have to engage people and you have to make sure your people want to do what you want to achieve; Ambassador – IT does not have a positive image in many spheres…; Coach – you have to mentor people, to nurture your successes.

So how do you use this model? It’s freely available online for CPD, for coaching, and useful for spotting talent – it’s much easier to build technical expertise than to develop some of those skills. You need to really take advantage of and encourage areas of strength – encourage people to follow what they are passionate about. And the model can also be used in job descriptions for HERA profiles, along with SFIA from the BCS, so we can find the right people for the roles.

So take a look at the report! Thank you.

Analytics – creating a student’s “digital ecosystem” – Terry Trundley, Head of IT, Edinburgh College

I’m new to the education sector but I am experienced at working with computers in companies that use customer data in ways that we don’t yet do in education – we don’t exploit these tools like we should. Back in the 90s I worked with a mobile phone company and we were working with leading edge technologies – a CRM (Customer Relationship Management) system, IVR (Interactive Voice Response), analytical data etc. in 1996. Those are all still around, alongside social analytics, etc. And then we have all the data you have in your institution from your learning systems, from Google Analytics, etc. So we do that from first approach by a student, when we add them to the CRM, and can work with and track them through to alumni stage…

What do airlines and colleges have in common? Bums on seats! You need a lot of people for this to work. So, when I joined Edinburgh College two years ago that was very much the challenge… I spoke to the development team… experts from outside had suggested the website was the issue… blamed IT… But then they hadn’t had a spec, and they hadn’t been given a lot of the content needed. And behind the scenes our call management and enquiry processes weren’t working well – again IT was blamed. But I pointed out that course content could not come from IT, so we asked colleagues for that content… And we also then used Google Analytics to point out where the problems were… This showed that students came into the website, but when they looked for information they were getting stumped. Having gained trust by showing those analytics, and reviewing those processes, where we are now is a completely different situation. Part of the model we are using is that, say, for hairdressing (one of our most popular courses) we can look at job vacancies, previous graduates who have gone into those jobs, how many are studying – we can actually ensure that our courses fit into a supply and demand model.

And now over to my colleague Gavin, who will give a live demo of the system we are using.

Gavin: We were running courses without looking across the portfolio for uptake. We used an airline type model to understand our courses, and likely uptake, before we even ran the courses. We had enterprise applications data… We could see unique applicants for the number of places, we could break it down into courses, and use analytics of views and applications to those courses to create live conversion rates. And we created some gamification to allow the product managers to aim to be working on leading courses. We could also monitor uptake – with traffic lighting of red (low uptake), amber (reasonable uptake), green (full or oversubscribed).
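[Note from me: a minimal sketch, in Python, of how live conversion rates and uptake traffic lighting like this might be computed. The course data, field names and thresholds are invented for illustration – they are not Edinburgh College’s actual figures or rules.]

```python
# Hypothetical course data: (course name, website views, applications, places).
courses = [
    ("Hairdressing NC", 5200, 310, 120),
    ("Renewable Energy HNC", 1400, 45, 60),
    ("Oil & Gas Operations", 900, 12, 40),
]

def traffic_light(applications: int, places: int) -> str:
    """Red = low uptake, amber = reasonable uptake, green = full/oversubscribed."""
    uptake = applications / places
    if uptake >= 1.0:
        return "green"
    if uptake >= 0.5:          # illustrative threshold, not the College's rule
        return "amber"
    return "red"

for name, views, apps, places in courses:
    conversion = apps / views  # live conversion rate: page views -> applications
    print(f"{name}: {conversion:.1%} conversion, {traffic_light(apps, places)}")
```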

We can also look across our applicants and compare with SIMD (Scottish Index of Multiple Deprivation) to understand how we allocate our places to meet our targets. We plot our applications across the board, and across the UK. And if we look at a map of Edinburgh we can see what percentage of our students come from areas ranked high for SIMD so we can target and shape applications accordingly.
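[Note from me: continuing the sketch above, the SIMD comparison might look something like this – the postcode-to-quintile lookup and the target figure are entirely made up for illustration; real SIMD data comes from the Scottish Government’s published index.]

```python
# Invented lookup from postcode to SIMD quintile (1 = most deprived).
simd_quintile = {"EH6 4AA": 1, "EH10 4BD": 5, "EH16 4DR": 1, "EH4 2AB": 3}

applicant_postcodes = ["EH6 4AA", "EH6 4AA", "EH10 4BD", "EH16 4DR", "EH4 2AB"]

# Share of applicants from the most deprived quintile, against an assumed target.
from_most_deprived = sum(simd_quintile[pc] == 1 for pc in applicant_postcodes)
share = from_most_deprived / len(applicant_postcodes)
target = 0.20  # hypothetical widening access target

print(f"{share:.0%} of applicants from SIMD quintile 1 (target {target:.0%})")
```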

Terry: We are really just starting out with this, if anyone else is interested or working in this area we’d welcome your comments or feedback.

Questions and discussion

Q) Can I ask Terry two questions: Do we need to employ people with a degree in common sense? And how do we turn those models into applications?

A – Terry) That’s about working with marketing and with the communities. But Gavin showed you applications… But to increase those you have to get out there with marketing, to schools, campaigning, lobbying… We don’t have an electronic way to do that at present. And we have a CRM so if students don’t get onto one course, or haven’t applied but have made enquiries, we can go back to those students and engage them.

A – Gavin) And you can target places to those in high SIMD areas.

Q again) We find it hard to move students from one campus to another too…

A – Terry) When we mapped applications we did see students didn’t always apply to their nearest campus, in fact applying from all over the place.

A – Tim) When I worked at Birkbeck, a part time college, we mapped the public transport links to our institution and particularly noted that we had four key Northern Line Stations where we had a lot of students already, and Euston station… And that led to us advertising on those routes, in those stations as they aligned with suitable commuter routes to the institution. Doing analytics on learner data is a big big plus.

Q – Mark, Chair) Going back to your use of Google Analytics to identify the problem, I’m astonished you needed that. Why did it take that to demonstrate the issue?

A – Terry) Well we were in a merger situation which is quite difficult. The website had built up over time, through the marketing team… But we had changed a lot of courses etc. and we needed a new process. It was the breakdown of the process, and where that occurred, that particularly needed highlighting.

Q – Mark) How do you predict and project the performance of courses?

A – Gavin) We use historical data as an indicator – we might exclude outlier data there. We are also starting to use market forces too – so if there’s a downturn in the oil industry we’ll see a drop there, but a rise in uptake for renewables.

A – Tim) You also have to use demographic data – the numbers of school leavers etc. – and that can really change a lot. It’s amazing how few institutions use that data on how many school leavers there will be, how likely they are to want to go to university or college… it helps you raise or lower projected numbers.

Q – Mark) And how does that work for new course decisions?

A – Gavin) You can project likely uptake, or whether or not a course will meet required targets. And not run courses that will not.

A – Tim) MOOCs are incredibly good for marketing, the interest from MOOCs can show interest and help locate demand for online masters, for evening courses, for degree programmes. Asking people hypothetical questions about courses they might apply for, that’s no use. Taster courses of different types (online and offline) are a good way to test market demand.

[Note from me as a graduate of the MSc in Digital Education (then the MSc in eLearning), and as a tutor on several online programmes: I think one of the reasons why online learners do perform well is because they are part-time learners with professional contexts and responsibilities, and often family responsibilities as well. To fit studies around other commitments, and to find and justify the use of time (and cost) of studying, these students tend to be very highly motivated and engaged. I think that is as much about the part time nature of courses as it is about them being delivered online. This is something I believe the Open University also sees when it comes to the success of its part time learners – online and offline/hybrid.]

After some particularly tasty biscuits we are back for workshop sessions…

Session 2: Innovative teaching and learning in colleges and universities

Workshop session 1: Virtual Classroom: Observe the Student Experience in a Virtual Classroom Environment – Tracy Matheson, West Highland College

This session is a walk through of how Blackboard Collaborate works in practice, exploring the roles available for those participating, use of screen sharing, the ways in which students can interact with the content, etc. I won’t blog this in detail as I suspect many reading this will be used to seeing and engaging in Blackboard Collaborate sessions. I do, however, really like that those leading the session are split between those in the room, and a colleague dialling in from the College’s main campus in Fort William. That does give a real sense of being a student in this type of virtual classroom space (including some of the challenges associated with these spaces, and the internet connections they rely upon).

Workshop session 2: Building Your Online Professional Learning Network – Jaye Richards Hill, Managing Director, Tablet Academy Africa

Jaye has begun by taking us through the idea of networks as tube maps – and the power of those interconnections:

Networks have changed the way that we work, the way that we learn. We keep in touch with our colleagues, no matter where they are, through various online networks – Yammer, Twitter, direct messages… Much less so email for me now. And I do work like that, as part of a network. Networks enable me to listen to the buzz and rumble of what is going on, and allow me to tap into expertise in the subjects and areas I am working on. And if I listen, I can pick up so much about what is going on. And it changes the way that you do things, allows you to adapt and to grow as a professional. This is one of the reasons I love the idea of a personal learning network. I gave a presentation with Olly Bray in 2008 on personal learning networks, and that has always been a real favourite of mine because I work like this…

Our work these days is not linear, it’s disorganised, self-directed learning. Wikipedia isn’t something you can read without clicking links – you learn things you didn’t expect to. It’s haphazard learning, but your network is like that, and you find out great stuff… For me it all came into play in my probation year in teaching, which happened to be in Tenerife. I had to come back to the UK after that, in 2005, and I’d just gotten into computers and become a member of the Times Education Supplement Connect discussion boards – a brilliant way to follow what was going on in Scottish education. I found out about a job in Glasgow through that networking space, then as that contract was due to end, I found another opportunity, again through that space and through following up with contacts. That was the beginning of my networking. This is a very personal journey for me. Networking got me a job, which at the time was really important for where I was at.

Because I was seen as a bit of a computer person, because I put all my S3 biology teaching materials in PowerPoint, I got involved when Glow started off and started blogging about it, writing about what I was doing with Glow. At a conference I was astounded to find out that the LTS team were reading the blog and wanted me to present on it. They were commenting and following those links, and commenting on each other’s blogs enabled me to build up a network – serendipitous spreading… Then one day a contact suggested that we move that conversation to Twitter, and that was a game changer for me. It still is a game changer for me. I have a work Twitter, a private Twitter, a Twitter for South Africa where I live. It’s still my go-to professional learning resource. For me, I stay in contact with colleagues by DM – quicker responses too.

Then Tess Watson nudged me onto Facebook. I’m not sure about the value for professional learning, but it is useful for personal learning, and there is a bit of an overlap there… But I tend to keep Facebook more personal… I’ll stay in touch with grandchildren there for instance. But there is a joining of personal and professional. And we have Facebook pages for our companies, wherever they are in the world, so there is a connection there.

LinkedIn is a real professional space for me. I pay for LinkedIn professional now, and find I write more for LinkedIn Pulse than for my own blog. It’s a great way to stay in touch with contacts, with other corporations, to find new opportunities. It’s good for business and extends my network out there. And it’s particularly useful if you join groups – so many resources and so much writing to explore. But many struggle to use it professionally. It tends to be the private sector who use it more… Does it have the mileage for public sector education? It’s a choice I guess… Although these are professional networks, they are private too.

Andrew Brown got me onto Slideshare, and I find it a great resource for finding information really quite quickly. People post great presentations, many are willing to share them for downloading and reuse. And I post my work there, and I get comments, again find new connections… So I have this big network for really good quality professional learning.

The last time I gave this presentation was in India and the idea of a network with many options – that works with the Delhi metro too… That idea of having so many more options through many connected networks.

So, where am I now? Well things can get pulled very quickly. Things that are free can go… Twitter seems to have legs… Hopefully it won’t change too much because it works and works really well. But others come and go, so you have to be judicious in what you do.

Yammer is now part of Office 365 – huge potential for education. I’m not sure about plans for Glow but I’d like to see Yammer in schools some time soon as it’s safe and secure to your network. It’s safe for you to communicate with students as staff, there are records of what you discuss, you can attach photos, links, etc. And it’s now built into collaborative documents in Office365 online. And when learner management comes into Office 365 that will also help Yammer. And Sway, when that comes into Office365, will also have Yammer built in.

And there are other tools too. Skype is really useful – and I get it in Office365 too – but I’m not sure how that space would work for making new connections. And Lync, which is now Skype for Business, is also a great tool for professional networking.

The future of learning will be crowdsourced, as Andrew Brown has suggested. And for me, my network allows me to find the experts in the crowd, to make connections with people, to look for different points of view, to gather personal and social information. And I can create content, ask questions, evaluate information, devise solutions.

Comment) You need to discover what is coming next… When Twitter came out people were wondering what the potential of it would be… We didn’t see its potential as a community… But it’s hard to know… We’ve abandoned things that have been hot at some point. A lot of my learning is done via a sidebar on YouTube… the related content…

A) That’s the haphazard nature of self-organised learning… Some really interesting content can be the stuff that you don’t expect. And search engines, and tools like Delve, are getting better at predicting what you will find interesting, what you may use. That predictive element is becoming more important. Google work on that, both for delivering adverts and with content. And in Office365 Delve is going more that way too – I’ve just written a guide to using Delve in education. Are there plans for Delve to be in Glow in the future? [no comments from the crowd]

Lunch, exhibition and networking…

Session 3: Using technology to improve learning, teaching and student support

Exploring the use of data to support student engagement: learning analytics at the University of Edinburgh – Wilma Alexander, Educational Design and Engagement Team, Information Services, University of Edinburgh

I’m starting from a slightly different place to our analytics colleagues this morning, who were looking more at marketing and recruitment. What I’d like to talk about this afternoon is learning analytics. And in fact I’ll be talking about quite a bounded project to look at how we can look at student learning analytics, to inform and support their learning. This isn’t a new idea, it’s at least ten years that the analysis of data has been taking place, but learning analytics is something else…

There is now a Society of Learning Analytics Research and they have a clear definition of learning analytics.

To give you a bit of background about the University of Edinburgh: We are a huge university, with a huge range of types of study that students undertake. And more recently there is the whole digital profile that you heard about from Tim O’Shea this morning – work into online programmes, MOOCs, and increasingly online support for on campus undergraduates are part of that too. Recruitment isn’t as much the focus, generally we don’t have too much difficulty attracting students but that may be an area that is quite different from other organisations, in terms of motivations and focus of this work.

Getting started with learning analytics, I feel, has been a bit like trying to build a plane whilst it’s already flying. We started off very excited by the data, and what we thought we could do with it. We have two VLEs at Edinburgh: Blackboard Learn is our main supplier, the centrally supported VLE for on campus students, and for some online distance courses as well; but we also have Moodle, an open source tool used in some of our online distance courses. And when it came to looking for data we had one vendor quite unresponsive, or slow, to requests, whereas our open source community around Moodle can be really quite responsive and creative.

There are already some examples of data analytics in use. Purdue University use a traffic light system to flag students who could be in trouble – a way to show students and staff where intervention may be needed. We looked across these types of examples, but also looked at what would be possible with tools already at our disposal in Blackboard Learn and also in Moodle – and in research already taking place in the University. For instance my colleague Paula Smith has been doing some work with the online surgical skills course that Tim O’Shea mentioned earlier. Here they looked at individual performance against the cohort – and this makes sense in a highly competitive cohort in a hugely competitive field – motivating them to improve performance, based on the key structural elements of that course.
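[Note from me: a small Python sketch of the kind of individual-versus-cohort comparison described here – not the actual course dashboard, and the marks are invented. The idea is simply that each student sees where they sit relative to the rest of the cohort.]

```python
def percentile_rank(score: float, cohort: list[float]) -> float:
    """Fraction of the cohort scoring at or below this score."""
    return sum(s <= score for s in cohort) / len(cohort)

cohort_scores = [55.0, 62.0, 70.0, 71.0, 74.0, 78.0, 81.0, 85.0, 90.0, 93.0]
my_score = 81.0

mean = sum(cohort_scores) / len(cohort_scores)
print(f"Cohort mean: {mean:.1f}")
print(f"A score of {my_score} sits at the {percentile_rank(my_score, cohort_scores):.0%} mark of the cohort")
```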

We also decided to look at what staff and students might like, what they thought they might want to get out of this data. I’m somewhat avoiding using the term analytics here as I think without analysis and context what you have is data. So we explored this potential use of data through user stories – we collected 92 stories from 18 staff and 32 students. The first interesting finding was how many of the “I want to…” stories could already be done – without developing anything – we just had to show users how to access that information, and to improve our documentation for the VLEs.

When it came to what people wanted to do with the data, we found staff had given some thought to what they wanted, but that was information like activity data – the use of materials etc. The idea that activity is a useful metric of engagement is not necessarily the case in all contexts – some students can log in once and gather all the materials, which will appear very different from someone downloading week by week, but does not necessarily indicate lower or different engagement.

So, we are now at the build stage but we proposed that we give students a view of their activity – a click count for any given day for instance. And also a way to view their marks against others in their cohort. We surveyed students on these proposals – 32% felt that the activity information might be useful, whilst 97% thought the grade information would be useful. Meanwhile our steering group had some concerns about the potential gamification of the system… The students seemed less concerned about that. And when we asked students about changing learning behaviour because of the data, most said no. We also asked what information students would find useful… And here we had some wonderful thoughtful responses.
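[Note from me: a minimal sketch of the proposed student-facing activity view – a click count for any given day, aggregated from raw VLE log events. The log format is invented; real Learn/Moodle logs look different.]

```python
from collections import Counter
from datetime import date

# Stand-in for raw VLE log events: (student_id, date of click).
log_events = [
    ("s1", date(2015, 5, 11)), ("s1", date(2015, 5, 11)),
    ("s1", date(2015, 5, 12)), ("s1", date(2015, 5, 14)),
    ("s2", date(2015, 5, 11)),
]

def clicks_per_day(events, student_id):
    """Count one student's clicks for each day they were active."""
    return Counter(day for sid, day in events if sid == student_id)

for day, clicks in sorted(clicks_per_day(log_events, "s1").items()):
    print(f"{day}: {clicks} clicks")
```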

When we look at student disinterest in this, we have to be aware of the context of how the on campus courses make use of the VLEs – few use discussions, social functions, most are just sharing resources. So activity data may reflect in part the way that the course is being used.

So, all of this information has led us to a slightly different place than we expected to be… The outcomes here are that:

  • Context is all – this VLE is used in thousands of courses, in many different ways. Part of this is putting course organisers in charge of whether these analytics are switched on, and how that is done
  • Must work for individuals and course-level  – it must be meaningful and contextualised for individuals on the course.
  • Building blocks and plug-ins
  • Mapping our territory – we’ve used the process as a way to map out where we want to go, and that also means understanding where we choose to focus so as to work around legal and ethical needs, bounding ourselves so as not to raise some of those issues (e.g. not linking up to library and student records). That is less complex ethically, and in terms of security and privacy – issues that must be tackled very much head on. And another positive outcome of this project has been…
  • Staff awareness – this has increased, and strategy and policy for the institution as a whole are being looked at right now.
  • Student awareness – also raised in this process.

We are in this brave new world, with such potential, but we have to continue to be led by the pedagogy in this process. And we want this to be a really positive process, with students seeing their own data as a positive part of their learning. And over the next year we will be focusing more on this, and how we can support students with learning analytics.

Digital technology for students with additional support needs – Craig Mill, Assistive Technology Advisor, CALL Scotland and Edinburgh Napier University

I’ll be talking about support for older learners. Edinburgh Napier University has students from diverse backgrounds, and we do a lot of work on widening access, and students with additional support needs (ASN). Thinking back about 15 years, the support for students would be through the “Disabled Computer” – which was labelled like that, attached to special kit… and no-one used it despite it being really great stuff. Then we had a student hub – but going there did mark you out as having, say, dyslexia, and our students really want to be like everyone else… And now we have a real shift away from that specialist technology idea, towards using everyday technology. So iPads, for instance, come with life-changing programmes built in, great for dyslexic students and visually impaired students. Chromebooks offer great opportunities. There are super everyday tools that empower students.

At Edinburgh Napier we have a range of provision. Students can be assessed and receive DSA funding/support – there is talk of students having to pay £200 towards this themselves, so it will be interesting to see if the incidence of dyslexia goes up or down as a result. We provide resources including laptop loans, VPN, etc. Bring your own device, cloud apps, Office365 etc. are also provided.

Over the last few years we saw a huge growth in the number of students requiring support for dyslexia, but we are seeing that level off and I think that may partly be about bring your own device – students are more able to manage for themselves. Having Chrome Apps available can, for instance, make a big difference. Chrome extensions can also be very helpful – and most of these are free – because you can use those extensions to help you manage web based resources (Wikipedia, VLEs, etc.) and view them in “Easy Reader” for a simpler format. And you can also use text to speech on that text. All there and free to use – students love this!

But there is more we can do. You can use a free and open source software tool called “MyStudyBar”, which lets you highlight parts of the text, customise the interface, etc. to meet students’ needs. And MyStudyBar also includes a mind mapping tool that enables you to put down ideas in that format, then convert them into a Word document to start planning your text.

That’s just a snapshot of the technologies that we use. We use tools like TextHelp and ClaroRead but I think that actually they don’t always do students justice. Some do need that specialist hardware and software, but for many students these widely available tools are hugely helpful.

Questions and discussion

Q) Do you think we should be blurring boundaries between assistive technologies and useful technologies – to stop that labelling?

A – Craig) For some people there is a real need for those specialist technologies… and that label matters. But children who would have needed a £7,000-8,000 piece of specialist kit can now be supported with an iPad for £700-800.

Q) So do we need a whole new label perhaps?

A – Wilma) In terms of assistive technologies for online learning, if we do something to make materials accessible, all students benefit. There is something there about mainstreaming good practice, so that specialists like Craig, and specialist technologies can focus on those who really need it. That allows you to support many students easily, then intensely focus resources on those with the greater needs.

A – Craig) The legislation is interestingly worded for that, but the more accessible your teaching, the more it is for all of your learners.

Q) In a professional sense how do you keep ahead of the students on technologies?

A – Craig) The students are really knowledgeable on Twitter, Facebook etc… But they don’t know about heading structures, text-to-speech tools etc. Students know what they know, but there is still lots they can pick up.

Q) What about students use of VLEs?

A – Wilma) I think for us one of the things we find is that there is really no time of day or day of the week where students are not using the VLE, are not learning online. That brings some support challenges – for instance for maintaining those systems.

Q) The idea of moving away from a deficit model of support, moving to proactive rather than reactive systems… In the old days the reactive systems might only kick in too late, so proactive technology can have real impact here.

A – Wilma) It is equally true that the more we can design everything we do to be accessible… There will still be some students that still require some specialist support but the more mainstream the tools and approach, the more you move from the deficit idea that the student somehow lacks something…

Q) And what are the differences between campus and on line systems?

A – Wilma) In on campus courses you will have some familiarity with your students, your systems will flag up changing assignment performance, etc. There is no need to automate that… But something like a traffic light system helps to flag that up – clearly a good lecturer will spot that too.

Q) You commented about the possible change in number of dyslexia after the £200 levy… Can you expand that…

A – Craig) I do a huge amount of work for Dyslexia Scotland but it is a term that covers a lot of very different needs and I’m not always sure the label is always helpful.

Session 4: Can technology help widening access to further and higher education? – Panel debate

Panellists:

  • Dr Muir Houston (MH) – Lecturer, School of Education, University of Glasgow
  • Lucy MacLeod (LM) – Depute Director (Students), Open University in Scotland
  • Tracy Matheson (TM), Curriculum Manager (Business, IT and Tourism), West Highland College
  • Dr Graeme Thomson (GT), Access Academy Co-ordinator, FOCUS West 

LM: The OU of course uses technology but actually it is about flexibility, it is about tutors, and about an open model of education, rather than the tools that we use. The other thing I wanted to raise is that the internet is full of stuff – many open educational resources, and you can quickly get into a debate about “I have more stuff than you do”… But does that actually widen access? Well, the jury seems to be out. We heard from Tim O’Shea this morning that 80% of those doing MOOCs have degrees, half of them have postgraduate degrees. OK, 20% do not, but what is the experience for a learner on that course… It is about how you use this material. If we are about access to qualifications, learners really need that guide. The OU has tried to get learners together across communities, to look at pathways to degrees. Digital participation matters – 23% of adults don’t have access to the internet, 43% don’t use their phone to get online, 53% don’t use social networking. How do we get to these people? Wilma talked about some students understanding some online tools… But do they understand research libraries… As for learning analytics, it is really only useful if you know what you plan to do with that information, and I’m a firm believer that it is most useful when you use that information to trigger and inform conversations between tutors and students.

GT: FOCUS West work with schools in the West of Scotland, with funding from the Scottish Funding Council, to widen access. We have just built an online tool called “FOCUS Point” to share information and advice about post school routes with schools that don’t have a tradition of sending students into college and university. So, introducing learners to what colleges and universities are about, what that experience is like, and practical advice about applying and taking up places. There are activities around subject choices, routes after school, entry routes, assistance with personal statement writing. And also getting students to set up a login that enables them to record their engagement, build up a portfolio, and build a certain element of social networking – to reduce the potential isolation of being perhaps the only pupil in a school interested in pursuing a particular route/degree. So I’m here to say that whilst there may be some scepticism about use of technology, what we do has been well received, but this stuff only works well when connected up with face to face experiences. I fear that MOOCs can potentially increase that sense of isolation…

TM: For us our face to face tends to have to be through virtual classroom. To do that face to face would mean not being able to access that education in some cases.

MH: Non-traditional students tend to be represented in mainstream universities, but there are issues of the experience, inequalities, and also costs. It can be hard to convince an adult that it is worth paying for their child to go to university and leave with debts, and a job in a fast food restaurant. That’s where credit transfer can make a big difference – in theory that should work… but universities don’t like each other’s credits, everyone is quite protective of their own income streams.

Chair: So, whose responsibility is it to force those cautious institutions?

MH: The Funding Council.

Chair: What is the experience with the Open University in terms of credit transfer?

LM: The average age of an OU student is 37 at the moment – and it’s dropping. We don’t have entry requirements, that’s one of our founding principles, so that is a barrier that simply isn’t there. And the courses are designed to be a ladder that takes you to level 7 over the first year. The other big thing that the government has done for part time study and the OU has been the part time fee grant, allowing people to study part time without paying fees – that is not always well understood, so some students studying part time in Scotland do pay fees, and pay up front. In Scotland we have seen OU applications remain stable; down south they have dropped due to the higher fees students now face after the cut in government funding for the OU there, requiring students to take out loans.

MH: Learning paths can really go in different ways… It might start with a language course because a shopfloor worker is working in Spain, say, and that may then lead to the OU, and maybe a route to do an engineering degree. The union has negotiated a collective bargaining agreement so that their employer pays 40% of costs but that is still a huge financial and personal commitment – to study perhaps 6 years for a BEng alongside a 37 hour week. But that’s a great thing to do, and I know the OU does more of these sorts of projects.

Chair: Is the ease of access for a lot of kids, a reason they are not engaged? Difficulty can be motivating?

GT: We find that those who do a free access programme are far more likely to continue progressing than those with a similar background without access to that programme. But for people at Govan High, their local university is Glasgow, which has very demanding grades, so you have to be really dedicated to get there. But I think we’ll continue to see that…

Q) We’re having a regular conversation in the Scottish Borders about the drop out rate for our high school students as they go to university. What do we have to do as head teachers to help with that… Hearing Graeme talk about the social networks maybe we need to do more of that, or interventions we can make earlier… I’m not sure which way we should be going…

GT: I think just preparing students for what universities and colleges are actually like can make a big difference. There are many opportunities there, but there can be some competition rather than collaboration between universities sometimes – a blurring of marketing and recruitment with widening access. But activities like critical thinking, self led study, working with different sources, etc., those can be very valuable – and programmes offering that can have a big impact. Some HEIs can do more as well – with academic staff giving a sense of level 1 social science programmes for schools, for instance.

MH: It’s not just pupils who need to understand social and cultural issues, it’s the parents too. I stole an idea from the OU – they used to have a guide for significant others, which we adapted for parents as well. Things like timetable structures, when assessments are due… If you don’t know what your child is up to and what is expected of them, how can you support that? An understanding of important times in that calendar etc. can make a huge difference. It was a great tool the OU made. Knowing about that helps parents to work with their child, motivate them, help them manage stress.

Chair: But surely for your child, once they are there, it’s up to them?

TM: I think for rural students that can be a real challenge, and can really affect drop out rates. So we have some study skills modules designed for high schools, to encourage students to take them at high school to prepare them. But actually even if you’ve sent your child off to the big city, parental support does still matter – and that’s not just financial, that’s about encouragement and emotional support. We also have three Highers for access learners, using virtual learning, that are for students to take and manage themselves. We are quite strict about assignments etc. to help there. But working with the colleges and universities that your students will be going to can make a big difference to preparing students, and ensuring they have the skills they need to do well.

Chair: Occasionally you might be the only student in a school taking a subject. You said that you have this social network for students – does that work?

GT: It’s perhaps too early to say. Schools have been welcoming the stuff that we do, and it intersects with what they do for PSE, and eProfiles work. What hasn’t been embraced yet is the social networking side – we have more work to do there. Everyone has said it is a good idea, but you need enough people to make it worthwhile – it could be pretty innovative and worthwhile.

LM: A couple of things that occurred to me here, that I think are just as relevant for us. Some research we have done suggests “struggling students want to be noticed”, and there is a responsibility for universities to use the sorts of analytics Wilma was talking about to really identify those students. At a big university you can easily feel lost, it’s really quite tough, and you are faced with being an independent learner as well. The other project that may be worth mentioning: the OU, on behalf of the sector, is running something called “Back on Course” – we are working with 7 universities on drop outs from those universities, following up to see if they are OK, if they would like a guided interview, if they want to adjust study plans, and I think there is potential there to come up with that sort of shared solution.

Chair: How easy is it to monitor outcomes of students once they have dropped out or finished?

TM: It’s really quite hard. In small communities there can be word of mouth and the good will of organisations in some areas. But a telephone interview three months after school leaving gives a one off snapshot. I’m not sure what Skills Development Scotland do with tools like social networks. High schools generally have some idea – but only because they are smaller schools.

Comment) It is becoming more critical… But I would like to be part of that conversation you are having with students who drop out, as in your current OU work.

MH: If you used the Scottish Candidate Number throughout universities that would be hugely helpful. The dropping of that in HE breaks that pipeline. In the US they use the Social Security number – and that gives income as well. We don’t capture that but it would be really useful. I was on a working group with the Scottish Funding Council and UUK and income was deemed to be so useful, but there is a lot of resistance. I’m not sure if the issue is security of information. Postcodes are crude. SIMD 40 is useless, you need SIMD 10 to really target support here.

LM: Another point about school leavers… When we talk about university I think we have to get away from the idea that the people who go to university are all young people. And also decrease the emphasis on what university leavers then do. We don’t talk about lifelong learning anymore, but that concept does matter. And 17 or 19 is maybe not the time to go to university for some people…

MH: And actually that may be where your drop out rates may come in… It may be that at 30, when you really proactively want to learn, you will be much more motivated. In London there is an aspiration of 90% of students wanting to go to university, and that may well not be right for them…

Comment: And apprenticeships, vocational education, etc. can be really good routes, without the debt etc.

MH: And in Germany those skilled jobs have real standing and less stigma about them as qualifications, as routes…

Chair: To finish, if you could change one thing, what would it be?

GT: I think we could achieve more as a country if there was more collaboration between institutions, and if widening participation was more separated from recruitment and marketing.

LM: I agree with that! I think I might take away money given to universities to work on widening access, and instead distribute it to primary schools in the poorest areas.

TM: I think that everyone should have access to the internet, to enable learning to take place no matter where they are – no matter what stage of education you are at, including school leavers, adult learners. Internet and transport infrastructures are both needed. I also think our college infrastructure is getting stronger, and that lets young people stay at home longer, to find work locally – and doing even one year of college can boost confidence, which reduces drop out rates if/when they then go into HE.

MH: I would like us to return to thinking of education as a public good. And that education is about your own potential, the community, civic education and about quality of life issues. Increasingly degree programmes are focused on very narrowly defined jobs; when that job goes or changes, your degree will be less useful than a broad degree would be. These days everyone not only has a degree, you need a postgraduate degree! So you need to look at what you are doing and why, to build broad skills such as critical thinking, personal reflection, etc.

Summary and conclusions by the Chair – Mark Stephen

And with that Mark thanks sponsors and all for taking part and attending.

May 08 2015
Image of surgical student activity data presented by Paula Smith at the Learning Analytics Event

Today I am at the UK Learning Analytics Network organised by the University of Edinburgh in Association with Jisc. Read more about this on the Jisc Analytics blog. Update: you can also now read a lovely concise summary of the day by Niall Sclater, over on the Jisc Analytics blog.

As this is a live blog there may be spelling errors, typos etc. so corrections, comments, additions, etc. are welcome. 

Introduction – Paul Bailey

I’m Paul Bailey, Jisc lead on the Learning Analytics programme at the moment. I just want to say a little bit about the network. We have various bits of project activities, and the network was set up as a means for us to share and disseminate the work we have been doing, but also so that you can network and share your experience working in Learning Analytics.

Housekeeping – Wilma Alexander, University of Edinburgh & Niall Sclater, Jisc

Wilma: I am from the University of Edinburgh and I must say I am delighted to see so many people who have traveled to be here today! And I think for today we shouldn’t mention the election!

Niall: I’m afraid I will mention the election… I’ve heard that Nicola Sturgeon and Alex Salmond have demanded that Tunnock’s Teacakes and Caramel Wafers must be served at Westminster! [this gets a big laugh as we’ve all been enjoying caramel wafers with our coffee this morning!]

I’ll just quickly go through the programme for the day here. We have some really interesting speakers today, and we will also be announcing the suppliers in our learning analytics procurement process later on this afternoon. But we kick off first with Dragan.

Doing learning analytics in higher education: Critical issues for adoption and implementation – Professor Dragan Gašević, Chair in Learning Analytics and Informatics, University of Edinburgh

I wanted to start with a brief introduction on why we use learning analytics. The use of learning analytics has become something of a necessity because of the growing needs of education – the growth in the number of students and the diversity of students, with MOOCs being a big part of that realisation that many people want to learn who do not fit our standard idea of what a student is. The other aspect of MOOCs is their scale: as we grow the number of students it becomes difficult to track progress, and the feedback loops between students and instructors are lost or weakened.

In learning analytics we depend on two types of major information systems… Universities have had student information systems for a long time (originally paper, computerised 50-60 years ago), but they also use learning environments – the majority of universities have some online coverage of this kind for 80-90% of their programmes. But we also don’t want to exclude other platforms, including communications and social media tools. And no matter what we do with these technologies we leave a digital trace, and that is not a reversible process at this point.

So, we have all this data but what is the point of learning analytics? It is about using machine learning, computer science, etc. approaches in order to inform education. We defined learning analytics as the “measurement, collection, analysis, and reporting” of data about learners and their contexts, but actually that “how” matters less than “why”. It should be about understanding and optimising learning and the environments in which learning occurs. And it is important not to forget that learning analytics are there to understand learning – that is what they are about.

Some case studies include Course Signals at Purdue. They use Blackboard for their learning management system. They wanted to predict which students would successfully complete courses, and to identify those at risk. They wanted to segment their students into high risk, at risk, or not at risk at all. Having done that, they used a traffic light system to reflect that, and that traffic light was shown both to staff and students. When they trialled it (Arnold and Pistilli 2012) with a cohort of students, they saw greater retention and success. But if we look back at how I framed this, we need to think about this in terms of whether it changes teaching…
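[Note from me: a hedged sketch of Course Signals-style risk segmentation. The real Purdue model combines performance, LMS effort, prior academic history and student characteristics; the weights and thresholds below are invented purely to show the shape of the idea.]

```python
def risk_signal(grade_pct: float, logins_per_week: float, prior_gpa: float) -> str:
    """Blend a few indicators into a score, then bucket into traffic lights."""
    score = (0.5 * (grade_pct / 100)
             + 0.2 * min(logins_per_week / 5, 1.0)
             + 0.3 * (prior_gpa / 4.0))
    if score < 0.4:
        return "red"    # high risk
    if score < 0.6:
        return "amber"  # at risk
    return "green"      # not at risk

print(risk_signal(grade_pct=35, logins_per_week=0.5, prior_gpa=2.0))  # -> red
print(risk_signal(grade_pct=85, logins_per_week=6, prior_gpa=3.6))    # -> green
```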

So, also at Purdue, they undertook a project analysing the email content sent by instructors to students. They found that instructors did not give more detailed formative feedback; they just increased the summative feedback. So this really indicates that learning analytics really has to feed into changes in teaching practices in our institutions, and we need our learning analytics to provide signalling and guidance that enables teaching staff to improve their practice, and give more constructive feedback. (see Tanes, Arnold, King and Remnet 2011)

University of Michigan looked at “gateway courses” as a way to understand performance in science courses (see Wright, McKay, Hershock, Miller and Tritz 2014). They defined a measure for their courses: “better than expected”. There were two inputs for this: previous GPA, and goals set by students for the current course. They then used predictive models for how students could be successful, and ways to help students to perform better than expected. They have also been using technology designed for behavioural change, which they put to use here… Based on that work they generated personalised messages to every student, with a rationale tailored to each student, and also providing predicted performance for particular students. For instance an example here showed that a student could perform well beyond their own goals, which might have been influenced by the science course not being their major. The motivator for students here was productive feedback… They interviewed successful students from previous years, used that to identify behaviours etc. that led to success, and presented that as feedback from peers (rather than instructors). And I think this is a great way to show how we can move away from very quantitative measures, towards qualitative feedback.
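[Note from me: a tiny sketch of the “better than expected” idea – expected performance set from prior GPA and the student’s own goal, then compared with the actual result. The grade scale and the way the two benchmarks are combined are my assumptions, not the Michigan model.]

```python
def better_than_expected(prior_gpa: float, goal_grade: float, actual_grade: float) -> bool:
    """True if the actual course grade beats both history and ambition (4.0 scale)."""
    expected = max(prior_gpa, goal_grade)  # the stricter of the two benchmarks
    return actual_grade > expected

# A student with a 3.2 GPA aiming for a 3.0 who earns a 3.5 beats expectations:
print(better_than_expected(prior_gpa=3.2, goal_grade=3.0, actual_grade=3.5))  # True
```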

So, to what extent are institutions adopting these approaches? Well, there are very few institutions with institution-wide examples of adoption. For instance, University of Michigan only used this approach on first year science courses. They are quite a distributed university – like Edinburgh – which may be part of this perhaps. Purdue also only used this on some courses.

Siemens, Dawson and Lynch (2014) surveyed the use of learning analytics in the HE sector, asking about the level and type of adoption, ranking these from “Awareness” to “Experimentation” to “Organisation/Students/Faculty”, “Organisational Transformation” and “Sector Transformation”. Siemens et al found that the majority of HE is at the Awareness and Experimentation phases. Similarly, Goldstein and Katz (2005) found 70% of institutions at phase 1; it is higher now, but bear in mind that the remaining 30% were not necessarily far along that process either. There is still much to do.

So, what is necessary to move forward? What are the next steps? What do we need to embrace in this process? Well, let’s talk a bit about direction… The metaphors from business analytics can be useful; we can borrow lessons from that process. McKinsey offered a really interesting business model of Data – Model – Transform (see Barton and Court 2012). That can be a really informative process for us in higher education.

Starting with Data – traditionally when we choose to measure something in HE we turn to surveys, particularly student satisfaction surveys. But return rates are not high in all countries, and more importantly surveys are not especially accurate. We also have progress statistics – the data are there in our learning systems, but are they useful? We can also derive social networks from these systems, from interactions and from course registration systems – and knowing who students hang out with can predict how they perform. So we can get this data, but then how do we process and understand it? I know some institutions find a lack of IT support can be a significant barrier to the use of learning analytics.
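
As a concrete illustration of that social-network point: interaction data (e.g. who replies to whom on a discussion board) can be turned into a graph, and network measures used as predictive features. A minimal sketch with invented data:

```python
# Sketch: deriving a student network from discussion-board replies.
# The reply data is invented; a real system would pull it from VLE logs.
import networkx as nx

replies = [  # (author, replied_to) pairs
    ("ana", "ben"), ("ben", "ana"), ("ana", "cat"),
    ("dan", "cat"), ("cat", "dan"),
]

G = nx.Graph()
G.add_edges_from(replies)

# Centrality is a common predictive feature: well-connected students
# often perform differently from isolated ones.
for student, centrality in sorted(nx.degree_centrality(G).items()):
    print(f"{student}: {centrality:.2f}")
```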

Moving onto Model… Everyone talks about predictive modelling, but the question has to be about the value of a predictive model. Often organisations just see this as an outsourced thing – relying on some outside organisation and data model that provides solutions, but does not do so within the context of understanding what the questions are. And the questions are critical.

And this is, again, where we can find ourselves forgetting that learning analytics is about learning. So there are two things we have to know about, and think about, to ensure we understand what analytics mean:

(1) Instructional conditions – different courses in the same school, or even in the same programme, will have different instructional conditions – different approaches, different technologies, different structures. We did some research at a university through their Moodle presence and found some data that was common to 20-25% of courses, but we also identified some data you could capture that was totally useless (e.g. time online). And we found some approaches that explained 80% of variance – for example extensive use of Turnitin, not just for plagiarism but also by students for gathering additional feedback. One of our courses defied all trends… they had a Moodle presence, but when we followed up on this we found that most of their work was actually in social media, so data from Moodle was quite misleading and certainly a partial picture. (see Gasevic, Dawson, Rogers, Gasevic, 2015)

(2) Learner agency – this changes all of the time. We undertook work on the agency of learners, based on log data from a particular course. We explored 6 clusters using cluster matching algorithms… We found that there is a big myth that more time on task will lead to better performance… One of our clusters spent far more time online than the others; another spent far less. When we compared clusters we found the top students were the group spending the least time online, while the cluster spending the most time online performed only around average. This shows that this is a complex question. Learning styles aren’t the issue; learning profiles are what matter here. In one course, one profile works well; in another, a different profile might work much better. (see Kovanovic, Gasevic, Jok… 201?)
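
For readers wondering what “exploring clusters” looks like in practice, a minimal sketch of profiling students from trace data – features, values and the number of clusters all invented for illustration – might be:

```python
# Sketch: clustering students into learning profiles from log data.
# The cited study derives much richer features; these are placeholders.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# rows: students; columns: [minutes online, forum posts, resources viewed]
X = np.array([
    [900, 2, 40], [850, 1, 35], [120, 30, 80],
    [150, 25, 75], [400, 10, 50], [430, 12, 55],
])

X_scaled = StandardScaler().fit_transform(X)  # put features on one scale
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_scaled)
print(labels)  # cluster membership, to be compared against course outcomes
```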

And a conclusion for this section is that our analytics and analysis cannot be generalised.

Moving finally to Transform, we need to ensure participatory design of analytics tools – we have to engage our staff and students early in these processes; we won’t get institutional transformation by relying on the needs of statisticians. Indeed, visualisations can be harmful (Corrin and de Barba 2014). The University of Melbourne looked at the use of dashboards and similar systems and reported that students who were high achieving, with high GPAs and high aspirations, actually under-performed when they saw that they were doing better than average, or better than their goals. And for those doing less well we can just reinforce issues in their self-efficacy. So these tools can be harmful if not designed in a critical way.

So, what are the realities of adoption? Where are the challenges? I am part of a study commissioned by the Australian Government, run from South Australia, engaging with the entire Australian tertiary sector. We interviewed every VC and the managers responsible for learning analytics. Most are in phase 1 or 2… Their major goal was to enable personalised learning… the late phases… They seemed to think that they would magically move from experimentation to personalised learning; they don’t seem to understand the process to get there…

We also saw some software-driven approaches – institutions buy an analytics programme and perceive the job as done.

We also see a study showing that there is a lack of a data-informed decision making culture, and/or data not being suitable for informing those types of decisions. (Macfadyen and Dawson 2012).

We also have an issue that researchers are not focused on scalability… There is lots of experimentation but… I may design beautiful scaffolding based on learning analytics, but I have to think about how that can be scaled up to people who may not be the original instructors, for instance.

The main thing I want to share here is that we must embrace the complexity of educational systems. Learning analytics can be very valuable for understanding learning but they are not a silver bullet. For institutional or sectoral transformation we need to embrace that complexity.

We have suggested the idea of the Rapid Outcome Mapping Approach (ROMA) (Macfadyen, Dawson, Pardo, Gasevic 2014), in which once we have understood the objectives of learning analytics, we also have to understand the political landscape in which they sit, and the financial contexts of our organisations. We have to identify stakeholders, and identify the desired behaviour changes we want from those stakeholders. We also have to develop an engagement strategy – we cannot require a single approach; a solution has to provide incentives for why someone should/should not adopt learning analytics. We have to analyse our internal capacity to effect change – especially in the context of analytics tools and taking any value from them. And finally we have to evaluate and monitor change. This is about capacity development, and capacity development across multiple teams.

We need to learn from successful examples – and we have some to draw upon. The Open University adopted an organisational strategy inspired by the ROMA approach (see Tynan and Buckingham Shum 2013). They developed the model of adoption that is right for them – other institutions will want to develop their own, aligned to their institutional needs. We also need cross-institutional experience sharing and collaboration (e.g. SoLAR, the Society for Learning Analytics Research). This meeting today is part of that. And whilst there may be some competition between institutions, this process of sharing is extremely valuable. There are various projects here, some open source, to enable different types of solution, and sharing of experience.

Finally, we need to talk about ethical and privacy considerations. There is a tension here… Some institutions hold data, and think students need to be aware of the data held… But what if students do not benefit from seeing that data? How do we prepare students to engage with that data, to understand it? The Open University is at the leading edge here and has a clear policy on the ethical use of student data. Jisc also has a code of practice for learning analytics, which I welcome and think will be very useful for institutions looking to adopt learning analytics.

I also think we need to develop an analytics culture. I like to use the analogy of, say, Moneyball, where analytics make a big difference… but analytics can be misleading. Predictive models have their flaws, their false positives etc. So a contrasting example would be Trouble with the Curve – where analytics mask underlying knowledge of an issue. We should never reject our tacit knowledge as we adopt learning analytics.

Q&A

Q – Niall): I was struck by your comments about asking the questions… But doesn’t that jar with the idea that you want to look at the data and explore questions out of that data?

A – Dragan): A great question… As a computer scientist I would love to just explore the data, but I hang out with too many educational researchers… You can start from data and make sense of that. It is valid. However, whenever you have certain results you have to ask certain questions – does this make sense in the context of what is taking place, does this make sense within the context of our institutional needs, and does this make sense in the context of the instructional approach? That questioning is essential no matter what the approach.

Q – Wilma) How do you accommodate the different teaching styles and varying ways that courses are delivered?

A – Dragan) The most important part here is the development of capabilities – at all levels and in all roles, including students. So in the Australian study we identified trends, found these clusters… But some of these courses are quite traditional and linear, others are more ambitious… They have a brilliant multi-faceted approach. Learning analytics would augment this… But when we aggregate this information… the more ambitious the goals, the more there is to do. Time is required to adopt learning analytics with sophistication. But we also need to develop tools around the needs and tasks of stakeholders… so stakeholders are capable of working with them, without the tools being too complex to use… There aren’t that many data scientists, so perhaps we shouldn’t use visualisations at all – maybe just prompts triggered by the data… And we also want more qualitative insights into our students… their discussion… when they are taking notes… That then really gives an insight… Social interactions are so important and beneficial to student learning.

Q – Wilbert) You mentioned that work in Australia about Turnitin… What was the set-up there that led to that… Or was it just the plagiarism detection use?

A – Dragan) It turned out to be the feedback received through Turnitin… not the plagiarism side. Primarily it was on the learner side, not so much the instructors’. There is an ethical dilemma there if you expose that to instructors… if students are using the system to get feedback… Those were year one students, and many were international students from Asia and China, where cultural expectations around reproducing knowledge are different… so that is also important.

Q) Talking about the Purdue email study, and staff giving formative feedback to students at risk – how did that work?

A) They analysed these messages and their content, and found staff mainly giving motivational messages. I think that was mainly because the traffic light system indicated the at-risk status but not why that was the case… you need that information too…

Q) I was interested in the rhetoric of personalised learning from Vice Chancellors, given most institutions are at stage 1 or 2… What are the institutional blockers? How can they be removed?

A) I wish I had an answer there! But senior leaders are sometimes forced to make decisions based on financial needs, not just driven by data or unaware of data. Many Australian institutions are small organisations with limited funding… and the very existence of the institution is part of what they have to face, quite aside from the adoption of learning analytics. But also, University of Melbourne is a complex institution – there is a leading researcher there, but you cannot roll out the same solution across very different schools and courses…

Niall: And with that we shall have to end the Q&A and hand over to Sheila, who will talk about some of those blockers…

Learning Analytics: Implementation Issues – Sheila MacNeill, Glasgow Caledonian University

I was based at CETIS, and involved in learning analytics for a lot of that time… But for the last year and a half I have been based at Glasgow Caledonian University. And today I am going to talk about my experience of moving from that overview position to being in an institution and actually trying to do it… I’m looking for a bit of sympathy and support, but hoping also to contextualise some of what Dragan talked about.

Glasgow Caledonian University has about 17,000 students, mostly campus based although we are looking at online learning. We are also committed to blended learning. We provide central support for the university, working with learning technologies across the institution. So I will share my journey… joys and frustrations!

One of the first things I wanted to do was to get my head around what kind of systems we had around the University… We had a VLE (Blackboard), but I wanted to know what else people were using… This proved very difficult. I spoke to our IS department but finding the right people was challenging – a practical issue to work around. So I decided to look a bit more broadly with a mapping of what we do… looking across our whole technology position. I identified the areas and what fitted into those areas:

  • (e)Assessment and feedback – Turnitin (there is a lot of interest in rubrics, marking and feedback processes, which seem to be having a big impact on student success – and actually plagiarism detection isn’t its main use the more you use it), Grade Centre, wikis/blogs/journals, peer/self assessment, (e)feedback.
  • (e)Portfolios – wikis/blogs/journals, video/audio – we are trialling a mobile app in this space with nursing students.
  • Collaboration – discussion boards, online chat, video conferencing etc.
  • Content – lectures, PDFs, etc….

I’ve been quite interested in Mark (?)’s idea of a “core VLE”. Our main systems group around the SRS (student records system – newly renamed from its former name, ISIS), GCU Learn, the Library, and 3rd party services. When I did hear from our IS team I found such a huge range of tools in use across our institution – it seems like every tool under the sun has been used at some point.

In terms of data… we can get data from our VLE, from Turnitin, from wikis etc. But it needs a lot of cleaning up. We started looking at our data, trying it on November data from 2012 and 2013 (November seemed like a typical month). And we found some data we would expect – changes and increases of use over time. But we don’t have data at a module level, or programme level, etc. – it is hard to view in detail or aggregate up (yet). We haven’t got data from all of our systems yet. I would say we are still at the “housekeeping” stage… We are just seeing what we have, finding a baseline… There is an awful lot of housekeeping that needs to be done, a lot of people to talk to…

But as I was beginning this process I realised we had quite a number of business analysts at GCU who were happy to talk, and we have been drawing out data. We can make dashboards easily, but USEFUL dashboards are proving more tricky! We have meanwhile been talking to Blackboard about their data analytics platform. It is interesting thinking about that… given the state we are at with learning analytics, and finding a baseline, we are looking at investing some money to see what data we can get from Blackboard that might enable us to start asking some questions. There are some things I’d like to see from, for example, combining on-campus library card data with VLE data. And also thinking about engagement and what that means… Frustratingly, I think it is quite hard to get data out of Blackboard… I’m keen that the next licence we sign actually has a clause about the data we want, in the format we want, when we want it… No idea if that will happen but I’d like to see that.

Mark Stubbs (MMU) has this idea of a tube map of learning… This made me think of the Glasgow underground map – going in circles a bit, not all joining up. We really aren’t quite there yet; we are having conversations about what we could, and what we should, do. In terms of senior management interest in learning analytics… there is interest. But when we sent out the data we had looked at, we did get some interesting responses. Our data showed a huge increase in mobile use – we didn’t need a bring-your-own-device policy, students were already doing it! We just need everything mobile-ready. Our senior staff are focused on NSS and student survey data; that’s a major focus. I would like to take that forward to understand what is happening, in a more structured way…

And I want to finish by talking about some of the issues that I have encountered. I came in fairly naively to this process. I have learned that…

Leadership and understanding is crucial – we have a new IS director, which should make a great difference. You need both carrots and sticks, and it takes a real drive from the top to make things actually start.

Data is obviously important. Our own colleagues have issues accessing data from across the institution. People don’t want to share, or they don’t know if they are allowed to share. There is a cultural thing that needs investigating – and that relates back to leadership. There are also challenges that are easy to fix, such as server space. But the bigger issues of access, sharing, and ownership all really matter.

Practice can be a challenge. Sharing experience, engaging with staff, and having enough people who understand the systems are all important for enabling learning analytics here. The culture of talking together more, having better relationships within an institution, matters.

Specialist staff time matters – as Dragan highlighted in his talk. This work has to be prioritised – a project focusing on learning analytics would give the remit for that, give us a clear picture, and establish what needs to be done: not just buying in technology, but truly assessing needs before and during its adoption. There is potential, but learning analytics has to be a priority if it is to be adopted properly.

Institutional amnesia – people can forget what they have done, why, and why they did not do it before… more basic housekeeping again really. Understanding, and having tangible evidence of, what has been done and why is also important more broadly when looking at how we use technologies in our institutions.

Niall: Thanks for such an honest appraisal of a real experience there. We need that in this community, not just explaining the benefits of learning analytics. The Open University may be ahead now, but it also faced some of those challenges initially for instance. Now, over to Wilma.

Student data and Analytics work at the University of Edinburgh – Wilma Alexander, University of Edinburgh

Some really interesting talks already today – I’ll whiz through some sections, in fact, as I don’t need to retread that ground. I am based in Information Services. We are a very, very large, very old university, and a very general one. We have a four-year degree. All of that background makes what we do with student data hard to generalise about.

So, the drivers for the project I will focus on came out of the understanding we already have about the scale and diversity of this institution. Indeed we are increasingly encouraging students to make imaginative crossovers between schools and programmes, which adds to this. Another part of the background is that we have been working seriously in online education: in addition to a ground-breaking digital education masters delivered online, we have a number of other online masters. And further background here is that we have a long-term set of processes that encourage students to contribute to discussions within the university, as owners and shapers of their own learning.

So, we have an interest in learning analytics, and in understanding what students are doing online. We got all excited by the data and probably made the primary error of thinking about how we could visualise that data in pretty pictures… but we calmed down quite quickly. As we turned this into a proper project we framed it much more in the context of empowering students around their activities, using data we already hold about our students. We have two centrally supported VLEs at Edinburgh (and others!): Blackboard Learn, our largest system – virtually all on-campus programmes use it in some way – and Moodle, which we took the opportunity to try out for online distance programmes, largely created as online distance masters programmes. So, already there is a big distance between how these tools are used in the university, never mind how they are adopted.

There’s a video which shows this idea of building an airplane whilst in the air… this project’s first phase, in 2014, has felt a bit like that at times! We wanted to see what might be possible, but we also started by thinking about what might be displayed to students. Both Learn and Moodle give you some data about what students do in your courses… but that is for staff, not visible to students. When we came to look at the existing options, none of what Learn offers quite did what we wanted, as none of the reports were easily made student-facing (currently Learn does BIRT reports, course reports, stats columns in Grade Center etc). We also looked at Moodle and there was more there – it is open source and developed by the community, so we looked at the available options…

We were also aware that there were things taking place elsewhere in Edinburgh. Our role is support, not research, but we were aware that colleagues were undertaking research. So, for instance, my colleague Paula Smith was using a tool to return data as visualisations to students.

What we did as a starting point was to go out and collect user stories. We asked both staff and students what sort of things would be of interest, in terms of information available in the VLE(s). We framed this as a student, as a member of staff, as a tutor… as “As a… I want to… So that I can…”. We had 92 stories from 18 staff and 32 students. What was interesting here was that much of what was wanted was already available. For staff, much of the data they wanted was already there – they really just had to be shown and supported to find it. Some of the stuff that came in was “not in scope” – not within the very tight boundaries we had set for the project – but a number of requests for information were of interest, and we passed those on to appropriate colleagues. One area for this was reading lists; we have a tool that helps with that, so we passed that request on to library colleagues.

We also pooled some staff concerns… and this illustrates what both Dragan and Sheila have said about the need to improve the literacy of staff and students using this kind of information, and the need to contextualise it. E.g.: “As a teacher/personal tutor I want to have measures of activity of the students so that I can be alerted to who is ‘falling by the wayside’” – there is a huge gap between activity and that sort of indicator.

Student concerns were very thoughtful. They wanted to understand how they compare, to track progress, they also wanted information on timetables of submissions, assignment criteria/weighting etc. We were very impressed by the responses we had and these are proving valuable beyond the scope of this project…

So, we explored possibilities, and then moved on to see what we could build. And this is where the difference between Learn and Moodle really kicked in. We initially thought we could just install some of the Moodle plugins, and allow programmes to activate them if they wanted to… But that fell at the first hurdle, as we couldn’t find enough staff willing to be that experimental with a busy online MSc programme. The only team up for some of that experimentation was the MSc in Digital Education team, where this was done as part of a teaching module in some strands of the masters. This was small-scale, hand-cranked use of some of these tools. One of the issues with pretty much all of these tools is that they are staff-facing and therefore not anonymous. So we had to do that hand-cranking to make the data anonymous.

We had lots of anecdotal and qualitative information through focus groups and this module, but we hope to pin a bit more down on that. Moodle is of interest because of the online distance students… there is some evidence that communication, discussion etc. activity is a reasonable proxy for performance here, as these students have to start with the VLE.

Learn is a pretty different beast, as it is on campus, where blended learning may not have permeated as strongly. So, for Learn we do have this little element that produces a click map of sorts (engagements, discussion, etc)… For courses that only use the VLE for lecture notes that may not be useful at all, but for others it should give some idea of what is taking place. We also looked at providing guidebook data – mapping use of different weeks’ sections/resources/quizzes to performance.

We punted those ideas out. The activity information didn’t excite folk as much (32% thought it was useful). The grade information was deemed much more useful (97% thought it was useful)… But do we want our students hooked on that sort of data? Could it have negative effects, as Dragan talked about? And how useful is that overview?

When it came to changes in learning behaviour we had some really interesting and thoughtful responses. Of the three types of information (discussion boards, grades, activity) it was certainly clear that grades were where the student interest lay.

We have also been looking at what courses use in terms of tools… Taking a very broad-brush view of 2013/14 courses, we can see what they use and turn on. For social/peer network functionality – where we think there really is huge value – the percentage of courses actively using it on campus is way below the percentage using the VLE for the other functions of content + submission/assessment and discussion boards.

So context really is all – reflecting Dragan again here. It has to work for individuals on a course level. We have been mapping our territory here – the university as a whole is hugely engaged in online and digital education in general, and very committed to this area, but there is work to do to join it all up. When we did information gathering we found people coming out of the woodwork to show their interest. The steering group from this project has a representative from our student systems team, and we are talking about where student data lives, privacy and data protection, ethics, and of course also technical issues quite apart from all that… So we also have the Records Management people involved. And because Jisc has these initiatives, and there is an EU initiative, we are tightly engaging with the ethical guidance being produced by both of these.

So, we have taken a slight veer away from doing something for everyone in the VLEs in the next year. The tool will be available to all, but what we hope to do is to work very closely with a small number of courses, course organisers, and students, to really unpick at a course level how the data in the VLE gets built into the rest of the course activity. So that goes back to the idea of having different models, and applying the model for that course, and for those students. It has been a journey, and it will continue…

Using learning analytics to identify ‘at-risk’ students within eight weeks of starting university: problems and opportunities – Avril Dewar, University of Edinburgh

This work I will be presenting has been undertaken with my colleagues at the Centre for Medical Education, as well as colleagues in the School of Veterinary Medicine and also Maths.

There is good evidence that performance in first year maps quite closely to performance in a programme as a whole. So, with that in mind, we wanted to develop an early warning system to identify student difficulties and disengagement before students reach assessment. Generally the model we developed worked well: about 80% of at-risk students were identified, and there were large differences between the most and least at-risk students – between the lowest and highest risk scores – which suggests this was a useful measure.

The measures we used, combined into a cumulative risk score (sketched after this list), included:

  • Engagement with routine tasks
  • Completion of formative assessment – including voluntary formative assessment
  • Tutorial attendance (and punctuality where available) – but this proved least useful.
  • Attendance at voluntary events/activities
  • Virtual Learning Environment (VLE) exports (some)
    • Time until first contact proved to be the most useful of these
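
The exact weightings were not given in the talk, but a minimal sketch of a cumulative risk score over measures like these – with all weights invented for illustration – could look like:

```python
# Illustrative cumulative risk score over the measures listed above.
# The weights are assumptions; the real model was tuned on cohort data.

WEIGHTS = {
    "missed_routine_tasks": 2.0,
    "skipped_formative_assessments": 2.5,
    "missed_tutorials": 1.0,               # least useful in practice
    "missed_voluntary_events": 1.5,
    "days_until_first_vle_contact": 0.3,   # the most useful VLE export
}

def risk_score(student: dict) -> float:
    """Sum weighted indicators; higher scores mean higher risk."""
    return sum(weight * student.get(measure, 0)
               for measure, weight in WEIGHTS.items())

student = {"missed_routine_tasks": 3, "days_until_first_vle_contact": 14}
print(risk_score(student))  # 2.0*3 + 0.3*14 = 10.2
```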

We found that the measures sometimes failed because the data exports were not always usable or appropriate (e.g. VLE tables of 5,000 columns). Patterns of usage were hard to investigate, as raw data on, for example, time of day of accesses was not properly usable, though we think it would be useful. Similarly, there is no way to know whether long usage means a student has logged in, then Googled something or left their machine, then returned – or whether it indicates genuine engagement.

To make learning analytics useful we think we need the measures, and the data supporting them, to be:

  • simple, comprehensible and accessible;
  • comparable to data from other systems (e.g. we could have used library data alongside our VLE data);
  • easy to scale – e.g. based on common characteristics between schools;
  • not replicating existing measures;
  • able to discriminate between students – some of the most useful things, like time to first contact, do this well;
  • centrally stored.

We also found there were things that we could access but didn’t use – some for ethical and some for practical reasons. Using IP addresses for location was an ethical issue for us; we had similar concerns about discussion boards – we didn’t want students to be put off participating in discussions – and about time taken to answer individual questions. Theoretical issues that could be raised include: evidence that a student has been searching essay-buying websites; a student who is absent from class and claims to be ill but whose IP address shows another location; etc.

There were also some concerns about the teacher-student relationship. Knowing too much can create a tension in the student-teacher relationship. And the data one could gather about a student could become a very detailed tracking and monitoring system… for that reason we always aim to be conservative, rather than exhaustive in our data acquisition.

We have developed training materials and we are making these open source so that we can partner with other schools, internationally. Each school will have its own systems and data, but we are keen to share practice and approaches. Please do get in touch if you would like access to the data, or would like to work with us.

Q&A

Q – Paula) Do you think there is a risk of institutions sleepwalking into student dissatisfaction? We are taking a staged approach… but I see less effort going into intervention, into the staff side of what could be done… I take it that email was automated… Scalability is good for that, but I am concerned students won’t respond to it as it isn’t really personalised at all. And how were students in your project, Avril, notified?

A – Avril) We did introduce peer-led workshops… We are not sure if that worked yet – still waiting for results. We emailed our students to ask if they wanted to be part of this and whether they wanted to be notified of a problem. Later-year students were less concerned and saw the value; first year students were very concerned, so we phrased our email very carefully. When a student was at risk, emails were sent individually by their personal tutors. We were a bit wary of telling students what had flagged them up – it was a cumulative model – and were concerned that they might then engage just with those things and then not be picked up by the model.

Niall: Thank you for that fascinating talk. Have you written it up anywhere yet?

Avril: Soon!

Niall: And now to Wilbert…

The feedback hub; where qualitative learning support meets learning analytics – Wilbert Kraan, Cetis

Before I start: I have heard about some students gaming some of the simpler dashboards, so I was really interested in that.

So, I will be short and snappy here. The Feedback Hub work has just started… these are musings and questions at this stage. This work is part of the larger Jisc Electronic Management of Assessment (EMA) project. And we are looking at how we might present feedback and learning analytics side by side.

The EMA project is a partnership between Jisc, UCISA and HeLF. It builds on earlier Jisc Assessment and Feedback work, and it is a co-design project that identifies priorities and solution areas… and we are now working on solutions. So one part of this is about EMA requirements and workflows, particularly the integration of data (something Sheila touched upon). There is also work taking place on an EMA toolkit that people can pick up and look at. And then there is the Feedback Hub, which I’m working on.

So, there is a whole assessment and feedback lifecycle (borrowed from a model developed by Manchester Metropolitan, with their permission). This goes from Specifying to Setting, Supporting, Submitting, Marking and production of feedback, Recording of grades etc… and those latter stages are where the Feedback Hub sits.

So, what is a feedback hub really? It is a system that provides a degree-programme-wide or life-wide view of assignments and feedback. The idea is that it moves beyond the current module you are doing, to look across modules and across years. There will be feedback that is common across areas, giving a holistic view of what has already been done. So this is a new kind of thing… Looking at the nearest existing tools, I found VLE features – a database view of all assignments for a particular student, for learner and tutor to see: a simple clickable list that is easy to do and does help. Another type is the tutoring or assignment management system – capturing timetables of assignments, tutorials etc. These are from the tutor perspective; some show feedback as well. And then we have assignment services – including Turnitin – about plagiarism, but also about managing the logistics of assignments, feedback etc.

So, using those kinds of tools you can see feedback as just another thing that gets put in the learning records store pot, in some ways. But feedback can be quite messy – in-line feedback is hard to disentangle from the document itself, and teachers approach feedback differently… though pedagogically the qualitative formative feedback that appears in these messy ways can be hugely valuable. These online assessment management tools can also be helpful for mapping and developing learning outcomes and rubrics – connecting those to the assignment, you can gain some really interesting data… There is also the potential for Computer Aided Assessment feedback – sophisticated automated data on tests and assignments, which works well in some subjects. And possibly one of the most interesting learning analytics data sources is engagement with feedback. A concern from academic staff is that you can give rich feedback, but if the students don’t use it, how useful is it really? So capturing that could be useful…

So, having identified those sources, how do we present such a holistic view? One tool presents this as an activity stream – like Twitter and Facebook – with feedback part of a chronological list of assignments… We know that could help. Also an expanding learning outcomes rubric – click it to see the feedback connected to it; would that be helpful? We could also do text extraction, something like Wordle, but would that help? Another thing we might see is clickable grades – to understand what a grade means… And finally, should we combine the feedback hub with analytics data visualisations?
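
The Wordle-style view, at least, is simple to prototype. A minimal sketch of extracting common terms from feedback text (sample texts and stopword list invented for illustration):

```python
# Sketch of the word-frequency extraction floated above. A real feedback
# hub would strip boilerplate phrases and stem terms first.
from collections import Counter
import re

feedback_texts = [
    "Good structure, but the argument needs more evidence.",
    "Clear writing; referencing needs attention and more evidence.",
]

STOPWORDS = {"the", "but", "and", "more", "a", "of"}

words = [w for text in feedback_texts
         for w in re.findall(r"[a-z']+", text.lower())
         if w not in STOPWORDS]

print(Counter(words).most_common(5))  # e.g. "needs" and "evidence" recur
```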

Both learning analytics and feedback track learning progress over time, and try to predict the future. Feedback related data can be a useful learning analytics data source.

Q&A

Q – Me) Adoption and issues of different courses doing different things? Student expectations and added feedback?

A) This is an emerging area… IET in London/University of London have been trialing this stuff… they have opened that box… Academic practice can make people very cautious…

Comment) Might also address the perennial student question of wanting higher quality feedback… Might address a deficit in student satisfaction.

A) Having a coordinated approach to feedback… From a pedagogical point of view that would help. But another issue there is formative feedback – people use these tools in formative ways as well. There are points of feedback before a submission that could be very valuable, but the workload is quite spectacular as well. So balancing that could be quite an interesting thing.

Jisc work on Analytics – update on progress to date – Paul Bailey, Jisc, and Niall Sclater

Paul: We are going to give you a bit of an update on where we are on the Learning Analytics project, and then after that we’ll have some short talks and then will break out into smaller groups to digest what we’ve talked about today.

The priorities we have for this project are: (1) a basic learning analytics solution, with an interventions tool and a student tool; (2) a code of practice for learning analytics; and (3) learning analytics support and a network.

We are a two-year project, with the clock ticking from May 2015. We have started by identifying suppliers to initiate contracts and develop products; then institutions will be invited to participate in the discovery stage or pilots (June-Sept 2015). In Year 1 (Sept 2015-2016) we will run that discovery stage (10-20 institutions) and pilots (10+ institutions), with institutions moving from discovery to pilot. Year 2 will be about learning from and embedding that work. And for those of you who have worked with us in the past, the model is a bit different: rather than funding you and then learning from that, we will be providing you with support and some consultancy, and learning from this as you go (rather than funding).

Michael Webb: So… we have a diagram of the process here… We have procured a learning records warehouse (the preferred supplier there is HT2). The idea is that VLEs, student information systems and library systems feed into that. There was talk today of Blackboard being hard to get data out of; we do have Blackboard on board.

Diagram of the Jisc Basic Learning Analytics Solution presented by Paul Bailey and Michael Webb

Paul: Tribal is one of the solutions – pretty much off-the-shelf stuff, with various components – and we hope to roll it out to about 15 institutions in the first year. The second option will be the open solution, which is partly developed but needs further work. So the option will be to engage with either one of those solutions, or perhaps with both.

The learning analytics processors will feed the staff dashboards and a student consent service, and both of those will connect to the alert and intervention system. And there will be a student app as well.

Michael: The idea is that all of the components are independent so you can buy one, or all of them, or the relevant parts of the service for you.

Paul: The student consent service is something we will develop in order to provide some sort of service to allow students to say what kinds of information can or cannot be shared (of available data from those systems that hold data on them). The alert and intervention system is an area that should grow quite a bit…

So, the main components are the learning records warehouse, the learning analytics processor – for procurement purposes the staff dashboard is part of that – and the student app. And once that learning records warehouse is there, you could build onto it, use your own system, use Tableau, etc.

Just to talk about the Discovery Phase: we hope to start that quite soon. The invitation will come out through the Jisc Analytics email list – so if you want to be involved, join that list. We are also setting up a questionnaire to collect readiness information and for institutions to express interest. Then in the discovery process (June/July onward) institutions will select a preferred approach for the discovery phase. This will be open to around 20 institutions. We have three organisations involved here: Blackboard; a company called DTP Solution Path (as used by Nottingham Trent); and UniCom. For the pilot (September-ish onward) institutions will select a solution preference (in Year 1, 15 proprietary (Tribal) and 15 open).

Niall: the code of practice is now a document of just over two pages around complex legal and ethical issues. These can be blockers, so this is an attempt to have an overview document to help institutions overcome those issues. We have a number of institutions who will be trialling this. It is at draft stage right now, with an advisory group suggesting revisions. It is likely to be launched by Jisc in June. Any additional issues are being reflected in a related set of online guidance documents.

The Effective Learning Analytics project can be found at: http://www.jisc.ac.uk/rd/projects/

There is another network meeting on 24th June at Nottingham Trent University. At that meeting we are looking to fund some small research-type projects – there is an IdeaScale page for that, with about five ideas in the mix at the moment. Do add ideas (between now and Christmas) and do vote on those. There will be pitches there for the ones to take forward. And if you want funding to go to you as a sole trader rather than to a large institution, that can also happen.

Q&A

Q) Will the open solution be shared on something like GitHub so that people can join in?

A) Yes.

Comment – Michael: Earlier today people talked about data that is already available; that’s covered in the discovery phase, when people will be on site for a day or up to a week in some cases. Also, earlier on there was talk about data tracking, IP addresses etc. – the student consent system we have included is there to get student buy-in for that process, so that you are legally covered for what you do as well. And there is a lot of focus on flagging issues, and intervention – the intervention tool is a really important part of this process, as you’ll have seen from our diagram.

For more information on the project see: http://analytics.jiscinvolve.org/wp/

Open Forum – input from participants, 15 min lightning talks.

Assessment and Learning Analytics – Prof Blazenka Divjak, University of Zagreb (currently visiting University of Edinburgh)

I have a background in working with a student body of 80,000 students, and in the use of learning analytics. The main challenge I have found has been the management and cleansing of data. If you want to make decisions, learning analytics data are not always suitable or in an appropriate state for that sort of use.

But I wanted to talk today about assessment. What underpins effective teaching? Well, this relates to the subject, the teaching methods, the way in which students develop and learn (Calderhead, 1996), and awareness of the relationship between teaching and learning. Assessment is part of understanding that.

So I will talk to two case studies across courses using the same blended approach with open source tools (Moodle and connected tools).

One of these examples is Discrete Math with Graph Theory, a course in the Master of Informatics programme with around 120 students and 3 teachers. This uses authentic problem posing and problem solving. We have assessment criteria and weighted rubrics (using the AHP method). Here learning analytics are used to identify performance against criteria. We also look at differences between groups (gender, previous study, etc.), and at the correlation of authentic problem solving with other elements of assessment – hugely important for future professional careers, but not always what happens.
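
For those unfamiliar with AHP (the Analytic Hierarchy Process): rubric criteria are compared pairwise, and the weights come from the principal eigenvector of the comparison matrix. A minimal sketch with an illustrative matrix:

```python
# Sketch of AHP-style rubric weighting. The pairwise judgements are
# invented; A[i, j] says how much more important criterion i is than j.
import numpy as np

A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = principal / principal.sum()   # normalise to sum to 1
print(weights.round(3))                 # roughly [0.65, 0.23, 0.12]
```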

The other programme, Project Management for the Master of Entrepreneurship programme, has 60 students and 4 teachers. In this case project teams work on authentic tasks. Assessment criteria + weighted rubrics – integrated feedback. The course uses self-assessment, peer-assessment, and teacher assessment. Here the learning analytics are being used to assess consistency, validity, reliability of peer-assessment. Metrics here can include the geometry of learning analytics perhaps.

Looking at a graphic analysis of one of these courses shows how students are performing against criteria – for instance they are better at solving problems than posing problems. Students can also benchmark themselves against the group, and compare how they are doing.

The impact of student dashboards – Paula Smith, UoE

I’m going to talk to you about an online surgery course – the theory not the practical side of surgery I might add. The MSc in Surgical Sciences has been running since 2007 and is the largest of the medical distance learning programmes.

The concept of learning analytics may be relatively new, but we have long been interested in student engagement and participation, and how they can be tracked and acknowledged, as that is part of what motivates students to engage. So I am going to talk about how we use learning analytics to make an intervention, but also about action analytics – making changes as well as interventions.

The process before the project I will talk about had students being tracked via an MCQ system – students would see a progress bar but staff could see more details. At the end of every year we would gather that data, and present a comparative picture so that students could see how they were performing compared to peers.

Our programmes all use bespoke platforms, which meant we could work with the developers to design measures of student engagement – for example, number of posts: a crude way to motivate students. That team also created activity patterns so we could understand the busier times – and it is a 24/7 programme; all of our students work full time in surgical teams, so this course is an add-on to that. We never felt a need to make this view available to students… it is a measure of activity, but how does that relate to learning? We need more tangible metrics.

So, in March last year I started a one-day-a-week secondment with Wilma Alexander and Mark Wetton at IS. That secondment had the objectives of creating a student “dashboard” which would allow students to monitor their progress in relation to peers; using the dashboard to identify at-risk students for early interventions; and then evaluating what (if any) impact that intervention had.

So, we did see a correlation between in-course assessment and examination marks. The exam is worth 75-80% (was 80, now 75) in the first year – a heavily weighted component. You can do well in the exam, and get a distinction, with no in-course work during the year. The in-course work is not compulsory, but we want students to see the advantage of in-course assessments. For the predictive modelling, regression analysis revealed that only two components had any bearing on end-of-year marks: discussion board ratings and exam performance (year 1), or exam performance alone (year 2). So, with that in mind, we moved away from predictive models and decided to build a dashboard for students presenting a snapshot of their progress against others’. And we wanted this to be simple to understand…
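
That component-screening step is a standard regression exercise. A hedged sketch with simulated data (the real analysis, of course, used actual course marks):

```python
# Sketch of screening in-course components against final marks with OLS.
# Data here is simulated so that essays carry no real signal.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 100
discussion = rng.normal(60, 10, n)
essays = rng.normal(62, 8, n)
exam = rng.normal(65, 12, n)
final = 0.4 * discussion + 0.5 * exam + rng.normal(0, 5, n)

X = sm.add_constant(np.column_stack([discussion, essays, exam]))
model = sm.OLS(final, X).fit()
# Expect discussion and exam to come out significant, essays not.
print(model.pvalues.round(3))
```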

So, here it is… we are using Tableau to generate this. The individual student can see their own performance in yellow/orange and compare it to the wider group (blue). The average is used as a marker… If the average is good (in this example an essay has an average mark of 65%), that’s fine; if the average is poor (the low-weighted discussion boards have an average of under 40, which is a fail at MSc level), that may be more problematic. So the data is provided with caveats.

Paula Smith shows visualisations created using Tableau

This interface has been released – although my intervention is just an email which points to the dashboard and comments on performance. We have started evaluating it: the majority think it is helpful (either somewhat, or a lot). Worryingly, a few commented “no, unhelpful”, and we don’t know the reasons for that, but we have had positive comments on the whole. We asked about extra material for one part of the course. And we asked students how the data makes them feel… although the majority answered ‘interested’, ‘encouraged’ and ‘motivated’, one commented that they were apathetic about it – and actually we only had a 15% response rate for this survey, which suggests that apathy is widely felt.

Most students felt the dashboard provided feedback, which was useful. And the majority of students felt they would use the dashboard – mainly monthly or thereabouts.

I will be looking further at the data on student achievement, evaluating it over this summer, and it should be written up at the end of the year. But I wanted to close with a quote from Li Yuan, at CETIS: “data, by itself, does not mean anything and it depends on human interpretation and intervention“.

Learning Analytics – Daley Davis, Altis Consulting (London) 

We are a consulting company, well established in Australia, so I thought it would be relevant to talk about what we do there on learning analytics. Australia is ahead on learning analytics, and that may well be because they brought in changes to fees and funding in 2006, so they view students differently. They are particularly focused on retention. I will talk about work we did with UNE (University of New England), a university with mainly online students and around 20,000 students in total. They wanted to reduce student attrition. So we worked with them to set up a student early alert system for identifying students at risk of disengaging. It used triggers of student interaction as predictors. And this work cut attrition from 18% to 12%, saving time and money for the organisation.

The way this worked was that students had an automated “wellness” engine, with data aggregated at school and head-of-school levels. And what happened was that staff were ringing students every day – finding out about problems with internet connections, issues at home etc. Some of these were easily fixed or understood.

The system picked up data from their student record system, their student portal, and they also have a system called “e-motion” which asks students to indicate how they are feeling every day – four ratings and also a free text box (that we also mined).

Data was mined with weightings (sketched below): a student having previously failed a course, and a student being very unhappy, were both weighted much more heavily. As was a student not engaging for 40 days or more (shorter periods of disengagement were weighted more lightly).

Daley Davis shows the weightings used in a Student Early Alert System at UNE
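
Put concretely, the weighted-trigger idea might be sketched like this (the numbers below are placeholders, not the weightings from the slide):

```python
# Illustrative sketch of UNE-style weighted alert triggers.

TRIGGERS = {
    "previously_failed_course": 5.0,   # weighted heavily
    "very_unhappy_emotion": 5.0,       # from the "e-motion" ratings
    "no_engagement_40_days": 4.0,      # long disengagement
    "no_engagement_14_days": 1.0,      # shorter gaps weigh less
}

def alert_score(flags: set) -> float:
    """Total the weights of whichever triggers a student has fired."""
    return sum(TRIGGERS[f] for f in flags if f in TRIGGERS)

print(alert_score({"previously_failed_course", "no_engagement_40_days"}))  # 9.0
```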

Universities are looking at what they already have and coming up with a technical roadmap. But you need to start with the questions you want to answer… What do your students want? What are your KPIs? And how can you measure those KPIs? So, if you are embarking on this process, I would start with a plan for 3 years, toward your perfect situation, so you can then make your 1-year or shorter-term plans in the direction of making that happen…

Niall: What I want you to do just now is to discuss the burning issues… and come up with a top three…

And [after coffee and KitKats] we are back to share our burning issues from all groups…

Group 1:

  • Making sure we start with questions first – don’t start with framework
  • Data protection and when you should seek consent
  • When to intervene – triage

Group 2:

  • How to decide what questions to ask – and what questions and data are important anyway?
  • Implementing analytics – institutional versus course level analytics? Both have strengths, both have risks/issues
  • And what metrics do you use, what are reliable…

Group 3:

  • Institutional readiness for making use of data
  • Staff readiness for making use of data
  • Making meaning from analytics… and how do we support and improve learning without always working on the basis of a deficit model.

Group 4:

  • Different issues for different cohorts – humanities versus medics in terms of aspirations and what they consider appropriate, e.g. for peer reviews. And undergrads/younger students versus say online distance postgrads in their careers already
  • Social media – ethics of using Facebook etc. in learning analytics, and issue of other discussions beyond institution
  • You can’t just not interpret data because there’s an issue you don’t want to deal with.

Group 5:

  • Using learning analytics at either end of the lifecycle
  • Ethics is a big problem – analytics might be used to recruit likely-successful people, or to stream students/incentivise them into certain courses (both already happening in the US)
  • Lack of sponsorship from senior management
  • Essex found through last three student surveys that students do want analytics.

That issue of recruitment is a real ethical issue. This is something that arises at the Open University, as they have an open access policy, so to deny entrance because of likely drop-out or likely performance would be an issue there… How did you resolve that?

Kevin, OU) We haven’t exactly cracked it. We are mainly using learning analytics to channel students into the right path for them – which may be about helping select the first courses to take, or whether to start with one of our open courses on Future Learn, etc.

Niall: Most universities already have entrance qualifications… A-Level or Higher or whatever… ethically how does that work?

Kevin, OU) I understand that a lot of learning analytics is being applied in UCAS processes… they can assess the markers of success etc..

Comment, Wilma) I think the thing about learning analytics is that predictive models can’t ethically be applied to an individual…

Comment, Avril) But then there is also quite a lot of evidence that entry grades don’t necessarily predict performance.

Conclusions from the day and timetable for future events – Niall Sclater

Our next meeting will be in June in Nottingham and I hope we’ll see you then. We’ll have a speaker, via Skype, who works on learning analytics for Blackboard.

And with that, we are done with a really interesting day.