Jul 03 2017

Today I am at the Mykolo Romerio Universitetas in Vilnius, Lithuania, for the European Conference on Social Media 2017. As usual this is a liveblog so additions, corrections etc. all welcome… 

Welcome and Opening by the Conference and Programme Chairs: Aelita Skaržauskienė and Nomeda Gudelienė

Nomeda Gudelienė: I am head of research here and I want to welcome you to Lithuania. We are very honoured to have you here. Social media is very important for building connections and networking, but conferences are also really important still. And we are delighted to have you here in our beautiful Vilnius – I hope you will have time to explore our lovely city.

We were founded 25 years ago when our country gained independence from the Soviet Union. We focus on social studies – there was a gap for new public officials, for lawyers, etc. – and our university was founded to fill it.

Keynote presentation: Dr. Edgaras Leichteris, Lithuanian Robotics Association – Society in the cloud – what is the future of digitalization?

I wanted to give something of an overview of how trends in ICT are moving – I’m sure you’ve all heard that none of us will have jobs in 20 years because robots will have them all (cue laughter).

I wanted to start with this complex timeline of emerging science and technology that gives an overview of Digital, Green, Bio, Nano, Neuro. Digitalisation is the most important of these trends – it underpins them all. How many of us think digitalisation will save paper? Maybe not for universities or government, but young people are shifting to digital. But there are major energy implications of that: we are using a lot of power and heat to digitise our society. This takes us through some of those other areas… Can you imagine social networking when we have direct neural interfaces?

This brings me to the Hype curve – where we see a great deal of excitement, the trough of disillusionment, and through to where the real work is. Gartner creates a hype cycle graph every year to illustrate technological trends. At the moment we can pick out areas like augmented reality, virtual reality, digital currency. When you look at business impact… Well, I thought that the areas that seem to be showing real change include the Internet of Things – in modern factories you see very few people now, they are just there for packaging, as we have sensors and devices everywhere. We have privacy-enhancing technologies, blockchain, brain-computer interfaces, and virtual assistants. So we have technologies which are being genuinely disruptive.

Trends-wise we also see political focus here. Why is digital a key focus in the European Union? Well, we have captured only a small percentage of the potential. And when we look across the Digital Economy and Society Index we see this is about skills, about high quality public services – a real priority in Lithuania at the moment – not just about digitalisation for its own sake. Now, a few days ago the US press laughed at Jean-Claude Juncker admitting he still doesn't have a smartphone but, at the same time, he and others leading the EU see that the future is digital.

Some months back I was asked at a training session: “Close your eyes. You are now in 2050. What do you see?”. When I thought about that my view was rather dystopic, rather “Big Brother is watching you”, rather hierarchical. And then we were asked to throw out those ideas and focus instead on what can be done. In the Cimulact EU project we have been looking at citizens’ visions to inform a future EU research and innovation agenda. In general I noted that among people from older European countries there was more optimism about green technologies, technology enabling societies… Whilst people from Eastern European countries tended to be more concerned with the technologies themselves, and with issues of safety and privacy. And we’ve been bringing these ideas together. For me the vision is technology in the service of people, enabling citizens, creating systems for green and smart city development, and personal freedom and responsibility. What unites all of these scenarios? The information was gathered offline. People wanted security, privacy, communication… They didn’t want the technologies per se.

Challenges here? I think that privacy and security are key for social media, along with the focus on the right tool, for the right audience, at the right time. If we listen to Tim Berners-Lee we note that the web is developing in a way divergent from the original vision. Lorrie Faith Cranor, of Carnegie Mellon University, notes that privacy is possible in laboratory conditions but, in the real world, it is hard to actually achieve. That’s why people such as Aral Balkan, self-styled Cyborg Rights Activist, are working here – he has founded a cross-Europe party focusing just on privacy issues. He says that the business model of mainstream technology under “surveillance capitalism” is “people farming”, and that it is toxic to human rights and democracy. And he is trying to bring those issues into more prominence.

Another challenge is engagement. The use of, and time spent on, social media is increasing every year. But what does that mean? Mark Schaefer, Director of Schaefer Marketing Solutions, describes this as “content shock” – we don’t have the capacity to deal with and consume the amount of content we are now encountering. Jay Baer just wrote the book “Hug Your Haters”, making the differentiation between “offstage haters” and “onstage haters”. Offstage haters tend to be older, offline, and only go public if you do not respond. Onstage haters post to every social media network without thinking about the consequences. So his book is about how to respond to, and deal with, many forms of hate on the internet. And one of the companies he recently consulted for has 150 people working to respond to that sort of “onstage” hate.

And then we have the issue of trolling. In Lithuania we have a government trying to limit alcohol consumption – you can just imagine how many people were being supported by alcohol companies to comment and post and respond to that.

We also need to think about engagement in something valuable. Here I wanted to highlight three initiatives – two are quite mature, the third is quite new. The first is “My Government”, or E-citizens. This is about engaging citizens and asking them what they think – they post a question, and it provides a (simple) space for discussion. The one that I engaged with only had four respondents but it was really well done. Lithuania 2.0 was looking at ways to generate creative solutions at government level. That project ended up with a lot of nice features… Every time we took it out, they wanted new features… People engaged but then dropped off… What was being contributed didn’t seem to feed directly enough into government, and there was a need to feed back to commenters what had happened as a result of their posts. So, we have reviewed this work and are designing a new way to do this which will be more focused around single topics or questions over a contained period of time, with direct routes to feed that into government.

And I wanted to talk about the right tools for the right audiences. I have a personal story here to do with the idea of whether you really need to be in every network. Colleagues asked why I was not on Twitter… There was lots of discussion, but only 2 people in the audience were using Twitter… So these people were trying to use a tool they didn’t understand to reach people who were not using those tools.

Thinking about different types of tools… You might know that last week in Vilnius we had huge rainfall and a flood… Here we have people sharing open data that allows us to track and understand that sort of local emergency.

And there is the issue of how to give users personalised tools, give opportunity for different opinions – going beyond your filter bubble – and earn profit. My favourite tool was called Personal Journal – it had just the right combination – until it was bought by Flipboard. Algorithmic tailoring can do this well, but there is that need to make it work, to expose users to wider views. There is a social responsibility aspect here.

So, the future seems to look like decentralisation – including safe silos that can connect to each other – and the right tools for the right audience. On decentralisation, blockchain, or technologies like it, are looking important. And we are starting to see possible use of that in universities for credentialing. We can also talk about other uses of decentralisation like this.

We will also see new forms of engagement going mass market. Observe the “digital natives” who really don’t want to work in a factory… See those people going to get a coffee, needing money… So putting on their visor/glasses and managing a team in a factory somewhere – maybe Australia – only until that money is earned. We will also see better artificial intelligence working on the side of the end users.

The future is ours – we define now, what will happen!

Q&A

Q1) I was wondering what you mean by Blockchain, I haven’t heard it before.

A1) It’s quite complicated to explain… I suggest you Google it – some lovely explanations out there. We have a distributed

Q2) You spoke about the green issues around digitalisation, and I know blockchain comes with serious environmental challenges – how do we manage that trade-off between environmental impact and technological convenience?

A2) My wife and I have really different views of green… She thinks we should go back to the yurt and the plants. I think differently… I think yes, we consume more… But we have to find the spots where we consume lots of energy and use technology to make them more sustainable. Last week I was at the LEGO factory in Denmark and they are working on how to make that sustainable… But that is challenging, as their clients want trusted, robust, long-lasting materials. There are already some technologies, but we have to see how that will happen.

Q3) How do you see the role of artificial intelligence in privacy? Do you see it as a smart agent and intermediary between consumers and marketers?

A3) I am afraid of a future like the one Elon Musk describes, where artificial intelligence takes over. But what AI can do is help us interpret data for our decisions. And it can interpret patterns, filter information, help us make the best use of information. At the same time there is always a tension between advertisers and those who want to block advertisers. In Lithuanian media we see pop-ups requesting that we switch off ad blocking tools… At the same time we will see more ad blockers… So Google, Amazon, Facebook… they will use AI to target us better in different ways. I remember hearing from someone that you will always have advertising – but you’ll like it, as it will be tailored to your preferences.

Q4) Coming from a background of political sciences and public administration… You were talking about decentralisation… Wouldn’t it be useful to differentiate between developed and developing world, or countries in transition… In some of those contexts decentralisation can mean a lack of responsibility and accountability…

A4) We see real gaps already between cities and rural communities – increasingly cities are their own power and culture, with a lot of decisions taken like mini states. You talked about a possible scenario that is quite 1984-like, of centralisation for order. But personally I still believe in decentralisation. There is a need for responsibility and accountability, but you have more potential for human rights and…

Aelita Skaržauskienė: Thank you to Edgaras! I actually just spent a whole weekend reading about blockchain as here in Lithuania we are becoming a hub for FinTech – financial innovation start-ups.

So, I just wanted to introduce today. Social media is very important for my department – more than 33 researchers here look at social technologies. Social media is rising in popularity, but more growth lies ahead. More than 85% of internet users are engaging with social media BUT over 5 billion people in the world still lack regular access to the internet, so that number will increase. There have already been so many new collaborations made possible for and by social media.

Thank you so much for your attention in this exciting and challenging research topic!

Stream B: Mini track on Social Media in Education (Chair: Nicola Osborne and Stefania Manca)

As I’m chairing this session (as Stefania is presenting), my notes do not include Q&A I’m afraid. But you can be confident that interesting questions were asked and answered!

The use of on-line media at a Distance Education University – Martins Nico, University of South Africa, Pretoria, South Africa

The University of South Africa is an online-only university, so I will be talking about research we have been doing on the use of Twitter, WhatsApp, Messenger, Skype and Facebook by students. A number of researchers have also explored obstacles experienced with social media; some identified obstacles will be discussed.

In terms of professional teaching dispositions, these are the principles, commitments, values and professional ethics that influence the attitudes and behaviours of educators. I called on my background in organisational psychology and measuring instruments to explore different ideas of presence: virtual/technological; pedagogical; expert/cognitive; social. And these sit on a scale from behaviours that are easily changed to those that are difficult to change. I want to focus on the difficult-to-change area of incorporating technologies significantly into practice – the virtual/technological presence area.

Now, about our university… We have 350k students and +/- 100k non-formal students – African and international students from 130 countries. We are a distance education university. 60% are between 25 and 39, and 63.9% are female. At Unisa we think about “blended” learning, from posting materials (snail mail) through to online presence. In our open online distance learning context we are using tools including WhatsApp, BBM, Mxit, WeChat, ResearchGate, Facebook, LinkedIn, the intranet, Google Drive, wiki spaces, multimedia, etc. We use a huge range, but it is up to the lecturer exactly which of these they use. For all the modules online you can view course materials, video clips, articles, etc. For the module that I’m showing here, you have to work online, you can’t work offline – it’s a digital course.

So, the aim of our research was to understand how effectively the various teaching dispositions are using the available online media, and to what extent there is a relationship between disposition and technology used. Most respondents we had (40.5%) had 1 to 3 years of service. Most respondents (45.1%) were Baby Boomers. Most were female (61%), most respondents were lecturers and senior lecturers.

Looking at the results, the most used was WhatsApp, with instant messaging and social networking high. Microblogging and digital curation were amongst the least used.

Now, when we compare that to the dispositions, we see an interesting correlation between social presence dispositions and instant messaging, and between virtual presence dispositions and research networking and cloud computing… The most significant relationships were between virtual presence and online tools. There was no significant correlation between pedagogical presence and any particular tools.

I just wanted to talk about the generations at play here: Baby Boomers, Gen X-ers, and Millennials. Looking at the ANOVA analysis for generations and gender, only for instant messaging and social networking was there any significant result. In both cases millennials use these most. In terms of gender we see females using social networking and instant messaging more than males. The results show that the younger generation, millennials, and females use the two online media significantly more than other groups – for our university that has an implication: we must ensure our staff understand the spaces our students use.
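For anyone unfamiliar with the method mentioned here: a one-way ANOVA tests whether mean usage differs across groups (e.g. the three generations). A minimal sketch of that kind of test in Python – the numbers are invented for illustration, not the study’s data:

```python
from scipy.stats import f_oneway

# Hypothetical instant-messaging usage scores for three generational groups
# (illustrative values only - not Unisa's data).
boomers     = [2, 3, 2, 4, 3, 2]
gen_x       = [3, 4, 4, 3, 5, 4]
millennials = [5, 4, 6, 5, 6, 5]

# One-way ANOVA: does mean usage differ significantly across the groups?
f_stat, p_value = f_oneway(boomers, gen_x, millennials)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 suggests a real difference
```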

The results confirmed that millennials are most inclined to use instant messaging and social networking. Females were using these the most.

So, my recommendation? To increase usage of online tools, the university will need to train academics in the usage of the various online tools; to arrange workshops on new technology, social media and mobile learning; to advise and guide academics to increase web self-efficacy and compensate accordingly; and to determine the needs and preferences of students pertaining to the use of social media in an ODL environment, and focus…

Towards a Multilevel Framework for Analysing Academic Social Network Sites: A Network Socio-Technical Perspective – Manca Stefania, National Research Council of Italy and Juliana Elisa Raffaghelli, University of Florence, Italy

I work in the field of learning, distance education, distance learning, social media and social networking. I’m going to share with you some work I am doing with Juliana Elisa Raffaghelli on the use of social networking sites for academic purposes. I know there are lots of different definitions here; today I’m talking about the use of social media sites for scholarly communication. As we all know there are many different ways to communicate our work, including academic publications and conferences like this, but we have also seen a real increase in the use of social media for scholarly communication. And we have seen Academia.edu and ResearchGate in widest use of these, but others are out there.

The aim of my study was to investigate these kinds of sites, not only in terms of adoption and uptake, but also what kinds of actions people take on these sites. The study is a theoretical piece of work taking a socio-technical perspective. But before I talk more about this I wanted to define some of the terms and context here.

Digital Scholarship is the use of digital evidence, methods of inquiry, research, publication and preservation to achieve scholarly and research goals, and can encompass both scholarly communication using digital media and research on digital media. Martin Weller, one of the first to explore this area, describes digital scholarship as shorthand for an intersection of technology-related developments, namely: digital content; networked distribution; open practices – and the potentially transformational quality of that intersection.

A more recent update, by Greenhow and Gleason (2014), defines Social Scholarship as the means by which social media affordances and potential values evolve the ways scholarship is done in academia. And Veletsianos and Kimmons (2012) have talked about Networked Participatory Scholarship as a new form of scholarship arising from these new uses of technology and new types of practice.

There are lots of concerns and tensions here that have been raised… The blurring boundaries of personal and professional identities. The challenge of unreliable information online. Many say that ResearchGate and Academia.edu have a huge number of fake profiles, and that not all of what is there can be considered reliable. There is also a perception that these sites may not be useful – a social factor. There is the challenge of time to curate different sites. And in the traditional idea of “publish or perish” there has been some concern over these sites.

The premise of this study is to look at popular academic sites like ResearchGate, like Academia.edu. Although these sites are increasingly transforming scholarly communication and academic identity, there is a need to understand these at a socio technical level, which is where this study comes in. Academic social network sites are networked socio-technical systems. These systems are determined by social forces and technological features. Design, implementation and use of such technologies sit in a wider cultural and social context (Hudson and Wolf 2003?).

I wanted to define these sites through a multilevel framework, with a socio-economic layer (ownership, governance, business model); a techno-cultural layer (technology, user/usage, content); and a networked-scholar layer (networking, knowledge sharing, identity). The first two layers come from a popular study of social networking usage, but we added the third level to capture those scholarly qualities. The first two levels refer to the structure and wider context.

We also wanted to bring in social capital theory/ies, encompassing the capacity of social networks to produce goods for mutual benefit (Bourdieu, 1986). This can take the form of useful information, personal relationships or group networks (Putnam, 2000). We took this approach because the scholarly community can be viewed as a knowledge sharing entity formed by trust, recognition, etc. I will move past an overview of social capital types here, and move to my conclusion…

This positions academic social network sites as networked socio-technical systems that afford social capital among scholars… And here we see structural and distributed scholarly capital.

So to finish, a specific example: ResearchGate. The site was founded in 2008 by two physicists and a computer scientist. It has more than 12 million members distributed worldwide across 193 countries. The majority of members (60%) belong to scientific subject areas, and it is intended to open up science and enable new work and collaboration.

When we look at ResearchGate from the perspective of the socio-economic layer…. Ownership is for-profit. Governance is largely through terms and conditions. The business model is largely based on a wide range of free-of-charge services, with some subscription aspects.

From the techno-cultural layer… Technology automatically signals who one may be interested in connecting with, provides news feeds, prompts endorsements, suggests new researchers to follow. Usage can be passive, or users can be active participants making new connections. And content – it affords publication of diverse types of science outputs.

From the networked scholar layer: networking – follow and recommend; knowledge sharing – commenting, the questions feature, the search function, existing Q&As, expertise and skills; and identity – through profile, score, reach and h-index.

On Linking Social Media, Learning Styles, and Augmented Reality in Education – Kurilovas Eugenijus, Julija Kurilova and Viktorija Dvareckiene, Vilnius University Institute of Mathematics and Informatics, Lithuania

Eugenijus: So, why augmented reality? Well, according to predictions it will be the main environment for education by 2020 and we need to think about linking it to students on the one hand, and to academia on the other. So, the aim of this work is to present an original method to identify students preferring to actively engage in social media and wanting to use augmented reality, and to relate this to learning styles.

Looking over the literature we see tremendous development of social media, powered by innovative web technologies, web 2.0 and social networks. But there are so many different approaches here, and every student is different. The possibilities of AR seem almost endless. And the literature suggests AR may be more effective than traditional methods. Only one meta-analysis directly addresses personalisation of AR-based systems/environments in education. The learning styles element of this work is about the differences in student needs; existing work is not specifically focused on this.

Another aspect of AR can be cognitive overload from the information, the technological devices, and the tasks they need to undertake. Few studies seem to look at pedagogy of AR, rather than tests of AR.

So, our method… All learning processes, activities and scenarios should be personalised to student learning styles. We used a simple and convenient expert evaluation method based on the application of trapezoidal fuzzy numbers, looking at suitability of use in e-learning. The questions given to the experts focus on the suitability of the learning activities of social media and AR in learning. After that, details explaining the Felder-Silverman learning styles model (4 dimensions included) were provided for the experts.

After the experts completed the questionnaire it’s easy to calculate the average values of suitability of the learning styles and learning activities for AR and social media. So we can now easily compute the averages for learning styles… Every student could come in and answer a learning styles questionnaire and get their own table – their personal, individual learning styles. Then, combining that score with the expert ratings of AR and social media, we can calculate suitability indexes for all learning styles of particular students. The programme does this in, say, 20 seconds…

So, we asked 9 experts to share their opinion on particular learning styles… Here the experts see social media and AR as particularly suitable for visual and activist learning styles. We think that these suitability indexes should be included in recommender systems – the main thing in a personalised learning system – and should be linked to particular students according to their suitability index. The higher the suitability index, the better the learning components fit a particular student’s needs.
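As I understood the method, the suitability index essentially weights the experts’ averaged ratings by the student’s own learning style profile, so a strongly visual/activist student scores AR and social media activities highly. A rough Python sketch of how such a calculation might look – the weighting scheme here is my assumption, not the authors’ exact fuzzy-number formula:

```python
# Expert-averaged suitability ratings (0-1) of AR/social media activities
# for each Felder-Silverman style pole - illustrative values only.
expert_ratings = {
    "visual": 0.9, "verbal": 0.2,
    "active": 0.8, "reflective": 0.3,
}

# One student's learning style profile (0-1) from the questionnaire.
student_profile = {
    "visual": 0.7, "verbal": 0.3,
    "active": 0.6, "reflective": 0.4,
}

def suitability_index(profile, ratings):
    """Average the expert ratings, weighted by the student's style strengths."""
    total_weight = sum(profile.values())
    return sum(profile[style] * ratings[style] for style in profile) / total_weight

# Higher index -> AR/social media activities fit this student better.
print(f"Suitability index: {suitability_index(student_profile, expert_ratings):.2f}")
```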

So, expert evaluation, linking learning activities and students by suitability index and recommender system are main intelligent technologies applied to personalise learning. An optimal learning scenario would make use of this to personalise learning. And as already noted Augmented Reality and social media are most suitable for visual and activist learners; most unsuitable for verbal and reflective learners… And that will be reflected in student happiness and outcomes. Visual and activist learners prefer to actively use learning scenarios based on application of AR and social media.

According to Felder and Silverman most people of college age and older are visual. Visual learners remember best what they see rather than what they hear. Visual learners are better able to remember images rather than verbal or text information. For visual learners the optimal learning scenario should include a range of visual materials.

Active learners do not learn much in situations that require them to be passive. They feel more comfortable with or better at active experimentation than reflective observation. For active learners the optimal scenario should include doing something that relates to the wider outside world.

And some conclusions… Learning styles show how this can best be tailored to learners. The influence of visual and social media has shifted student expectations, but many teaching organisations are still quite traditional…

We now have a short break for lunch. This afternoon my notes will be sparse – I’ll be presenting in the Education Mini Track and then, shortly after, in the Social Media Excellence Awards strand. Normal service will be resumed after this afternoon’s coffee break. 

Stream B: Mini track on Social Media in Education (Chair: Nicola Osborne and Stefania Manca)

Digital Badges on Education: Past, Present and Future – Araujo Inês, Carlos Santos, Luís Pedro, and João Batista, Aveiro University, Portugal

I’ve come a little late into Ines’ talk but she is taken us through the history of badges as a certification, including from Roman times. 

This was used like an honour, but also as a punishment, with badges and tattoos used to classify that experience. A pilgrim going to Santiago de Compostela(?) had a badge, but there was a huge range of fake badges out there – the Pope eventually required you to come to Rome to get your badges. We also have badges like martial arts belts, badges for scouts… So… Badges have baggage.

With the beginning of the internet we started the beginnings of digital badges, as a way to recognise achievements and to recognise professional achievements. So, we have the person who receives the badge, the person/organisation who issues the badge, and the place where the badge can be displayed. And we have incentives to collect and share badges associated with various cities across the world.

Many platforms have badges. We have Open Badges infrastructures (Credly, BadgeOS, etc.) and we have places to display and share badges. In educational platforms we also have support for badges, including Moodle, Edmodo, Makewaves.es, SAPO Campus (at our speaker’s home institution), etc. But in our VLE we didn’t see badges being used as we expected, so we tried to look at how badges are being used worldwide (see badgetheworld.org)…

How are badges being used? Authority; award and motivation; sequential orientation – gain one, then the other…; research; recognition; identity; evidence of achievement; credentialing. The biggest use was around two major areas: motivation (for students, but also teachers and others) and credentialing. In fact some 10% of digital badges are used to motivate and reward, and to recognise the skills of, teachers. However the major use is with students, and that is split across award, credentialing, and evidence of achievement.

So, our final recommendation was for the integration of badges in education: choose a platform; show the advantage of using a repository (e.g. a backpack for digital badges); choose the type of badge – mission type and/or award type; and enjoy it.

Based on this information we began a MOOC, “Badges: how to use it” – and you can see a poster on the MOOC. This was based on the investigation we did for this work.

Q&A

Q1) Have you had some feedback, or collected some information, on students’ interest in badges… How do they react, or care about getting those badges?

A1) Open Badges are not really known to everyone in Portugal. The first task I had was to explain them, and what the advantages were. Teachers like the idea… They feel that it is very important for their students and have tried it with their students. Most of the experiments show students enjoying the badges… But I’m not sure that they understand that they can use them again, showing them on social media, in the community… That is a task still to do. The first experiences I know about are from the teachers who were in the MOOC – they enjoyed it, they liked it, they asked for more badges.

Q2) I know about the concept here… Any issues with dual ways to assess students – grades and badges?

A2) Teachers can use them with grading, in parallel. Or, if they use them in sequence, students understand how to achieve that grade. The teacher has to decide how best to use them… whether to use them for assessment or to motivate towards a better grade.

Q3) Thank you! I’m co-ordinating an EU open badge project so I’d like to invite you to publish. Is the MOOC only in Portuguese? My students are designing interactive modules – CC licensed – with best practice guidance. Maybe we can translate and reuse?

A3) It’s only in Portuguese at the moment. We have about 120 people engaged in the MOOC and it runs on SAPO Campus. They are working on a system of badges that can be used across all institutions so that teachers can share badges, a repository to choose from and use in their own teaching.

Comment) Some of that unification really useful for having a shared understanding of meaning and usage of badges.

Yes, but from what I could see teachers were not using badges because they hadn’t really seen examples of how to use them. And they get a badge at the end of the course!

Q4) What is the difference between digital badges and open badges?

A4) Open Badges is a specific standard designed by Mozilla. Digital badges can be created by everyone.

Comment) At my institution the badges are about transferable skills… They have to meet unit learning outcomes, graduate learning outcomes. Students can get prior learning certified through them as well, to reduce taught classes for masters students. But that requires solid infrastructure.

We have infrastructure to issue badges: someone can make and create a badge, and issue it to a person. The badge has metadata – where it was issued, why, by whom… And then it is made available in a repository, e.g. Mozilla Backpack.
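For the curious: the Open Badges standard makes that metadata concrete. An issued badge (an “assertion”) is a small JSON document linking the recipient, the badge class (name, criteria, issuer) and verification details, which a backpack can then fetch and check. A simplified example, written here as a Python dict, with invented values:

```python
# Simplified Open Badges 2.0 assertion - illustrative values only.
assertion = {
    "@context": "https://w3id.org/openbadges/v2",
    "type": "Assertion",
    "id": "https://example.org/assertions/123",      # where this assertion is hosted
    "recipient": {
        "type": "email",
        "identity": "learner@example.org",
        "hashed": False,
    },
    "badge": "https://example.org/badges/teamwork",  # URL of the BadgeClass, which
                                                     # holds name, criteria and issuer
    "issuedOn": "2017-07-03T10:00:00Z",
    "verification": {"type": "hosted"},              # verified by fetching the id URL
}
```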

Exploring Risk, Privacy and the Impact of Social Media Usage with Undergraduates – Connelly Louise and Nicola Osborne, University of Edinburgh, UK

Thanks to all who came along! Find our abstract and (shortly after today) our preprint here.

And I’ve now moved on to the Best Practice Awards strand where I’ll be presenting shortly… I’ve come in to the questions for Lisa Lundgren (and J. Crippen Kent)’s presentation on using social media to develop social paleontology. From the questions I think I missed hearing about a really interesting project. 

EDINA Digital Footprint Consultancy & Training Service – Osborne Nicola, University of Edinburgh, UK 

Well, that was me. No notes here, but case study will be available soon. 

D-Move – Petrovic Otto, University of Graz, Austria

This is a method and software environment to anticipate “digital natives’” acceptance of technology innovations, looking particularly at how the academic sector can have long-term impact on the private sector. And our students are digital natives – that’s important. So, to introduce myself: I’m professor of information systems at the University of Graz, Austria. I have had a number of international roles and a strong role in bridging the connection between academia and government – I am a member of the regulatory authority for telecommunications for Austria. And I have started three companies.

So, what is the challenge? In 2020 more than half of all the people living in our world will have been born and raised with digital media and the internet – they are digital natives. And they are quite different regarding their values and norms, behaviours and attitudes – consider the big changes in industries like media, commerce, banking, transport or travel. They have more and more aversion to traditional surveys based on “imagine a situation where you use a technology like…”. Meanwhile, surveys designed, executed and interpreted by traditional “experts” will result in traditional views – the real experts are the digital natives. The results should be gained through digital natives’ lives…

So the solution? It is an implemented method, based on the Delphi approach. Digital Natives are used as experts in a multi-round, structured group communication process. In each round they collect their own impressions regarding the Delphi issue. So, for instance, we have digital natives engaging in self-monitoring of their activities.

So, we recruited 4 groups of 5 digital natives; round one discussion as well as interviews with 130 digital natives; field experience embedded in everyday life; discussion; and analysis. We want to be part of the daily life of the digital native, but a big monolithic space won’t work – things change, and different groups use different spaces. We need social media and we need other types of interfaces… We don’t know them today. We have a data capturing layer for pictures, video, annotations. We also need data storage, data presentation and sharing, data tagging and organisation, access control and privacy, private spaces and personalisation… And access control is crucial, as individuals want to keep their data private until they want to share it (if at all).

D-Move gives insights into changes in digital natives’ views, experiences, self-monitoring, etc., and helps in understanding “why” digital natives behave as they do. Participants show high satisfaction with D-Move as a space for learning. D-Move has been implemented and used in different industries for many years – in media, transport and logistics, the travel industry, health and fitness. It started with messaging-based social media, went on to social media platforms, and finally implemented social internet of things technologies. And we are currently working with one of the most prestigious hotels – with a customer base typically in their seventies – using D-Move to better understand the luxury sector and what parts of technology they need to engage with. D-Move is part of digital natives’ “natural” communication behaviour, and an ongoing cycle of scientific evaluation and further technical development.

In terms of next steps: firstly, the conceptual models will be applied to the whole process to better understand digital natives’ thinking, feeling and behaviour; then using different front ends focused on internet of things technologies; and offering D-Move to different industries, which can book particular issues, rather like an omnibus survey. And D-Move is both a research environment and a teaching environment – we have two streams going in the same direction, including as a teaching instrument.

Q&A

Q1) Your digital native participants, how do you recruit them?

A1) It depends on the age group – it ranges from age 10 to nearer age 30. For our university we can reach 20-25 year olds; for 10 to 20 year olds we work with schools. 25 to 30 year olds are harder to recruit.

Q2) What about ethical issues? How do you get informed consent from 10 to 18 year olds?

A2) These issues are usually based on real issues in life, and this is why security and privacy are very important. And we have sophisticated ways of indicating what is and is not OK to share. This is partly through storing data in our own storage – it is not a public system, the data is not accessible to others.

Q3) We’ve seen a few presentations on using data from participants. According to the POPI Act (based on EU GDPR, you can’t use data without consent… How do you get around that?

A3) It’s easier because it is not a public system, and we do not relate information in publications, only at an aggregated level.

At this point I feel it is important to note my usual “digital native” caveat: I don’t agree with the speaker on this term (or the generalisations around it), which has been disputed widely in the literature, including by Marc Prensky, its originator.

The Traditions Challenge mobile App – Peruta Adam, Syracuse University, New York, USA

I’ve been looking at how colleges and universities have been using social media in student recruitment, alumni engagement etc. And it has been getting harder and harder to get access to social media data over the years, so I decided to design my own thing.

So, think back to your first days of university. You probably had a lot of concerns. For instance, Ithaca College is in a town less than 7 miles wide and there isn’t a big sports programme – it is hard to build community. So… The Traditions Challenge is a mobile app to foster engagement and community building for incoming university students – it works as a sort of bucket list of things to do and engage with. It launched at Ithaca in August 2016 with over 100 challenges. For instance FYRE, which already encourages engagement, is a challenge here. Faculty office hours are their own challenge – a way to get students to find out about these. And at the fountains – a notable feature on campus – you can have your image taken. And we encourage them to explore the town, for instance engaging with the farmers market.

So there is a list of challenges, and there is also a feed to see what else is happening on campus. And there is information on the school. And this is all gamified: challenges earn points, and there is a leaderboard which gives students status. And there are some actual real-world prizes – stickers, a nice sweatshirt, etc. And this is all designed to get students more engaged, and engaged earlier on at university. There is a lot of academic research showing that students who are more involved and engaged are more likely to stay at their university.

Traditions are very important at US universities – we have over 4,000 institutions. And those traditions translate into a real sense of identity for students. There are materials on traditions – keepsake books for ticket stubs, images, etc. – but these are not digital. Those are nice but there is no way to track what is going on (plus, who takes pictures?). And in fact Ithaca tried that approach on campus – a pack, whiteboards, etc. But this year, with the app, there are many more data that can be quantified. This year we had around 200 sign-ups (4% of on-campus students). We didn’t roll out to everyone, but picked influencers and told them to invite friends, then them to invite their friends, etc. And those 200 sign-ups did over 1400 challenges, and 44 checked in for prizes. Out of the top ten challenges, 70% were off-campus, and 100% were non-academic experiences. There is a sense of students being most successful when they are involved in a lot of things, and have more activities going on. It is hard to compare the analogue with the app, but we know that at least 44 students checked in for prizes with the app, versus 8 checking in when we ran the analogue challenges.

In terms of students responding to the challenges, they enjoyed the combination of academic and non-academic activities. One student, who’d been enrolled for 3 years, found out through the app about events on campus that he had never heard of before. Some really responded to the game, to the competition. Others just enjoyed the checklist, and a way to gather memories. Some just really want the prize! (Others were a lot less excited.) Maybe more prizes could also help – we are trying that.

In terms of app design and UX: this cohort hugely cares about the wording of things, the look of things… Their expectations are really, really high.

In terms of identity students reported feeling a real sense of connection to Ithaca – but it’s early days, we need some longitudinal data here.

We found that the digital experience is preferred. Mobile development is expensive and time consuming – I had an idea, tried to build a prototype, and applied for a grant to hire a designer, but everyone going down this path has to understand that you need developers, designers, and marketing staff at the university to be involved. And like I said, the expectations were really high. We ran workshops before making anything to make sure we understood that expectation.

I would also note that universities in the US are really getting protective of their brand, the use of logos, fonts etc. They really trusted me but it took several goes to get a logo we were all happy with.

And finally, data from the app and from follow-up work show that students really want to augment their experience with on-campus activities, off-campus activities… And active and involved students seem to lead to active and involved alumni – that would be great data to track. And that old book approach was lovely, as tangible things are good – but it’s easy to automate some printing from the app…

So, what’s happening now? Students are starting, they will see posters and postcards, they will see targeted Facebook ads.

I think that this is a good example of how a digital experience can connect with a really tangible experience.

And finally, I’m from Suracuse University, and I’d like to thank Ithaca College, and NEAT for their support.

Q&A

Q1) What is the quality of contribution like here?

A1) It looks quite a lot like an Instagram update – a photo, text, tagging; you can edit it later.

Q2) And can you share to other social media?

A2) Yes, they can share to Facebook and Twitter.

Q3) I wanted to ask about the ethics of what happens when students take images of each other?

A3) Like other types of social media, that’s a social issue. But there is a way to flag images and admins can remove content as required.

Q4) Most of your data is from female participants?

A4) Yes, about 70% of people who took part were female participants.

Q5) How did you recruit users for your focus groups?

A5) We recruited our heaviest app users… We emailed them to invite them along. The other thing I wanted to note is that it wasn’t me, or colleagues, running the focus groups – it was student facilitators, to make this peer-to-peer.

Q6) How reliable is the feedback? Aren’t they going to be easy to please here?

A6) Sure, they will be eager to please so there may be some bias. I will eventually be doing some research on these data points.

Q7) Any plans to expand to other universities?

A7) Yes, would love to compare the three different types of US universities in particular.

Q8) Is the app free to students?

A8) Yes, I suspect if I was to monetize this it would be for the university – a license type set up.

Mini track on Social Media in Education – Chair: Nicola Osborne and Stefania Manca

Evaluation of e-learning via Social Networking Website by full-time Students in Russia – Pivovarov Ivan, RANEPA, Russia

Why did I look at this area? Well the Russian Government is presently struggling with poor education service delivery. There is great variety in the efficiency and quality of higher education. So, the Russian Government is looking for ways to make significant improvements. And, in my opinion, social media can be effective in full time teaching. And that’s what my research was looking at.

So, I wanted to determine the best techniques of delivery of e-learning via social networking websites. I was looking at vk.com rather than Facebook. VK is by far the biggest social media in Russia. The second biggest is Instagram. There is strong competition there.

So I was looking at the views of students about educational usage of VK, targeting bachelor students from the Russian Presidential Academy of National Economy and Public Administration – an atypical institution focused specifically on public administration. A special interest group was created on VK and educational content was regularly uploaded there. We had hundreds of people in this group – hoping for 1000 in future. Material included assignments, educational contests, etc. And finally, after six months of using this space, I decided to make a questionnaire and ask my students what they liked, what they didn’t like, what they disliked the most, etc., and we had 100 responses. Age-wise, 82% were between 18 and 21 years old; 12% were 21-24 years old; 6% were older than 24. This shows that users of social media are typically young – when they move on in life, have families, etc., they tend not to use social media. We also asked about Facebook: 53% had a Facebook account, 47% did not.

We asked what the advantages are of VK over Facebook. 52% said most of their friends were on VK; 13% said that VK had a more user-friendly interface than Facebook; 29% said VK has a more interesting background – sharing of music, films, etc. – than Facebook. Looking at usage of VK for educational purposes: 35% use it weekly; 31% very seldom; 14% 2-3 times a week; 10% daily. Usage is generally heavier on weekdays, and drops at the weekend.

So, what motivated people to be a member of the special interest group on a social media website? Most (53%) said the ease of access to information; 31% the dissemination of information; 4% the chance of interaction. And when asked what they wanted to improve, most (53%) wanted increased teacher-student interaction – more teachers joining them on social media.

Students mostly preferred posts from teachers that were about administration of the unit (28%) and content (28%). When asked if they wanted to watch video lectures, 85% said yes. One year after this work I started to record video lectures – short (5-10 mins) – which become available prior to a lecture. Students can then find new definitions, new terms, etc., and in the lecture we follow up and go into details – we can go straight into discussion. So this response inspired me to create this video content.

I also asked if students had taken an online class before: 52% had, 48% hadn’t. I asked students how they liked interaction on social media – 86% of students found it positive (but I only asked them after they’d been assessed, to avoid too much bias in the results).

Conclusions here… Well, I wanted to compare Russia to other contexts. Students in Russia wanted more teacher-student interactions. “Comments must be encouraged” was not present in our experiment, but was in research in Turkey…

Q&A

Q1) Is there an equivalent to YouTube in Russia?

A1) Yes, YouTube is big. There is an alternative called RuTube – maybe more the Russian Vimeo. There’s no Russian Twitter – Telegram is nearest. And there’s no Russian analogue to Snapchat, but that niche is pushed away by Instagram Stories now, I think. WhatsApp is very popular, but I don’t see the educational potential there. This semester I had students make online broadcasts of my lectures… with Instagram Stories… VK does try to copy features from other worldwide spaces – they have stories. But Instagram is most popular.

Q2) Among the takeaways is the need for more intense interaction between students and teaching staff. Are your teaching staff motivated to do this? I do this as a “hobby” in my institution – is it formalised in your school? And you also spoke about the strength of VK versus Facebook – you noted that people using VK drives traffic… So where do you see opportunities for new platforms in Russia?

A2) On your second question: that’s hard to predict. Two or three years ago it was hard to predict Instagram Stories or Snapchat. But I guess probably social media associated with sport…

Q2) Potential won’t be hampered by attitudes in the population to steer toward what they know.

A2) I don’t think so… On the time usage front I think my peers probably share your concerns about time and engagement.

Comment) It depends on how it develops… We have a minimum standard: in our LMS there is a widget, and staff have to make videos per semester for it – that’s now minimum practice. Although in the long run teaching isn’t really rewarded – it’s research that is typically rewarded… Do you have to answer to a manager on this, in terms of restrictions on trying things out?

A2) No, I am lucky, I am free to experiment. I have a big freedom I think.

Q3) Do you feel uncomfortable being in a social space with your students… To be appropriate in your profile picture… What is your dynamic?

A3) All my photos are clean anyway! Sports, conferences… But yes, as a University teacher you have to be sensible. You have to be careful with images etc… But still…

Comment) But that’s something people struggle with – whether to have one account or several…

A3) I’m a very public person… Open to everyone… So no embaressing photos! On LMS, my university has announced that we will have a new learning management system. But there is a a question of whether students will like that or engage with that. There is a Clayton Christenson concept of disruptive innovation. This tool wasn’t designed for education, but it can be… Will an LMS be comfortable for students to use though?

Comment) Our university is almost post-LMS… So maybe if you don’t have one already, you could jump somewhere else, to a web 2.0 delivery system…

A3) The system will be run and tested in Moscow, and then rolled out to the regions…

Q4) You ran this course for your students at your institution, but was the group open to others? And how does that work in terms of payments if some are students, some are not?

A4) Everyone can join the group. And when they finish, they don’t escape from the group – they stay, they engage, they like, etc. Not everyone, but some, including graduates. So the group is open and everyone can join it.

Developing Social Media Skills for Professional Online Reputation of Migrant Job-Seekers – Buchem Ilona, Beuth University of Applied Sciences Berlin, Germany

We have 12,800 students, many of whom have a migrant background, although the work I will present isn’t actually for our students – it’s for migrants seeking work.

Cue a short video on what it means to be a migrant moving across the world in search of a brighter future and a safe place to call home, noting the significant rise in migration, often because of conflict and uncertainty. 

That was a United Nations video about refugees. Germany has accepted a huge number of refugees – over 1.2 million in 2015-2016. And, because of that, we have and need quite a complex structure of programmes and support for migrants making their home here. At the same time Germany has shortages of skilled workers, so there is a need to match up skills and training. There is particular need for doctors, engineers, and experts in technology and ICT, for instance.

But, it’s not al good news. Unemployment in Germany is twice as high among people who have migration background compared to those who do not. At the same time we have migrants with high skills and social capital but it is hard if not impossible to certify and check that. Migrant academics, including refugees, are often faced with unemployment, underemployment or challenging work patterns.

In that video we saw a certificate… Germany is a really organised country, but that means that, without certificates and credentials available, it is hard to have your skills recognised. We also see the idea of the connected migrant, with social media enabling that – for social gain, but also to help find jobs and training.

So the project here is “BeuthBonus”, a follow-on project. It is targeted at skilled migrant workers – this partly fills a gap in delivery, as training programmes for unskilled workers are more common. It was developed to help migrant academics to find appropriate work at the appropriate level. The project is funded by the German Federal Ministry of Education and Research and the German Federal Ministry of Labour, and is also part of an EU pilot, as we are an Open Badge Network pilot for recognition of skills.

Our participants for 2015-16 are 28 in total (12 female, 16 male), from 61 applications. They have various backgrounds – we have 20 different degrees there: 28% BA, 18% MA, 7% PhD. They are mainly 30-39 or 40-49, and they are typically from Tunisia, Afghanistan, Syria, etc.

So, the way this works is that we cooperate with different programmes – e.g. an engineer might take an engineering refresher/top-up. We also have a module on social media – just one module – to help participants understand social media, develop their skills, and demonstrate their skills to employers. This is a good fit, as job applications are now overwhelmingly digital, and recruiters’ attitudes to the digital CV have moved from reserved to positive.

So, in terms of how companies in Germany are using social media in recruitment: Xing, a German-language-only version of a tool like LinkedIn, is the biggest for recruitment advertising. In terms of active sourcing in social media, 45% of job seekers prefer to be approached, and in fact 21% of job seekers would pay to be more visible in these spaces. 40% of job openings are actively sourced – higher in the IT sector.

So we know that building an online professional reputation is important, and more highly skilled job hunters will particularly benefit from this. So, we have a particular way that we do this – a process for migrants to develop their online professional reputation. They start by searching for themselves, then others comment on what was found. They are asked to reflect and think about their own strengths and the requirements of the labour market. Then they go and look at how the spaces are used, how people brand themselves, and use these spaces. Then there is some framing around a theme, they plan what they will do, and they set up a schedule for the coming weeks and months… So they put it into action.

We then have instrumental ways to assess this – do they use social media, how do they use it, how often, how they connect with others, and how they express themselves online. We also take some culture specific and gender specific considerations into account in doing this.

And, to enhance online presence, we look at Open Badges – participants set goals and work towards them. I will not introduce Open Badges, but I will talk about how we understand competencies. We have a tool called ProfilPASS – a way to capture experience as transferable skills that can be presented to the world. We designed badges accordingly. We have BeuthBonus badges in the Open Badge Network; these are on Moodle and available in German and in English to enable flexibility in applying for jobs. Those badges span different levels, participants are issued badges at the appropriate levels, and they can share them on Xing or LinkedIn as appropriate. And we also encourage them to look at other sources of digital badges – from IBM developerWorks or Womens Business Club, etc.

So, these results have been really good. Before the programme we had 7% employed, but after we had 75% employed. This tends to be a short term perspective. Before the programme 0% had a digital CV, after 72% did. We see that 8% had an online profile before, but 86% now do. And that networking means they have contacts, and they have a better understanding of the labour market in Germany.

In our survey 83% felt Open Badges are useful for enhancing online reputation.

Open Badge Network has initiatives across the world. We work on Output 4: Open Badges in Territories. We work with employers on how best to articulate the names…

Q&A

Q1) In your refugee and migration terminology, do you have subcategories?

A1) We do have sub-categories around e.g. language level, so we can refer them to language programmes before they come to us. And there has been a change – it used to be that economic migrants were not entitled to education, but that has changed now. Migrants and refugees are the target group. It depends on the target group…

Q2) In terms of the employer, do you create a contact point?

A2) We have an advisory board drawn from industry, also our trainers are drawn from industry.

Q3) I was wondering about the cultural differences about online branding?

A3) I have observations only, as we have only small samples and from many countries. One difference is that some people are more reserved, and would not approach someone in a direct way… They would wave (only)… And in Germany hierarchy is not important in terms of having conversations and making approaches, but that isn’t the case in some other places. And sharing an image, and a persona… that can be challenging. That personal/professional mix can be even trickier.

Q4) How are they able to manage those presences online?

A4) Doing that searching in a group… And with coaches they have direct support, a space to discuss what is needed, etc.

Q5) Let’s say you take a refugee from country x, what is needed?

A5) They have to have a degree, they have to have good German – a requirement of our funder – and they have to be located in Germany.

Comment) This seems like it is building so much capacity… I think what you are doing over there is fantastic and opening doors to lots of people.

Q6) In Germany, all natives have these skills already? Or do you do this for German people too? Maybe they should?

A6) For our students I tend to just provide guidance for this. But yes, maybe we need this for all our students too.

Jun 152016
 

Today I’m at the University of Edinburgh Principal’s Teaching Award Scheme Forum 2016: Rethinking Learning and Teaching Together, an event that brings together teaching staff, learning technologists and education researchers to share experience and be inspired to try new things and to embed best practice in their teaching activities.

I’m here partly as my colleague Louise Connelly (Vet School, formerly of IAD) will be presenting our PTAS-funded Managing Your Digital Footprint project this afternoon. We’ll be reporting back on the research, on the campaign, and on upcoming Digital Footprint work including our forthcoming Digital Footprint MOOC (more information to follow) and our recently funded (again by PTAS) project: “A Live Pulse: YikYak for Understanding Teaching, Learning and Assessment at Edinburgh”.

As usual, this is a liveblog so corrections, comments, etc. welcome. 

Velda McCune, Deputy Director of the IAD who heads up the learning and teaching team, is introducing today:

Welcome, it’s great to see you all here today. Many of you will already know about the Principal’s Teaching Award Scheme. We have funding of around £100k from the Development fund every year, since 2007, in order to look at teaching and learning – changing behaviours, understanding how students learn, investigating new education tools and technologies. We are very lucky to have this funding available. We have had over 300 members of staff involved and, increasingly, we have students as partners in PTAS projects. If you haven’t already put a bid in we have rounds coming up in September and March. And we try to encourage people, and will give you feedback and support and you can resubmit after that too. We also have small PTAS grants as well for those who haven’t applied before and want to try it out.

I am very excited to welcome our opening keynote, Paul Ashwin of Lancaster University, to kick off what I think will be a really interesting day!

Why would going to university change anyone? The challenges of capturing the transformative power of undergraduate degrees in comparisons of quality  – Professor Paul Ashwin

What I’m going to talk about is this idea of undergraduate degrees being transformative, and, as we move towards greater analytics, how we might measure that. And whilst metrics are flawed, we can’t just ignore them. This presentation is heavily informed by Lee Shulman’s work on Pedagogical Content Knowledge, which always sees teaching in context, and in the context of particular students and settings.

People often talk about the transformative nature of what their students experience. David Watson was, for a long time, the President of the Society for Research into Higher Education, and in his presidential lectures he would talk about the need to be as hard on ourselves as we would be on others, on policy makers, on decision makers… He said that if we are talking about education as transformational, we have to ask ourselves how and why this transformation takes place; whether it is a planned transformation; whether higher education is a necessary and/or sufficient condition for such transformations; whether all forms of higher education result in this transformation. We all think of transformation as important… But I haven’t really evidenced that view…

The Yerevan Communique of May 2015 talks about wanting to achieve, by 2020, a European Higher Education Area where there are common goals, where there is automatic recognition of qualifications, and where students and graduates can move easily through – what I would characterise as where Bologna begins. The Communique talks about higher education contributing effectively to building inclusive societies, founded on democratic values and human rights, where educational opportunities are part of European citizenship. And it ends in a statement that should be a “wow!” moment, valuing teaching and learning. But for me there is a tension: the comparability of undergraduate degrees is in conflict with the idea of the transformational potential of undergraduate degrees…

Now, critique is too easy, we have to suggest alternative ways to approach these things. We need to suggest alternatives, to explain the importance of transformation – if that’s what we value – and I’ll be talking a bit about what I think is important.

Working with colleagues at Bath and Nottingham I have been working on a project, the Pedagogic Quality and Inequality Project, looking at Sociology students and the idea of transformation at 2 top-ranked (for sociology) and 2 bottom-ranked (for sociology) universities, gathering data on the students’ experience and change. We found that league tables told you nothing about the actual quality of experience. We found that the transformational nature of undergraduate degrees lies in changes in students’ sense of self through their engagement with disciplinary knowledge – students relating their personal projects to their disciplines and the world, and seeing themselves implicated in knowledge. But it doesn’t always happen – it requires students to be intellectually engaged with their courses to be transformed by them.

To quote a student: “There is no destination with this discipline… There is always something further and there is no point where you can stop and say ‘I understood, I am a sociologist’… The thing is sociology makes you aware of every decision you make: how that would impact on my life and everything else…” And we found the students all reflecting that this idea of transformation was complex – there were gains but also losses. Now you could say that this is just the nature of sociology…

We looked at a range of disciplines, studies of them, and also how we would define transformation in several ways: the least inclusive account; the “watershed” account – the institutional type of view; and the most inclusive account. Mathematics has the richest studies in this area (Wood et al 2012), where the least inclusive account is “numbers”, the watershed is “models”, and the most inclusive is “approach to life”. Similarly Accountancy moves from routine work to moral work; Law from content to extension of self; Music from instrument to communicating; Geography from the general world to interactions; Geoscience from the composition of the earth to the relations between earth and society. Clearly these are not all the same direction, but they are accents and flavours of the same thing. We are going to do a comparison next year on chemistry and chemical engineering, in the UK and South Africa, and actually this work points at what is particular to Higher Education being about engaging with a system of knowledge. Now, my colleague Monica McLean would ask why that’s limited to Higher Education, couldn’t it apply to all education? And that’s valid but I’m going to ignore it just for now!

Another student commented on transformation of all types, for example moving from wearing a tracksuit to lectures to no longer presenting themselves that way. Now that has nothing to do with the curriculum, this is about other areas of life. This student almost dropped out but the Afro-Caribbean society supported and enabled her to continue and progress through her degree. I have worked in HE and FE and the way students talk about that transformation is pretty similar.

So, why would going to university change anyone? It’s about exposure to a system of knowledge changing your view of self, and of the world. Many years ago an academic asked what the point of going to university was, given that much of the information students learn will be out of date. And the counter-argument is the engagement with different perspectives – learning to see the world as a sociologist, to see the world as a geographer, etc.

So, to come back to this tension around the comparability of undergraduate degrees, and the transformational potential of undergraduate degrees. If we are about transformation, how do we measure it? What are the metrics for this? I’m not suggesting those will particularly be helpful… But we can’t leave metrics to what is easy to gather, we have to also look at what is important.

So if we think of the first area, comparability, we tend to use rankings. National and international higher education rankings are a dominant way of comparing institutions’ contributions to student success. All universities have some set of figures that flatters them. Rankings have huge power as they travel across a number of contexts and audiences – vice chancellors, students, departmental staff. They move context, they’re portable and durable. It’s nonsense but the strength of these metrics is hard to combat. They tend to involve unrelated and incomparable measures. Their stability reinforces privilege – higher status institutions tend to enrol a much greater proportion of privileged students. You can have some unexpected outcomes, but you have to have Oxford, Cambridge, Edinburgh, UCL and Imperial all near the top, or your league table is seen as rubbish… Because we already “know” they are the good universities… Or at least those rankings reinforce the privilege that already exists, the expectations that are set. They tell us nothing about transformation of students. But are skillful performances shaped by generic skills, or by students’ understanding of a particular task and their interactions with other people and things?

Now the OECD has put together a ranking concept on graduate outcomes, AHELO, which uses tests for e.g. physics and engineering – not surprising choices as they have quite international consistency, they are measurable. And they then look at generic tests – e.g. a deformed fish is found in a lake; using various press releases and science reports, write a memo for policy makers. Is that generic? In what way? Students doing these tests are volunteers, which may not be at all representative. Are the skills generic? Education is about applying a way of thinking in an unstructured space, in a space without context. Now, the students are given context in these tests, so it’s not a generic test. But we must be careful about what we measure, as what we measure can become an index of quality or success, whether or not that is actually what we’d want to mark up as success. We have strategic students who want to know what counts… And that’s ok as long as the assessment is appropriately designed and set up… The same is true of measures of success and metrics of quality in teaching and learning. That is why I am concerned by AHELO, but it keeps coming back again…

Now, I have no issue with the legitimate need for comparison, but I also have a need to understand what comparisons represent, how they distort. Are there ways to take account of students’ transformation in higher education?

I’ve been working with Rachel Sweetman at the University of Oslo on some key characteristics of valid metrics of teaching quality. For us reliability is much, much more important than availability. So, we need ways to assess teaching quality that:

  • are measures of the quality of teaching offered by institutions rather than measures of institutional prestige (e.g. entry grades)
  • require improvements in teaching practices in order to improve performance on the measures
  • as a whole form a coherent set of metrics rather than a set of disparate measures
  • are based on established research evidence about high quality teaching and learning in higher education
  • reflect the purposes of higher education.

We have to be very aware of Goodhart’s law – we must be wary of any measure that becomes a performance indicator.

I am not someone with a big issue with the National Student Survey – it is grounded in the right things – but the issue is that it is run every year, and the data is used in unhelpful, distorting ways. Rather than prompting universities to acknowledge and work on feedback, it distorts: universities feel the need to label engagement as “feedback moments”, as they assume a less good score means students just don’t recognise when they have had a feedback moment.

Now, in England we have the prospect of the Teaching Excellence Framework (see the English White Paper and Technical Consultation). I don’t think it’s that bad as a prospect. It will include students’ views of teaching, assessment and academic support from the National Student Survey, non-completion rates, measures over three years, etc. It’s not bad. Some of these measures are about quality, and there is some coherence. But this work is not based on established research evidence… There was great work here at Edinburgh on students’ learning experiences in UK HE; none of that work is reflected in TEF. If you were being cynical you could think they have looked at available evidence and just selected the more robust metrics.

My big issue with the Year 2 TEF metrics is how and why these metrics have been selected. You need a proper consultation on measures, rather than using the White Paper and Technical Consultation to do that. The Office for National Statistics looked at the measures and found them robust, but noted that the differences between institutions’ scores on the selected metrics tend to be small and not significant – not robust enough to inform future work, according to the ONS. It seems likely that peer review will end up being how we differentiate between institutions.

And there are real issues with the TEF Future Metrics… This comes from a place of technical optimism – that if you just had the right measures you’d know… These measures tie learner information to tax records for the “Longitudinal Education Outcomes” data set, and include “teaching intensity”. Teaching intensity is essentially contact hours… that’s game-able… And how on earth is that about transformation? It’s not a useful measure of that. Unused office hours aren’t useful, optional seminars aren’t useful… Keith Trigwell told me about a lecturer he knew who lectured a subject where each week fewer and fewer students came along. The last three lectures had no students there… He still gave them… That’s contact hours that count on paper but aren’t useful. That sort of measure seems to come more from ministerial dinner parties than from evidence.

But there are things that do matter… There is no mechanism outlined for a sector-wide discussion of the development of future metrics. What about expert teaching? What about students’ relations to knowledge? What about the first year experience – we know that is crucial for student outcomes? Now these measures may not be easy, but they matter. And what we also see is the Learning Gains project, which decided to work generically – but that also means you don’t understand students’ particular engagement with knowledge. In generic tests the description of what you can do ends up more important than what you actually do. You are asking for claims about what students can do, rather than having them perform those things. You can see why it is attractive, but it’s meaningless; it’s not a good measure of what Higher Education can do.

So, to finish, I’ve tried to put teaching at the centre of what we do. Teaching is a local achievement – it always shifts according to who the students are, what the setting is, and what the knowledge is. But that also always makes it hard to capture and measure. So what you probably need is a lot of different imperfect measures that can be compared and understood as a whole. However, if we don’t try, we allow distorting measures, which reinforce inequalities, to dominate. Sometimes the only thing worse than not being listened to by policy makers is being listened to by them. That’s when we see a Frankenstein’s monster emerge, and that’s why we need to recognise the issues, to ensure we are part of the debate. If we don’t try to develop alternative measures we leave it open to others to define.

Q&A

Q1) I thought that was really interesting. In your discussion of transformation of undergraduate students I was wondering how that relates to less traditional students, particularly mature students, even those who’ve taken a year out, where those transitions into adulthood are going to be in a different place and perhaps where critical thinking etc. skills may be more developed/different.

A1) One of the studies I talked about was at London Metropolitan University, which has a large percentage of mature students… And actually there the interactions with knowledge really did prove transformative… Often students lived at home with family, whether young or mature students. That transformation was very high. And it was unrelated to achievement – some came in with quite profound challenges and still experienced transformation. But you have to be really careful about not suggesting different measures for different students… That’s dangerous… But that transformation was there. There is lots of research out there… But how do we turn that into something that has purchase… recognising there will be flaws and compromises, but ensuring that voice is in the debate. That it isn’t politicians owning that debate, that transformations of students and the real meaning of education is part of that.

Q2) I found the idea of transformation that you started with really interesting. I work in African studies and we work a lot on decolonial issues, and of the need to transform academia to be more representative. And I was concerned about the idea of transformation as a decolonial type issue, of being like us, of dressing like that… As much as we want to challenge students we also need to take on and be aware of the biases inherent in our own ways of doing things as British or Global academics.

A2) I think that’s a really important question. My position is that students come into Higher Education for something. Students in South Africa – and I have several projects there – who have nowhere to live, who have very little, come into Higher Education to gain powerful knowledge. If we don’t have access to a body of knowledge that we can help students gain access to, and to gain further knowledge, then why are we there? Why would students waste time talking to me if I don’t have knowledge? The world exceeds our ability to know it; we have to simplify the world. What we offer undergraduates is powerful simplifications, to enable them to do things. That’s why they come to us and why they see value. They bring their own biographies, contexts, settings. The project I talked about is based in the work of Basil Bernstein, who argues that what we teach isn’t simply the knowledge we produce in primary research – when we design curriculum we engage with colleagues, with peers, with industry… It is transformed, changed… And students also transform that knowledge; they relate it to their situation, to their own work. But we are only a valid part of that process if we have something to offer. And for us I would argue it’s access to a body of knowledge. I think if we only offer process, we are empty.

Q3) You talked about learning analytics and the issues with AHELO, and the idea that if you see the analytics, you understand it all… and that concept not being true. But I would argue that when we look at teaching quality with a focus on content and content-giving, that positions us as gatekeepers, and that is problematic.

A3) I don’t see knowledge as content. It is about ways of thinking… But it always has an object. One of the issues with the debate on teaching and learning in higher education is the loss of the idea of content and context. You don’t foreground the content, but you have to remember it is there, it is the vehicle through which students gain access to powerful ways of thinking.

Q4) I really enjoyed that and I think you may have answered my question.. But coming back to metrics you’ve very much stayed in the discipline-based silos and I just wondered how we can support students to move beyond those silos, how we measure that, and how to make that work.

A4) I’m more course-focused than discipline-focused. With the first year of TEF, the idea of assessing quality across a whole institution is very problematic; it’s programme level we need to look at. Inter-professional, interdisciplinary work is key… But one of the issues here is that it can be implied that that gives you more… I would argue that it gives you different – another new way of seeing things. But I am nervous of institutions, funders etc. who want to see interdisciplinary work as key. Sometimes it is the right approach, but it depends on the problem at hand. All approaches are limited and flawed; we need to find the one that works for a given context. So, I sort of agree, but worry about the evangelical position that can be taken on interdisciplinary work, which is often actually multidisciplinary in nature – working with others, not genuinely working in an interdisciplinary way.

Q5) I think to date we focus on objective academic ideas of what is needed, without asking students what they need. You have also focused on the undergraduate sector; how applicable is this to the postgraduate sector?

A5) I would entirely agree with your comment. That’s why pedagogical content knowledge matters so much. You have to understand your students first, as well as understanding this body of knowledge. It isn’t about being student-centred but about understanding students, context, and that body of knowledge. In terms of your question, I think there is a lot of applicability for PGT. For PhD students things are very different – you don’t have a body of knowledge to share in the same way; it is much more about process. Our department is PhD only, and there process is central. That process is quite different at that level… It’s about contributing in an original way to that body of knowledge as its core purpose. That doesn’t mean students at other levels can’t contribute, it just isn’t the core purpose in the same way.

Parallel Sessions from PTAS projects: Social Media – Enhancing Teaching & Building Community? – Sara Dorman, Gareth James, Luke March

Gareth: It was mentioned earlier that there is a difference between the smaller and larger projects funded under this scheme – and this was one of the smaller projects. Our project was looking at whether we could use social media to enhance teaching and community, in our programmes but also in wider areas. We particularly wanted to look at the use of Twitter and Facebook to engage students in course material but also to strengthen relationships. So we decided to compare the use of Facebook by Luke March in Russian Politics courses with the use of Twitter and Facebook in the African Politics courses that Sara and I run.

So, why were we interested in this project? Social media is becoming a normal area of life for students, in academic practice and increasingly in teaching (Blair 2013; Graham 2014). Twitter increasingly used, Facebook well established. It isn’t clear what the lasting impact of social media would be but Twitter especially is heavily used by politicians, celebrities, by influential people in our fields. 2014 data shows 90% of 18-24 year olds regularly using social media. For lecturers social media can be an easy way to share a link as Twitter is a normal part of academic practice (e.g. the @EdinburghPIR channel is well used), keeping staff and students informed of events, discussion points, etc. Students have also expressed interest in more community, more engagement with the subject area. The NSS also shows some overall student dissatisfaction, particularly within politics. So social media may be a way to build community, but also to engage with the wider subject. And students have expressed preference for social media – such as Facebook groups – compared to formal spaces like Blackboard Learn discussion boards. So, for instance, we have a hashtag #APTD – the name of one of our courses – which staff and students can use to share and explore content, including (when you search through) articles, documents etc. shared since 2013.
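(As an aside: pulling together everything shared under a course hashtag like #APTD is easy to script. Below is a minimal sketch using the tweepy library’s v3-era search API – the credentials are placeholders, and note that Twitter’s standard search endpoint only reaches back about a week, so an archive going back to 2013 would need collecting continuously.)

```python
import tweepy

# Placeholder credentials - register an app with Twitter to obtain real ones.
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
api = tweepy.API(auth)

# Collect recent tweets mentioning the course hashtag.
# Cursor handles pagination; tweet_mode="extended" returns untruncated text.
for tweet in tweepy.Cursor(api.search, q="#APTD", tweet_mode="extended").items(100):
    print(tweet.created_at, tweet.user.screen_name, tweet.full_text)
```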

So, what questions did we ask? Well we wanted to know:

  • Does social media facilitate student learning and enhance the learning experience?
  • Does social media enable students to stay informed?
  • Does it facilitate participation in debates?
  • Do students feel more included and valued as part of the subject area?
  • Is social media complementary to VLEs like Learn?
  • Which medium works best?
  • And what disadvantages might there be around using these tools?

We collected data through a short questionnaire about awareness, usage, usefulness. We designed just a few questions that were part of student evaluation forms. Students had quite a lot to say on these different areas.

So, our findings… Students all said they were aware of these tools. There were slightly higher levels of awareness among Facebook users, e.g. Russian Politics, for both UG and PG students. Overall 80% said they were aware to some extent. When we looked at usage – meaning access of the space rather than necessarily meaningful engagement, since usage of course materials on Twitter and Facebook does not equal engagement, and other studies have found students lurking more than posting/engaging directly – at least amongst our students (n=69), 70% used resources at least once. Daily usage was higher amongst Facebook users, i.e. Russian Politics. Twitter was more than twice as likely to have never been used.

We asked students how useful they found these spaces. Facebook was seen as more useful than Twitter: 60% found Facebook “very” or “somewhat” useful, while only a third described Twitter as “somewhat useful” and none said “very useful”. But there were clear differences between UG and PG students, with UG students generally more positive. They noted that it was useful and interesting to keep up with news and events, but not always easy to tie that back to the curriculum. Students used the word “interesting” a lot – for instance comparing historical to current events. More mixed responses included that there was plenty of material on Learn, so they didn’t use Facebook or Twitter; that they wanted everything on Learn, in one place; or that they don’t use Twitter so don’t want to follow the course there, and would prefer Facebook or Learn. Some commented that too many posts were shared – information overload. Students thought some articles were random, and couldn’t tell what was good and what was not.
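(Figures like these are simple cross-tabulations of the evaluation-form responses. A sketch of how they might be computed with pandas – the column names and data below are invented for illustration, not the project’s actual dataset.)

```python
import pandas as pd

# Invented example responses; the real data came from course evaluation forms.
responses = pd.DataFrame({
    "course": ["Russian Politics", "African Politics", "African Politics",
               "Russian Politics", "African Politics"],
    "platform": ["Facebook", "Twitter", "Twitter", "Facebook", "Twitter"],
    "usage": ["daily", "never", "once", "weekly", "once"],
    "usefulness": ["very useful", "not useful", "somewhat useful",
                   "somewhat useful", "somewhat useful"],
})

# Share of students who used the course resources at least once, per platform.
used_at_least_once = (
    responses.assign(used=responses["usage"] != "never")
             .groupby("platform")["used"]
             .mean() * 100
)
print(used_at_least_once.round(1))

# Usefulness ratings as percentages within each platform.
print(pd.crosstab(responses["platform"], responses["usefulness"],
                  normalize="index") * 100)
```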

A lot of these issues were also raised in focus group discussions. Students do appreciate sharing resources and staying informed, but don’t always see the connection to the course. They recognise the potential for debate and discussion, but often it doesn’t happen, and when it does they find it intimidating for that to be in a space with real academics and others – indeed they prefer discussion away from tutors and academics on the course too. Students found Facebook better for network building but also found the social vs academic distinction difficult. Learn was seen as academic and safe, but also too clunky to navigate and engage in discussions. Students were concerned others might feel excluded. Some also commented that not liking or commenting could be hurtful to some. One student commented “it was kind of more like the icing than the cake” – which I think really sums it up.

Students commented that there was too much noise to pick through. And: “I didn’t quite have the know-how to get something out of…”; “I felt a bit intimidated and wasn’t sure if I should join in”. Others commented that they only use social media for social purposes – that it would be inappropriate to engage with academics there. Some saw Twitter as professional, Facebook as social.

So, some conclusions…

It seems that Facebook is more popular with students than Twitter, and seen as better for building community. There were some differences between UG and PG students, with UG more interested. Generally there was less enthusiasm than anticipated. Students were interested in and aware of the benefits of joining in discussions, but also wary of commenting too much in “public”. This suggests that we need to “build community” in order for the “community building” tools to really work.

There is also an issue of lack of integration between Facebook, Twitter and Learn. Many of our findings reflect others’, for instance Matt Graham in Dundee, who saw potential for HE humanities students. Facebook was more popular with his students than Twitter. He looked more at engagement and saw some students engaging more deeply with wider African knowledge. But one outcome was that student engagement did not occur or sustain without some structure – particular tasks and small nudges, connected to learning outcomes, flagging clear benefits at the beginning – and that students should take a lead in creating groups, which came out of our work too.

There are challenges here: inappropriate use, friending between staff and students, for instance. Alastair Blair notes in an article that the utility of Twitter, despite the challenges, cannot be ignored. For academics thinking about impact it is important, but for students too it is important for alignment with a wider subject area that moves beyond the classroom.

Our findings suggest that there is no need to rush into social media. But at the same time Sara and I still see benefits for areas like African Studies, which is fast moving and poorly covered in the mainstream media. The idea of students wanting to be engaged in the real world was clearly not carried through, though. Maybe more support and encouragement is needed for students – and maybe for staff too. And it would be quite interesting to see if and how students’ experiences of different politics and events – #indyref, #euref, etc. – differ. Colleagues are considering using social media in a course on the US presidential election; that might work out differently as students may be more confident discussing these. The department has also moved forward with more presences for staff and students, also alumni.

Closing words from Matt Graham that encouraging students to question and engage more broadly with their subject is a key skill.

Q&A

Q1) What sort of support was in place, or guidelines, around that personal/academic identity thing?

A1) Actually none. We didn’t really realise this would happen. We know students don’t always engage in Learn. We didn’t really fully appreciate how intimidating students really found this. I don’t think we felt the need to give guidelines…

A1 – SD) We kind of had those channels before the course… It was organic rather than pedagogic…

Q1) We spoke to students who wanted more guidance especially for use in teaching and learning.

A1 – SD) We did put Twitter on the Learn page… to follow up… Maybe as academics we are the worst people to understand what students would do… We thought they would engage…

Q1) Will you develop guidelines for other courses…

A1) And a clearer explanation might encourage students to engage a bit more… Could be utility in doing some of that. University/institution wise there is cautious adoption and you see guidance issued for staff on using these things… But wouldn’t want overbearing guidance there.

Q1) We have some guidance under CC licence that you can use, available from Digital Footprints space.

Q2) Could you have a safer, filtered space for students to engage? We do writing courses with international PG students and thought it might be useful to have social media available there… But maybe it will confuse them.

A2) There was a preference for a closed “safer” environment, talking only to students in their own cohort and class. I think Facebook is more suited to that sort of thing; Twitter is an open space. You can create a private Facebook group… One problem with Russian Politics was that they had a closed group… but it included previous cohorts and friends of staff…

A2 – SD) We were trying to include students in real academia… There are real tensions there over purpose and what students get out of it… The sense of not knowing… Some students might have had security concerns, but I think it was insecurity in academic knowledge. They didn’t see themselves as co-producers. That needs addressing…

A2) Students being reluctant to engage isn’t new, but we thought we might have more engagement in social media. Now that was the negative side, but actually there were positive things here – that wider awareness, even if one-directional.

Q3) I just wanted to ask more about the confidence to participate, and those comments suggesting that this was a bigger issue – not just in social media – for these students; similarly information-seeking behaviour…

A3) There is work taking place in SPS around study skills, approaching your studies. There might be some room to introduce this stuff earlier on in school-wide or subject-wide courses… especially if we are to use these tools. I completely agree that by the end of these studies you should have these skills – how to write properly, how to look for information… The other thing that comes to mind, having heard our keynote this morning, is the issue of the transformative process. It’s good to have high expectations of UG students, and they seem to rise to the occasion… But I think that we maybe need to understand the difference between UG and PG students… and that in PG years they take that further more fully.

A3 – SD) UG courses are really big – which may be part of the issue. In PG they are much smaller… Some students are from Africa and may know more, some come in knowing very little… That may also play in…

Q4) On the UG/PG thing these spaces move quickly! Which tools you use will change quickly. And actually the type of thing you post really matters – sharing a news article is great, but how you discuss and create follow up afterwards – did you see that, the follow up, the creation, the response…

A4 – SD) Students did sometimes interact… But the people who would have done that with email/Learn were the same that used social media in that way.

A4) Facebook and Twitter are still new technologies… So perhaps students will become more engaged and informed and up for engaging in these spaces. I’m still getting to grips with the etiquette of Twitter. There was more discussion in Facebook Groups than on Twitter… But it can also be very surface-level learning… It complements what we are doing but there are challenges to overcome… And we have to think about whether that is worthwhile. Some real positives and real challenges.

Parallel Sessions from PTAS projects: Managing Your Digital Footprint (Research Strand) – Dr Louise Connelly 

This was one of the larger PTAS-funded projects. It is the “Research Strand” because it ran in parallel to the campaign, which was separately funded.

There is so much I could cover in this presentation so I’ve picked out some areas I think will be practical and applicable to your research. I’m going to start by explaining what we mean by “Digital Footprint” and then talk more about our approach and the impact of the work. Throughout the project and campaign we asked students for quotes and comments that we could share as part of the campaign – you’ll see these throughout the presentation but you can also use these yourself as they are all CC-BY.

The project wouldn’t have been possible without an amazing research team. I was PI for this project – based at IAD but I’m now at the Vet School. We also had Nicola Osborne (EDINA), Professor Sian Bayne (School of Education). We also had two research students – Phil Sheail in Semester 1 and Clare Sowton in Semester 2. But we also had a huge range of people across the Colleges and support services who were involved in the project.

So, I thought I’d show you a short video we made to introduce the project:

[Embedded video: introduction to the Digital Footprint project]

The idea of the video was to explain what we meant by a digital footprint. We clearly defined it because what we wanted to emphasise to students and staff – though students were the focus – was that your footprint is not just what you do but also what other people post about you, or leave behind about you. That can be quite scary to some, so we wanted to address how you can have some control over that.

We ran a campaign with lots of resources and materials – you can find loads of them on the website. That campaign is now a service based in the Institute for Academic Development. But I will be focusing on the research in this presentation. This all fitted together in a strategy: the campaign was to raise awareness and provide practical guidance; the research sought to gain an in-depth understanding of students’ usage and to produce resources for schools; and then to feed into learning and teaching on an ongoing basis. Key to the research was a survey we ran during the campaign, which was analysed by the research team.

In terms of the gap and scope of the campaign, I’d like to take you back to the Number 8 bus… The idea came out of myself and Nicola – and others – being asked regularly for advice and support. There was a real need here, but also a real digital skills gap. We also saw staff wanting to embed social media in the curriculum and needing support. The brainwave was that social media wasn’t the campaign that was needed; it was about digital footprint and the wider issues. We also wanted to connect to current research. boyd (2014), who works on networked teens, talks about the benefits as well as the risks… as it is unclear how students are engaging with social/digital media and how they are curating their online profiles. We also wanted to look at the idea of eprofessionalism (Chester et al 2013), particularly in courses where students are treated as paraprofessionals – a student nurse, for instance, could be struck off before graduating because of social media behaviours, so there is a very real need to support and raise awareness amongst students.

Our overall research aim was to work with students across current delivery modes (UG, PGT, ODL, PhD) in order to better understand how they…

In terms of our research objectives we wanted to: conduct research which generates a rich understanding; develop a workshop template – we ran 35 workshops for over 1000 students in that one year; critically analyse social media guidelines – it was quite interesting that a lot of them were about why students shouldn’t engage, with little on the benefits; work in partnership with EUSA – important to engage around e.g. campaign days; contribute to the wider research agenda; and effectively disseminate project findings. We engaged with support services – e.g. we worked with Careers on their LinkedIn workshops, which weren’t well attended despite students wanting help with their professional presence, and just rebranding the sessions proved valuable. We asked students where they would seek support – many said the Advice Place rather than e.g. IS, so we spoke to them. We spoke to the Counselling service too, about cyberbullying, revenge porn, sexting etc.

So we ran two surveys with a total of 1,457 responses. Nicola and I ran two lab-based focus groups. I interviewed 6 individuals in a series of interviews with ethnographic tracing. And we carried out documentary analysis of e.g. social media guidelines. We used mixed methods as we wanted this to be really robust.

Sian and Adam really informed our research methods, but Nicola and I really led the publications around this work. We have had various publications and presentations, including presentations at the European Conference on Social Media and at the Social Media for Higher Education Teaching and Learning conference. We are also working on a Twitter paper, with other papers coming. Workshops with staff and students have happened and are ongoing, and the Digital Ambassador award (Careers and IS) includes Digital Footprint as a strand. We also created a lot of CC-BY resources – e.g. guidelines and images. Those are available for UoE colleagues, but also for the national and international community, who have fed into and helped us develop those resources.

I’m going to focus on some of the findings…

The survey was run on Bristol Online Surveys. It was sent to around 1/3rd of all students, across all cohorts. The central surveys team handled the ethics approval and issuing of surveys. Timing had to fit around other surveys – e.g. NSS etc. And we had relatively similar cohorts in both surveys; the second had more responses, but that was after the campaign had been running for a while.

So, two key messages from the surveys. (1) Ensure informed consent – crucial for students (also important for staff): students need to understand the positive and negative implications of using these non-traditional, non-university social media spaces. In terms of what that means – guidance, some of the digital skills gap support, etc. And (2) don’t assume what students are using and how they are using it. Our data showed age differences in what was used; cohort differences (UG, PGT, ODL, PhD); lack of awareness of e.g. T&Cs; and benefits – some lovely anecdotal evidence, e.g. a UG informatics student approached by employers after sharing code on GitHub. Also the importance of not making assumptions about personal/educational/professional environments – this especially came out of the interviews – and generally the implications of digital footprint. One student commented on being made to have a Twitter account for a course and not being happy about not having a choice in that (something that could be avoided e.g. through embedding of tweets in Learn).

Thinking about platforms…

Facebook is used by all cohorts but ODL less so (perhaps a geographic issue in part). Most were using it as a “personal space” and for study groups. Challenges included privacy management. Also issues of isolation if not all students were on Facebook.

Twitter is used mainly by PGT and PhD students, and most actively by 31-50 year olds. Lots of talk about how to use this effectively.

One of the surprises for us was that we thought most courses using social media would have guidelines for its use in programme handbooks. But students reported them not being there, or not being aware of them. So we created example guidance, which is on the website (CC-BY), and also an eprofessionalism guide (CC-BY), which you can also use in your own programme handbooks.

There were also tools we weren’t aware were in use, and that has led to a new YikYak research project which has just been funded by PTAS and will go ahead over the next year, with Sian Bayne leading, plus myself, Nicola and Informatics. The ethnographic tracing and interviews gave us a much richer understanding of the survey data.

So, what next? We have been working with researchers in Ireland, Australia, New Zealand… EDINA has had some funding to develop an external facing consultancy service, providing training and support for NHS, schools, etc. We have the PTAS funded YikYak project. We have the Digital Footprint MOOC coming in August. The survey will be issued again in October. Lots going on, more to come!

We’ve done a lot and we’ve had loads of support and collaboration. We are really open to that collaboration and work in partnership. We will be continuing this project into the next year. I realise this is the tip of the iceberg but it should be food for thought.

Q&A 

Q1) We were interested in the staff capabilities…

A1 – LC) We have run a lot of workshops for staff and research students, including a series at the Vet School. There’s a digital skills issue, plus research, learning and teaching, and personal strands here.

A1 – NO) There were sessions and training for staff before… And much of the research into social media and digital footprint has been on very small cohorts in very specific areas.

Comment) I do sessions for academic staff in SPS, but I didn’t know about this project so I’ll certainly work that in.

A1 – LC) We did do a session for fourth year SPS students. I know business school are all over this as part of “Brand You”.

Q2) My background was in medicine; when I was working in a hospital a scary colleague told junior doctors to delete their Facebook profiles! She was googling them. I saw an article in the Sun that badly misrepresented doctors – portraying doctors as living the “high life” because of something sunny they had posted.

A2 – LC) You need to be aware people may Google you… and be confident of your privacy and settings, and of your professional body’s guidelines about what you have there. But there are grey areas… We wanted to emphasise informed choice. You have the Right to be Forgotten law, for instance. Many nursing students already knew the restrictions but felt the Facebook restrictions were unfair… A recent article says there are 3.5 degrees of separation on Facebook – that can be risky… In teaching and learning this raises issues of who friends whom, what you report… etc. The culture is that we do use social media, and in many ways that’s positive.

A2 – NO) Medical bodies have very clear guidance… But just knowing that e.g. profile pictures are always public on Facebook, and that you can control settings elsewhere… Knowing that means you can make informed decisions.

Q3) What is “Brand You”?

A3) Essentially it’s about thinking of yourself as a brand: how your presences are used… what is consistent, how you use your name, your profile images, and how you do that effectively if you choose to. There is a book called “Brand You” which is about effective online presence.

Closing Keynote : Helen Walker, GreyBox Consulting and Bright Tribe Trust

I’m doing my Masters in Digital Education with the University of Edinburgh, but my role is around edtech and technology in schools, so I am going to share some of that work with you. So, to set the scene, a wee video: Kids React to Technology: Old Computers:

[Embedded video: Kids React to Technology: Old Computers]

Watching the kids try to turn on the machine it is clear that many of us are old enough to remember how to work late 1970s/early 1980s computers and their less than intuitive user experience.

So the gaps are maybe not that wide anymore… But there are still gaps. There are gaps, for instance, between what students experience at home and what they can do at school – and those can be huge. There is also a real gap between EdTech promises and delivery – there are many practitioners who are energised about new technologies and have high expectations. We also have to be aware of the reality of skills – and be very cautious of Prensky’s (2001) idea of the “digital native”, and how intoxicating and inaccurate that can be.

There is also a real gap between industry and education. There is so much investment in technology, and so many promises made for it. Meanwhile we also see the perspective of some that computers do not benefit pupils. Worse, in September 2015 the OECD reported – and it was widely re-reported – that computers do not improve pupil results, and may in fact be detrimental. That risks a retreat to before technology, or technology being merely the icing on the cake… And then you read the report:

“Technology can amplify great teaching but great technology cannot replace poor teaching.”

Well of course. Technology has to be pedagogically justified. And that report also encourages students as co-creators. Now if you go to big education technology shows like BETT and SETT you see very big rich technology companies offering expensive technology solutions to quite poor schools.

That reflects an Education Endowment Foundation report from 2012, which found that it’s the pedagogy, not the technology, and that technology is a catalyst for change. Glynis Cousin says that technology has to work dynamically with pedagogy.

Now, you have fabulous physical and digital resources here. There is the issue of what schools have. Schools often have machines that are 9-10 years old, while students have much more sophisticated devices and equipment at home – even in poor homes. Their school experience of using old kit to type essays jars with that. And you do see schools trying to innovate with technology – iPads and such in particular… They buy them, they invest thousands… But they don’t always use them, because the boring, crucial wifi and infrastructure isn’t there. It’s boring and expensive but it’s imperative. You need all of that in place in order to use these shiny things…

And with that… Helen guides us to gogopp.com and the web app, to ask us about an image of a monkey with its hand in a jar, holding a coin… We all respond… The adage is that if you wanted to catch a monkey you put an orange or some nuts in a jar; the monkey wouldn’t let go, so a hunter could just capture it. I deal with a lot of monkeys… A lot of what I work towards is convincing them to let go of that coin, or nut, or orange, or Windows 7, to move on and change and learn.

Another question for us… What does a shot of baseball players in a field have to do with edtech? Well yes, “if you build it, they will come”. A lot of people believe this is how you deal with edtech… Although the scheme funding technology for schools in England has come to an end, a lot of Free Schools still have this idea: that if you build something, magic will happen…

BTW this gogopp tool is a nice fun free tool – great for small groups…

So, I do a lot of “change management consultation” – it’s not a great phrase, but a lot of what it’s about is pretty straightforward. Many schools don’t know what they’ve got, so we audit the kit, the software, the skills. We work on a strategy, then a plan, then a budget. And then we look at changes that make sense… small-scale pathfinder projects; student-led work, with students in positions of responsibility; and a lot of TeachMeet sessions – a forum of 45 minutes or so in which staff who’ve worked on pathfinder projects get 2 to 5 minutes to share their experience – a way to drop golden nuggets into the day (much more effective than inset days!). And I do a lot of work with departmental heads to ensure software and hardware align with needs.

When there is the right strategy and the right pedagogical approach, brilliant things can happen. For instance…

Abdul Chohan, now principal of Bolton Academy, transformed his school with iPads – giving them out and asking staff and students what to do with them. He works with Apple now…

David Mitchell (no, not that one), a Deputy Headteacher in the Northwest, started a project called QuadBlogging for his Year 6 students (Year 7 in Scotland), whereby four organisations are involved – 2 schools and 2 other institutions, like MIT, like the Government – big organisations. Students get real-life, real-world feedback on their writing, and they saw significant increases in writing quality. That is a great benefit of educational technology – your audience can be as big or small as you want. It’s a nice, safe, contained forum for children’s writing.

Simon Blower had an idea called “Lend me your writing” and crowdfunded Pobble – a site where teachers can share examples of student work.

So those are three examples of pedagogically-driven technology projects and changes.

And now we are going to enter Kahoot.it…

The first question is about a free VLE – Edmodo… It’s free, except for analytics, which is a paid-for option.

Next up… This is a free behaviour management tool, whose “Class Story” function has recently been added… That’s Class Dojo.

Next… A wealth of free online courses, primarily aimed at science, maths and computing… Khan Academy. A really famous resource now. It came about when Salman Khan was asked for maths homework help and made YouTube videos… Very popular, and now a global organisation with a real range of videos from teachers. No adverts. Again, free…

And next… an adaptive learning platform with origins in the “School of One” in NYC. That’s Knewton. School of One is an interesting school which has done away with the traditional classroom one-to-many system… They use Knewton, which suggests the next class, module, task, etc. This is an “Intelligent Tutoring System”, which I am skeptical of, but there is a lot of interest from publishers etc. It is all about personalised learning… But it is all data-driven… I have issues with thinking of kids as data-producing units.
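(No vendor publishes their algorithm, but the core idea behind such adaptive systems can be sketched in a few lines: estimate the learner’s recent success and pick the next task whose difficulty best matches it. The sketch below is a deliberately naive illustration of that idea – not Knewton’s actual method, and the tasks and difficulties are invented.)

```python
# Naive adaptive item selection: not any vendor's real algorithm, just the idea.
# Each task has an invented difficulty between 0 and 1; the next task is the
# unseen one whose difficulty sits closest to the learner's estimated level.

tasks = {"fractions intro": 0.2, "equivalent fractions": 0.4,
         "adding fractions": 0.6, "fraction word problems": 0.8}

def next_task(history, tasks):
    """history: list of (task_name, correct_bool) tuples, oldest first."""
    if history:
        recent = history[-5:]  # estimate ability from the last few attempts only
        success = sum(correct for _, correct in recent) / len(recent)
    else:
        success = 0.5  # no information yet: start in the middle
    seen = {name for name, _ in history}
    unseen = {n: d for n, d in tasks.items() if n not in seen}
    if not unseen:
        return None  # everything has been attempted
    # Aim slightly above the current success rate to keep the learner stretched.
    target = min(success + 0.1, 1.0)
    return min(unseen, key=lambda n: abs(unseen[n] - target))

print(next_task([("fractions intro", True), ("equivalent fractions", True)], tasks))
# -> "fraction word problems": recent success is high, so a harder task is chosen
```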

Next question… This Office 365 tool allows for the creation of individual and class digital notebooks – OneNote. It’s a killer app that Microsoft invests in a lot.

And Patrick is our Kahoot winner (I’m second!). Now, I use Kahoot in training sessions… It’s fun once… unless everyone uses it throughout the day. It’s important that students don’t just experience the same thing again and again, and that you work as a learning community to make sure that you are using tools in a way that stays interesting, that varies, etc.

So, what’s happening now in schools?

  • Mobility: BYOD, contribution, cross-platform agility
  • Office365/Google/iCloud
  • VLE/LMS – PLE/PLN – for staff and students
  • Data and tracking

So with mobility we see a growth in Bring Your Own Device… That brings a whole range of issues around esafety and infrastructure. It’s not just students’ own devices, but also, increasingly, a kind of hire-purchase scheme for students and parents. That’s a financial pressure – schools are financially pressured and this is just a practical issue. One issue that repeatedly comes up is cross-platform agility – phones, tablets, laptops. And there is discussion of bringing in keyboards, mice, and traditional set-ups… Keyboard skills are being seen as important again in the primary sector.

The benefit of mobile devices is collaboration, the idea of the main screen allowing everyone to be part of the classroom… You don’t need expensive software; you can use e.g. cheap Reflector mirroring software. Apps… Some are brilliant, some are dreadful… Management of apps and mobile devices has become a huge industry… working with technicians to get apps onto devices, handling volume purchasing… And a lot of apps are one- or two-hit propositions… You don’t want the same app every week for one task… There is a trade-off between what is useful and the cost in staff time of getting the app in place. We also have the issue of the student journey. Tools like Socrative and Nearpod let you push information to devices.

But now we are going to look at/try Plickers… What that does is use one device – the teacher’s mobile app – plus printed codes I can make up (we’ve all been given one today) that can be laminated and handed out at the beginning of the year… We hold up a card with our chosen answer at the top… and the teacher’s device is walked around to scan the room for the answers – a nice job for a student to do… So you can then see the responses… and the answer… I can see who got it wrong, and who got it right. I can see the graph of that…

We have a few easy questions to test this: 2+2 = (pick your answer); and how did you get here today? (mostly on foot!).

The idea is that it’s a way to get higher-order questioning into a session; otherwise you just hear from the kids that put their hands up all the time. So that’s Plickers… Yes, they all have silly names. I used to live in Iceland, where a committee meets to agree new names – the word for computer means “witchcraft machine”.

So, thinking about Office365/Google/iCloud… We are seeing a video about a school where pupils help promote, manage, code, and support the use of Office365 in the school. And how that’s a way to get people into technology. These are students at Wyndham High in Norfolk – all real students. That school has adopted Office365. Both Office365 and Google offer educational environments. One of the reasons that schools err towards Office365 is the five free copies that students get – which cover the several locations and machines they may use at home.

OneNote is great – you can drag and drop documents… you can annotate… I use it with readings, and with feedback from tutors. Where it’s really useful for students is the facility to create Class Notebooks, where you add classes and notebooks. You can set up a content library that students can access and use. You can also view all of the students’ notebooks in real time. In schools I work in we no longer have planners; instead we have a shared class notebook – then colleagues can see and understand planning.

Other new functionality is “Classroom”, where you can assign classes and assignments… It’s a new thing that brings some VLE functionality, but it is limited in that grades run 0-100. And you can set up forms as well – again in preview right now, but coming. Feedback goes into a CSV file in Excel.

The other thing that is new is Planner – a project planning tool to assign tasks, share documents, set up groups.

So, Office 365 is certainly the tool most secondary schools I work with use.

The other thing that is happening in schools right now is the increasing use of data dashboards and tracking tools – especially in secondary schools – and that is concerning as it’s fairly uncritical. There is a tool called Office Mix which lets you create tracked content in Powerpoint… Not sure if you have access here, but you can use it at home.

Other data-in-schools tools include Power BI… Schools are using these for e.g. attainment outcomes. There is a free schools version of this tool (it used to be too expensive). My concern is that it is not looking at what has impact in terms of teaching and learning. It’s focused on the summative, not the actual teaching and learning, not on students reporting back to teachers on their own learning. Hattie’s work on self-reported grades tells us that students can set expectations and goals, and understand rubrics for self-assessment. There is interesting work to be done on using data in rich and meaningful ways.

In terms of what’s coming… This was supposed to be by 2025, then 2020, maybe sooner… The Education Technology Action Group suggests online learning as an entitlement, better measures of performance, new emerging teaching and learning, wearables, etc.

Emerging EdTech includes Augmented Reality. It’s a big thing I do… It’s easy but it excites students… It’s a digital overlay on reality… So my two-year-old goddaughter has a colouring-in book that is augmented reality – you can then see a 3D virtual dinosaur coloured as per your image. And she asked her dad to send me a picture of her with a dinosaur. Other fun stuff… But where is the learning outcome here? Well, there is a tool called Aurasma… another free tool… You create a new Aura trigger image – it can be anything – and you choose your overlay… So I said I wanted the words on the paper converted into French. It’s dead easy! We get small kids into this and can put loads of hidden AR content around the classroom; you can do it on t-shirts – to show the inner workings of the body, for instance. We’ve had Year 11s bring Year 7 textbooks to life for them – learning at both ends of the spectrum.

The last thing I want to talk about is micro:bit. This is about coding. In England and Wales coding is now a compulsory part of the curriculum. Students are being issued micro:bits and are doing all sorts of creative things with them. The Young Rewired State project runs every summer, and participants come to London to have their code assessed – the winners were 5 and 6 year olds. So they will come to you with knowledge of coding – but they aren’t digital natives, no matter what anyone tells you!

Q&A

Q1 – Me) I wanted to ask about equality of access… How do you ensure students have the devices or internet access at home that they need to participate in these activities and tools – like the Office365 usage at home, for instance? In the RSE Digital Participation Inquiry we found that the reality of internet connectivity in homes really didn’t match up to what students will self-report about their own access to technology or internet connections – there is such baggage associated with not having internet access or the latest technologies and tools… So I was wondering how you deal with that, or if you have any comments on that.

A1) With the contribution schemes that schools have for devices… Parents contribute what they can and the school covers the rest… So that can be 50p or £1 per month; it doesn’t need to be a lot. Also pupil premium money can be used for this. But, yes, parental engagement is important… Many students have 3G access rather than fixed internet, for instance, and that has cost implications… Some can use dongles supplied by schools, but just supporting students like this can cost 15k/yr for a small to medium sized cohort. There is some interesting stuff taking place in new-build schools though… So for instance Gaia in Wales are a technology company doing a lot of the new-build hardware/software set up… In many of those schools there is community wifi access… a way around that issue of connectivity… But that’s a hard thing to solve.

Q1 – Me) There was a proposal some years ago from Gordon Brown’s government, for all school aged children to have government supported internet access at home but that has long since been dropped.

Q2) My fear with technologies is that by the time I learn a tool, it’s already out of date. And there are also learners who are not motivated to engage with tools they haven’t used before… I enjoyed these tools, they’re natty…

A2) Those are my “sweet shop” tools… Actually Office365/Google or things like Moodle are the bread-and-butter tools. These are fun one-off apps… They are pick-up-and-go stuff… but it’s getting the big tools working well that matters. Ignore the sweets if you need or want… The big stuff matters.

And with that Velda is closing with great thanks to our speakers today, to colleagues in IAD, and to Daphne Loads and colleagues. Please do share your feedback and ideas, especially for the next forum!

May 082015
 
Image of surgical student activity data presented by Paula Smith at the Learning Analytics Event

Today I am at the UK Learning Analytics Network organised by the University of Edinburgh in Association with Jisc. Read more about this on the Jisc Analytics blog. Update: you can also now read a lovely concise summary of the day by Niall Sclater, over on the Jisc Analytics blog.

As this is a live blog there may be spelling errors, typos etc. so corrections, comments, additions, etc. are welcome. 

Introduction – Paul Bailey

I’m Paul Bailey, Jisc lead on the Learning Analytics programme at the moment. I just want to say a little bit about the network. We have various bits of project activity, and the network was set up as a means for us to share and disseminate the work we have been doing, but also so that you can network and share your experience of working in Learning Analytics.

Housekeeping – Wilma Alexander, University of Edinburgh & Niall Sclater, Jisc

Wilma: I am from the University of Edinburgh and I must say I am delighted to see so many people who have traveled to be here today! And I think for today we shouldn’t mention the election!

Niall: I’m afraid I will mention the election… I’ve heard that Nicola Sturgeon and Alex Salmond have demanded that Tunnock’s Teacakes and Caramel Wafers must be served at Westminster! [this gets a big laugh as we’ve all been enjoying caramel wafers with our coffee this morning!]

I’ll just quickly go through the programme for the day here. We have some really interesting speakers today, and we will also be announcing the suppliers in our learning analytics procurement process later on this afternoon. But we kick off first with Dragan.

Doing learning analytics in higher education: Critical issues for adoption and implementation – Professor Dragan Gašević, Chair in Learning Analytics and Informatics, University of Edinburgh

I wanted to start with a brief introduction on why we use learning analytics. The use of learning analytics has become something of a necessity because of the growing needs of education – the growth in the number of students and the diversity of students, with MOOCs being a big part of that realisation that many people want to learn who do not fit our standard idea of what a student is. The other aspect of MOOCs is their scale: as we grow the number of students it becomes difficult to track progress, and the feedback loops between students and instructors are lost or weakened.

In learning analytics we depend on two types of major information systems… Universities have had student information systems for a long time (originally paper, computerised 50-60 years ago), but they also use learning environments – the majority of universities have some online coverage of this kind for 80-90% of their programmes. But we also don’t want to exclude other platforms, including communications and social media tools. And no matter what we do with these technologies we leave a digital trace, and that is not a reversible process at this point.

So, we have all this data, but what is the point of learning analytics? It is about using machine learning, computer science, etc. approaches in order to inform education. We defined learning analytics as the “measurement, collection, analysis, and reporting” of data about education, but actually that “how” matters less than the “why”. It should be about understanding and optimising learning and the environments in which learning occurs. And it is important not to forget that learning analytics are there to understand learning.

Some case studies include Course Signals at Purdue. They use Blackboard for their learning management system. They wanted to predict which students would successfully complete their courses, and to identify those at risk. They wanted to segment their students into high risk, at risk, or not at risk at all. Having done that, they used a traffic light system to reflect it, and that traffic light was shown both to staff and students. When they trialled this (Arnold and Pistilli 2012) with a cohort of students, they saw greater retention and success. But if we look back at how I framed this, we need to think about whether this changes teaching…
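To make the segmentation concrete, here is a minimal Python sketch – the score and the thresholds are my own assumptions for illustration, as the talk does not describe Purdue’s actual model or cut-offs:

    def traffic_light(risk_score, amber=0.4, red=0.7):
        """Map a model's risk score in [0, 1] to a Course Signals-style flag.

        The thresholds here are illustrative only.
        """
        if risk_score >= red:
            return "red"    # high risk
        if risk_score >= amber:
            return "amber"  # at risk
        return "green"      # not at risk

    print(traffic_light(0.85))  # -> red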

So, also at Purdue, they undertook a project analysing the email content sent by instructors to students. They found that instructors did not give more detailed formative feedback; they just increased the summative feedback. So this really indicates that learning analytics has to feed into changes in teaching practices in our institutions, and we need our learning analytics to provide signalling and guidance that enables teaching staff to improve their practice and give more constructive feedback. (see Tanes, Arnold, King and Remnet 2011)

The University of Michigan looked at “gateway courses” as a way to understand performance in science courses (see Wright, McKay, Hershock, Miller and Triz 2014). They defined a measure for their courses: “better than expected”. There were two inputs to this: previous GPA, and the goals set by students for the current course. They then used predictive models for how students could be successful, and looked for ways to help students perform better than expected. They have also been using technology designed for behavioural change, which they put to use here… Based on that work they generated personalised messages to every student, with a rationale for each student, and also provided predicted performance for particular students. For instance, an example here showed that a student could perform well beyond their own goals, which might have been influenced by the science course not being their major. The motivator for students here was productive feedback… They interviewed successful students from previous years, used that to identify behaviours etc. that led to success, and presented that as feedback from peers (rather than instructors). And I think this is a great way to show how we can move away from very quantitative measures, towards qualitative feedback.
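A rough sketch of how a “better than expected” measure might be computed from those two inputs – combining them via max() is my assumption, not Michigan’s published model:

    def better_than_expected(actual_grade, predicted_from_gpa, student_goal):
        """Toy version of the 'better than expected' measure.

        The talk says the measure combined prior-GPA predictions with the
        goal the student set; how they are combined is assumed here.
        """
        expected = max(predicted_from_gpa, student_goal)
        return actual_grade - expected

    print(round(better_than_expected(3.8, 3.2, 3.5), 2))  # -> 0.3 above expectation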

So, to what extent are institutions adopting these approaches? Well, there are very few institutions with institution-wide examples of adoption. For instance, the University of Michigan only used this approach on first year science courses. They are quite a distributed university – like Edinburgh – which may be part of the explanation. Purdue also only used this on some courses.

Siemens, Dawson and Lynch (2014) surveyed the use of learning analytics in the HE sector, asking about the level and type of adoption, ranking these from “Awareness” to “Experimentation” to “Organisation/Students/Faculty”, “Organisational Transformation” and “Sector Transformation”. Siemens et al found that the majority of HE is at the Awareness and Experimentation phases. Similarly Goldstein and Katz (2005) found 70% of institutions at phase 1; adoption is higher now, but bear in mind that being past phase 1 doesn’t mean an institution is far along the process. There is still much to do.

So, what is necessary to move forward? What are the next steps? What do we need to embrace in this process? Well, let’s talk a bit about direction… The metaphors from business analytics can be useful; we can borrow lessons from that process. McKinsey offered a really interesting business model: Data – Model – Transform (see Barton and Court 2012). That can be a really informative process for us in higher education.

Starting with Data – traditionally when we choose to measure something in HE we turn to surveys, particularly student satisfaction surveys. But these do not have a huge return rate in all countries and, more importantly, surveys are not always accurate. We also have progress statistics – they are in our learning systems as data, but are they useful? We can also derive social networks from these systems, from interactions and from course registration systems – and knowing who students hang out with can predict how they perform. So we can get this data, but then how do we process and understand it? I know some institutions find a lack of IT support can be a significant barrier to the use of learning analytics.
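As an illustration of deriving a social network from course registration data, a small sketch using the networkx library – the data and the “shared course” projection are assumptions for the example, not the method any institution named here actually uses:

    import networkx as nx
    from networkx.algorithms import bipartite

    # Hypothetical (student, course) registration pairs.
    registrations = [
        ("alice", "maths101"), ("bob", "maths101"),
        ("bob", "stats202"), ("carol", "stats202"),
    ]

    B = nx.Graph()
    B.add_nodes_from({s for s, _ in registrations}, bipartite=0)
    B.add_nodes_from({c for _, c in registrations}, bipartite=1)
    B.add_edges_from(registrations)

    # Project onto students: an edge means "registered on a shared course",
    # weighted by how many courses the pair share.
    students = {s for s, _ in registrations}
    G = bipartite.weighted_projected_graph(B, students)
    print(list(G.edges(data=True)))  # alice-bob and bob-carol, weight 1 each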

Moving on to Model… Everyone talks about predictive modelling, but the question has to be about the value of a predictive model. Often organisations just see this as an outsourced thing – relying on some outside organisation and data model that provides solutions, but does not do that within the context of understanding what the questions are. And the questions are critical.

And this is, again, where we can find ourselves forgetting that learning analytics is about learning. So there are two things we have to know about, and think about, to ensure we understand what analytics mean:

(1) Instructional conditions – different courses in the same school, or even in the same programme, will have a different set of instructional conditions – different approaches, different technologies, different structures. We did some research at a university through their Moodle presence and we found some data that was common to 20-25% of courses, but we did identify some data you could capture that was totally useless (e.g. time online). And we found some approaches that explained 80% of variance, for example extensive use of Turnitin – not just for plagiarism but also by students for gathering additional feedback. One of our courses defied all trends… they had a Moodle presence, but when we followed up on this we found that most of their work was actually in social media, so data from Moodle was quite misleading and certainly a partial picture. (see Gasevic, Dawson, Rogers, Gasevic, 2015)

(2) Learner agency – this changes all of the time. We undertook work on the agency of learners, based on log data from a particular course. We explored 6 clusters using cluster matching algorithms… We found that there was a big myth that more time on task would lead to better performance… One of our clusters spent a great deal of time online, another was well below average. When we compared clusters we found the top students were the group spending the least time online, while the cluster spending the most time online performed averagely. This shows that this is a complex question. Learning styles aren’t the issue; learning profiles are what matters here. In this course, one profile works well; in another, a different profile might work much better. (see Kovanovic, Gasevic, Jok… 201?)
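A minimal sketch of this kind of clustering on log-derived features, using scikit-learn – the features and toy data are assumptions, and the talk does not specify which algorithm was actually used:

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    # Hypothetical per-student features pulled from VLE logs:
    # [hours online, number of sessions, forum posts, resources viewed]
    X = np.array([
        [40, 25,  3, 120],
        [12, 30, 15, 200],
        [55, 10,  1,  60],
        [ 8, 22, 12, 180],
        [35, 28,  4, 150],
        [10, 26, 14, 190],
    ])

    # Scale first so "hours online" doesn't dominate the distance metric.
    X_scaled = StandardScaler().fit_transform(X)

    # The talk mentions six clusters; two are enough for this toy data.
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_scaled)
    print(labels)  # cluster membership reflects profiles, not "more time = better"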

And a conclusion for this section is that our analytics and analysis cannot be generalised.

Moving finally to Transform, we need to ensure participatory design of analytics tools – we have to engage our staff and students in these processes early on; we won’t get institutional transformation by relying on the needs of statisticians. Indeed visualisations can be harmful (Corrin and de Barba 2014). The University of Melbourne looked at the use of dashboards and similar systems, and they reported that students who were high achieving, with high GPAs and high aspirations, actually under-performed when they saw that they were doing better than average, or better than their goals. And for those doing less well we can just reinforce issues in their self-efficacy. So these tools can be harmful if not designed in a critical way.

So, what are the realities of adoption? Where are the challenges? In Australia I am part of a study commissioned by the Australian Government in South Australia, engaging with the entire Australian tertiary sector. We interviewed every VC and the managers responsible for learning analytics. Most are in phase 1 or 2… Their major goal was to enable personalised learning – the late phases… They seemed to think that they would magically move from experimentation to personalised learning; they don’t seem to understand the process needed to get there…

We also saw some software-driven approaches: institutions buy an analytics programme and perceive the job as done.

We also see a study showing that there is a lack of a data-informed decision making culture, and/or data not being suitable for informing those types of decisions. (Macfadyen and Dawson 2012).

We also have an issue that researchers are not focused on scalability… Lots of experimentation but… I may design beautiful scaffolding based on learning analytics, but I have to think about how that can be scaled up to people who may not be the instructors, for instance.

The main thing I want to share here is that we must embrace the complexity of educational systems. Learning analytics can be very valuable for understanding learning but they are not a silver bullet. For institutional or sectoral transformation we need to embrace that complexity.

We have suggested the idea of the Rapid Outcome Mapping Approach (ROMA) (Macfadyen, Dawson, Pardo, Gasevic 2014), in which, once we have understood the objectives of learning analytics, we also have to understand the political landscape in which they sit, and the financial contexts of our organisations. We have to identify stakeholders, and to identify the desired behaviour changes we want from those stakeholders. We also have to develop an engagement strategy – we cannot mandate a single approach; a solution has to provide incentives for why someone should or should not adopt learning analytics. We have to analyse our internal capacity to effect change – especially in the context of analytics tools and taking any value from them. And we finally have to evaluate and monitor change. This is about capacity development, and capacity development across multiple teams.

We need to learn from successful examples – and we have some to draw upon. The Open University adopted their organisational strategy, and were inspired by the ROMA approach (see Tynan and Buckingham Shum 2013). They developed the model of adoption that is right for them – other institutions will want to develop their own, aligned to their institutional needs. We also need cross-institutional experience sharing and collaboration (e.g. SOLAR, the Society for Learning Analytics Research). This meeting today is part of that. And whilst there may be some competition between institutions, this process of sharing is extremely valuable. There are various projects here, some open source, to enable different types of solution, and sharing of experience.

Finally, we need to talk about ethical and privacy considerations. There is a tension here… Some institutions hold data and think students need to be aware of the data held… But what if students do not benefit from seeing that data? How do we prepare students to engage with that data, to understand it? The Open University is at the leading edge here and has a clear policy on the ethical use of student data. Jisc also has a code of practice for learning analytics, which I welcome and think will be very useful for institutions looking to adopt learning analytics.

I also think we need to develop an analytics culture. I like to use the analogy of, say, Moneyball, where analytics make a big difference… but analytics can be misleading. Predictive models have their flaws, their false positives etc. So a contrasting example would be the Trouble with the Curve – where analytics mask underlying knowledge of an issue. We should never reject our tacit knowledge as we look at adopting learning analytics.

Q&A

Q – Niall) I was struck by your comments about asking the questions… But doesn’t that jar with the idea that you want to look at the data and explore questions out of that data?

A – Dragan) A great question… As a computer scientist I would love to just explore the data, but I hang out with too many educational researchers… You can start from data and make sense of that. It is valid. However, whenever you have certain results you have to ask certain questions – does this make sense in the context of what is taking place, does this make sense within the context of our institutional needs, and does this make sense in the context of the instructional approach? That questioning is essential no matter what the approach.

Q – Wilma) How do you accommodate the different teaching styles and varying ways that courses are delivered?

A – Dragan) The most important part here is the development of capabilities – at all levels and in all roles, including students. So in this Australian study we identified trends, found these clusters… But some of these courses are quite traditional and linear, others are more ambitious… They have a brilliant multi-faceted approach. Learning analytics would augment this… But when we aggregate this information… the more ambitious the goals, the more there is to do. Time is required to adopt learning analytics with sophistication. But we also need to develop tools suited to the needs and tasks of stakeholders… so stakeholders are capable of working with them… There aren’t that many data scientists, so perhaps we shouldn’t use visualisations at all – maybe just prompts triggered by the data… And we also want to see more qualitative insights into our students… their discussion… when they are taking notes… That then really gives an insight… Social interactions are so beneficial and important to student learning.

Q – Wilbert) You mentioned that work in Australia about Turnitin… What was the set-up there that led to that… Or was it just the plagiarism detection use?

A – Dragan) It turned out to be the feedback being received through Turnitin… not the plagiarism side. Primarily it was on the learner side, not so much the instructors’. There is an ethical dilemma there if you do expose that to instructors… if students are using the system to get feedback… Those were year one students, and many were international students from Asia and China, where cultural expectations around reproducing knowledge are different… So that is also important.

Q) Talking about the Purdue email study, and staff giving formative feedback to students at risk – how did that work?

A) They did analysis of those messages and their content, and found staff mainly giving motivational messages. I think that was mainly because the traffic light system indicated the at-risk status but not why that was the case… you need that information too…

Q) I was interested in the rhetoric of personalised learning from Vice Chancellors, while most institutions are at stage 1 or 2… What are the institutional blockers? How can they be removed?

A) I wish I had an answer there! But senior leaders are sometimes forced to make decisions based on financial needs, not just driven by data or unaware of data. Many Australian institutions are small organisations, with limited funding… and the very existence of the institution is part of what they have to face, quite aside from the adoption of learning analytics. But also the University of Melbourne is a complex institution; there is a leading researcher there, but they cannot roll out the same solution across very different schools and courses…

Niall: And with that we shall have to end the Q&A and hand over to Sheila, who will talk about some of those blockers…

Learning Analytics: Implementation Issues – Sheila MacNeill, Glasgow Caledonian University

I was based at CETIS, involved in learning analytics for a lot of that time… But for the last year and a half I have been based at Glasgow Caledonian University. And today I am going to talk about my experience of moving from that overview position to being in an institution and actually trying to do it… I’m looking for a bit of sympathy and support, but hoping also to contextualise some of what Dragan talked about.

Glasgow Caledonian University has about 17,000 students, mostly campus based although we are looking at online learning. We are also committed to blended learning. We provide central support for the university, working with learning technologies across the institution. So I will share my journey… joys and frustrations!

One of the first things I wanted to do was to get my head around what kind of systems we had around the University… We had a VLE (Blackboard) but I wanted to know what else people were using… This proved very difficult. I spoke to our IS department, but finding the right people was challenging – a practical issue to work around. So I decided to look a bit more broadly with a mapping of what we do… looking across our whole technology position. I identified the areas and what fitted into each:

  • (e)Assessment and feedback – Turnitin (we see a lot of interest in rubrics and in marking and feedback processes that seem to be having a big impact on student success – plagiarism detection actually isn’t its main usefulness the more you use it), Grade Centre, wikis/blogs/journals, peer/self assessment, (e)feedback.
  • (e) Portfolios – wikis/blogs/journals, video/audio – doing trials with nursing students of a mobile app in this space.
  • Collaboration – discussion boards, online chat, video conferencing etc.
  • Content – lectures, PDFs, etc….

I’ve been quite interested in Mark (?) idea of a “core VLE”. Our main systems group around the SRS (student records system – newly renamed from its former name, ISIS), GCU Learn, the Library, and 3rd party services. When I did hear from our IS team I found such a huge range of tools that our institution has been using – it seems like every tool under the sun has been used at some point.

In terms of data… we can get data from our VLE, from Turnitin, from wikis etc., but it needs a lot of cleaning up. We started looking at our data, trying it on November data from 2012 and 2013 (it seemed like a typical month). And we found some data we would expect – changes/increases in use over time. But we don’t have data at a module level, or programme level, etc. It is hard to view in detail or aggregate up (yet). We haven’t got data from all of our systems yet. I would say we are still at the “housekeeping” stage… We are just seeing what we have, finding a baseline… There is an awful lot of housekeeping that needs to be done, and a lot of people to talk to…

But as I was beginning this process I realised we had quite a number of business analysts at GCU who were happy to talk. We have been drawing out data. We can make dashboards easily, but USEFUL dashboards are proving more tricky! We have meanwhile been talking to Blackboard about their data analytics platform. It is interesting thinking about that… given where we are with learning analytics, and finding a baseline, we are looking at investing some money to see what data we can get from Blackboard that might enable us to start asking some questions. There are some things I’d like to see from, for example, combining on-campus library card data with VLE data. And also thinking about engagement and what that means… Frustratingly for me, it is quite hard to get data out of Blackboard… I’m keen that the next licence we sign actually has a clause about the data we want, in the format we want, when we want it… No idea if that will happen, but I’d like to see it.
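To show what that kind of housekeeping join might look like, a small pandas sketch – the column names and data are invented for the example:

    import pandas as pd

    # Hypothetical extracts; real column names will vary by system.
    vle = pd.DataFrame({
        "student_id": [1, 2, 3],
        "vle_logins_nov": [42, 7, 19],
    })
    library = pd.DataFrame({
        "student_id": [1, 2, 4],
        "card_swipes_nov": [15, 2, 9],
    })

    # An outer join keeps students who appear in only one system --
    # exactly the kind of gap the housekeeping stage has to surface.
    combined = vle.merge(library, on="student_id", how="outer")
    print(combined)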

Mark Stubbs (MMU) has this idea of a tube map of learning… This made me think of the Glasgow underground map – going in circles a bit, not all joining up. We really aren’t quite there yet; we are having conversations about what we could, and what we should, do. In terms of senior management interest in learning analytics… there is interest. And when we sent out the data we had looked at, we did get some interesting responses. Our data showed a huge increase in mobile use – we didn’t need a bring-your-own-device policy, students were already doing it! We just need everything mobile-ready. Our senior staff are focused on NSS and student survey data; that’s a major focus. I would like to take that forward to understand what is happening, in a more structured way…

And I want to finish by talking about some of the issues that I have encountered. I came in fairly naively to this process. I have learned that…

Leadership and understanding is crucial – we have a new IS director, which should make a great difference. You need both carrots and sticks, and that takes real drive from the top to make things actually start.

Data is obviously important. Our own colleagues have issues accessing data from across the institution. People don’t want to share; they don’t know if they are allowed to share. There is a cultural thing that needs investigating – and that relates back to leadership. There are also challenges that are easy to fix, such as server space. But that bigger issue of access/sharing/ownership really matters.

Practice can be a challenge. Sharing of experience and engagement with staff, having enough people understanding systems, is all important for enabling learning analytics here. The culture of talking together more, having a better relationship within an institution, matters.

Specialist staff time matters – as Dragan highlighted in his talk. This work has to be prioritised – a project focusing on learning analytics would give the remit for that, give us a clear picture, and establish what needs to be done. Not just buying in technology, but truly assessing needs before doing that and before adopting technology. There is potential, but learning analytics has to be a priority if it is to be adopted properly.

Institutional amnesia – people can forget what they have done, why, and why they did not do it before… more basic housekeeping again, really. Understanding, and having tangible evidence of, what has been done and why is also important more broadly when looking at how we use technologies in our institutions.

Niall: Thanks for such an honest appraisal of a real experience there. We need that in this community, not just explaining the benefits of learning analytics. The Open University may be ahead now, but it also faced some of those challenges initially for instance. Now, over to Wilma.

Student data and Analytics work at the University of Edinburgh – Wilma Alexander, University of Edinburgh

Some really interesting talks already today – I’ll whiz through some sections, in fact, as I don’t need to retread that ground. I am based in Information Services. We are a very, very large, very old university, and it is very general. We have a four year degree. All of that background makes what we do with student data hard to generalise about.

So, the drivers for the project I will focus on came out of the understanding we already have about the scale and diversity of this institution. Indeed we are increasingly encouraging students to make imaginative crossovers between schools and programmes, which adds to this. Another part of the background is that we have been seriously working in online education: in addition to a ground-breaking digital education masters delivered online, we also have a number of online masters. And further background here is that we have a long-term set of processes that encourage students to contribute to discussions within the university, as owners and shapers of their own learning.

So, we have an interest in learning analytics, and in understanding what students are doing online. We got all excited by the data and probably made the primary error of thinking about how we could visualise that data in pretty pictures… but we calmed down quite quickly. As we turned this into a proper project we framed it much more in the context of empowering students around their activities, using data we already hold about our students. We have two centrally supported VLEs at Edinburgh (and others!): Blackboard Learn, our largest system – virtually all on-campus programmes use it in some way – and Moodle, which we took the opportunity to try out for online distance programmes, largely created as online distance masters programmes. So, already there is a big distance between how these tools are used in the university, never mind how they are adopted.

There’s a video which shows this idea of building an airplane whilst in the air… this project’s first phase, in 2014, has felt a bit like that at times! We wanted to see what might be possible, but we started by thinking about what might be displayed to students. Both Learn and Moodle give you some data about what students do in your courses… but that is for staff, not visible to students. When we came to looking at the existing options, none of what Learn offers quite did what we wanted, as none of the reports were easily made student-facing (currently Learn does BIRT reports, course reports, stats columns in grade center etc). We also looked at Moodle and there was more there – it is open source and developed by the community, so we looked at the available options…

We were also aware that there were things taking place in Edinburgh elsewhere. We are support not research in our role, but we were aware that colleagues were undertaking research. So, for instance my colleague Paula Smith was using a tool to return data as visualisations to students.

What we did as a starting point was to go out and collect user stories. We were asking both staff and students what sort of things, in terms of information available in the VLE(s), would be of interest. We framed this as “As a… I want to… So that I can…” – as a student, as a member of staff, as a tutor. We had 92 stories from 18 staff and 32 students. What was interesting was that much of what was wanted was already available. For staff, much of the data they wanted was already there – they really just had to be shown and supported to find it. Some of the stuff that came in was “not in scope” – outside the very tight boundaries we had set for the project. But a number of things of interest, requests for information, we passed on to appropriate colleagues – one area for this was reading lists, and we have a tool that helps with that, so we passed that request on to library colleagues.

We also pooled some staff concerns… and this illustrates what both Dragan and Sheila have said about the need to improve the literacy of staff and students using this kind of information, and the need to contextualise it. E.g. “As a teacher/personal tutor I want to have measures of activity of the students so that I can be alerted to who is falling by the wayside” – there is a huge gap between activity and that sort of indicator.

Student concerns were very thoughtful. They wanted to understand how they compare, to track progress, they also wanted information on timetables of submissions, assignment criteria/weighting etc. We were very impressed by the responses we had and these are proving valuable beyond the scope of this project…

So, we explored possibilities, and then moved on to see what we could build. And this is where the difference between Learn and Moodle really kicked in. We initially thought we could just install some of the Moodle plugins and allow programmes to activate them if they wanted to… But that fell at the first hurdle, as we couldn’t find enough staff willing to be that experimental with a busy online MSc programme. The only team up for some of that experimentation were the MSc in Digital Education team, where this was done as part of a teaching module in some strands of the masters. This was small-scale and hand-cranked from some of these tools. One of the issues with pretty much all of these tools is that they are staff-facing and therefore not anonymous. So we had to do that hand-cranking to make the data anonymous.

We had lots of anecdotal and qualitative information through focus groups and this module, but we hope to pin a bit more down on that. Moodle is of interest because, for online distance students, there is some evidence that communication and discussion activity is a reasonable proxy for performance, as they have to start with the VLE.

Learn is a pretty different beast as it is on campus; blended learning may not have permeated as strongly there. So, for Learn we do have this little element that produces a little click map of sorts (engagements, discussion, etc.)… For courses that only use the VLE for lecture notes that may not be useful at all, but for others it should give some idea of what is taking place. We also looked at providing guidebook data – mapping use of different weeks’ sections/resources/quizzes to performance.

We punted those ideas out. The activity information didn’t excite folk as much (32% thought it was useful). The grade information was deemed much more useful (97% thought it was useful)… But do we want our students hooked on that sort of data? Could it have negative effects, as Dragan talked about? And how useful is that overview?

When it came to changes in learning behaviour we had some really interesting and thoughtful responses here. Of the three types of information (discussion boards, grade, activity) it was certainly clear though that grade was where the student interest was.

We have been looking at what courses use in terms of tools… doing a very broad-brush view of 2013/14 courses, we can see what they use and turn on. For social/peer network features – where we think there really is huge value – the percentage of courses actively using them on campus is way below those using the VLE for the other functions of content+submission/assessment and discussion boards.

So context really is all – reflecting Dragan again here. It has to work for individuals on a course level. We have been mapping our territory here – the university as a whole is hugely engaged in online and digital education in general, and very committed to this area, but there is work to do to join it all up. When we did information gathering we found people coming out of the woodwork to show their interest. The steering group from this project has a representative from our student systems team, and we are talking about where student data lives, privacy and data protection, ethics, and of course also technical issues quite apart from all that… So we also have the Records Management people involved. And because Jisc has these initiatives, and there is an EU initiative, we are tightly engaging with the ethical guidance being produced by both of these.

So, we have taken a slight veer from doing something for everyone in the VLEs in the next year. The tool will be available to all but what we hope to do is to work very closely with a small number of courses, course organisers, and students, to really unpick on a course level how the data in the VLE gets built into the rest of the course activity. So that goes back into the idea of having different models, and applying the model for that course, and for those students. It has been a journey, and it will continue…

Using learning analytics to identify ‘at-risk’ students within eight weeks of starting university: problems and opportunities – Avril Dewar, University of Edinburgh

This work I will be presenting has been undertaken with my colleagues at the Centre for Medical Education, as well as colleagues in the School of Veterinary Medicine and also Maths.

There is good evidence that performance in first year will map quite closely to performance as a whole in a programme. So, with that in mind, we wanted to develop an early warning system to identify student difficulties and disengagement before they reach assessment. Generally the model we developed worked well: about 80% of at-risk students were identified. And there were large differences between the most and least at-risk students – between the lowest risk score and the highest – which suggests this was a useful measure.

The measures we used included (a toy scoring sketch follows this list):

  • Engagement with routine tasks
  • Completion of formative assessment – including voluntary formative assessment
  • Tutorial attendance (and punctuality where available) – but this proved least useful.
  • Attendance at voluntary events/activities
  • Virtual Learning Environment (VLE) exports (some)
    • Time until first contact proved to be the most useful of these
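The sketch promised above – a toy cumulative scoring function with invented weights, since the actual model has not been published yet:

    # Hypothetical weights -- the actual cumulative model isn't published yet.
    WEIGHTS = {
        "missed_routine_tasks": 2.0,
        "skipped_formative_assessments": 1.5,
        "missed_tutorials": 0.5,               # attendance proved least useful
        "missed_voluntary_events": 1.0,
        "weeks_until_first_vle_contact": 2.5,  # the most useful VLE export
    }

    def risk_score(student):
        """Sum weighted engagement indicators into one cumulative score."""
        return sum(weight * student.get(measure, 0)
                   for measure, weight in WEIGHTS.items())

    print(risk_score({"missed_routine_tasks": 3,
                      "weeks_until_first_vle_contact": 2}))  # -> 11.0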

We found that the measures sometimes failed because the data exports were not always useful or appropriate (e.g. VLE tables of 5,000 columns). Patterns of usage were hard to investigate, as raw data on, e.g., time of day of accesses was not properly usable, though we think that would be useful. Similarly there is no way to know if a long usage session means a student has logged in, then Googled or left their machine, then returned – or whether it indicates genuine engagement.

To make learning analytics useful we think we need the measures, and the data supporting them, to be:

  • simple and comprehensible
  • accessible – and comparable to data from other systems (e.g. we could have used library data alongside our VLE data)
  • easy to scale – e.g. common characteristics between schools
  • not replicating existing measures
  • able to discriminate between students – some of the most useful things, like time to first contact, do this
  • centrally stored

We also found there were things that we could access but didn’t use – some for ethical and some for practical reasons. IP addresses for location were an ethical issue for us; we had similar concerns about discussion boards – we didn’t want students to be put off participating in discussions. Likewise the time taken to answer individual questions. Theoretical issues that could be raised include: evidence that a student has been searching essay-buying websites; a student who is absent from class and claims to be ill, but whose IP address shows another location; etc.

There were also some concerns about the teacher-student relationship. Knowing too much can create a tension in the student-teacher relationship. And the data one could gather about a student could become a very detailed tracking and monitoring system… for that reason we always aim to be conservative, rather than exhaustive in our data acquisition.

We have developed training materials and we are making these open source so that we can partner with other schools, internationally. Each school will have its own systems and data, but we are keen to share practice and approaches. Please do get in touch if you would like access to the data, or would like to work with us.

Q&A

Q – Paula) Do you think there is a risk of institutions sleepwalking into student dissatisfaction? We are taking a staged approach… but I see less effort going into intervention, into the staff side of what could be done… I take it that email was automated… Scalability is good for that, but I am concerned students won’t respond to it as it isn’t really personalised at all. And how were students in your project, Avril, notified?

A – Avril) We did introduce peer-led workshops… We are not sure if that worked yet – still waiting for the results of those. We emailed our students to ask if they wanted to be part of this and if they wanted to be notified of a problem. Later years were less concerned and saw the value; first year students were very concerned, so we phrased our email very carefully. When a student was at risk, emails were sent individually by their personal tutors. We were a bit wary of telling students what had flagged them up – it was a cumulative model… we were concerned that they might then engage just with those things and then not be picked up by the model.

Niall: Thank you for that fascinating talk. Have you written it up anywhere yet?

Avril: Soon!

Niall: And now to Wilbert…

The feedback hub; where qualitative learning support meets learning analytics – Wilbert Kraan, Cetis

Before I start I have heard about some students gaming some of the simpler dashboards so I was really interested in that.

So, I will be short and snappy here. The Feedback Hub work has just started… this is musings and questions at this stage. This work is part of the larger Jisc Electronic Management of Assessment (EMA) project. And we are looking at how we might present feedback and learning analytics side by side.

The EMA project is a partnership between Jisc, UCISA and HeLF. It builds on earlier Jisc Assessment and Feedback work, and it is a co-design project that identifies priorities and solution areas… and we are now working on solutions. So one part of this is about EMA requirements and workflows, particularly the integration of data (something Sheila touched upon). There is also work taking place on an EMA toolkit that people can pick up and look at. And then there is the Feedback Hub, which I’m working on.

So, there is a whole assessment and feedback lifecycle (borrowed from a model developed by Manchester Metropolitan, with their permission). This goes from Specifying to Setting, Supporting, Submitting, Marking and production of feedback, Recording of grades etc… and those latter stages are where the Feedback Hub sits.

So, what is a feedback hub really? It is a system that provides a degree-programme or life-wide view of assignments and feedback. The idea is that it moves beyond the current module that you are doing, to look across modules and across years. There will be feedback that is common across areas, giving a holistic view of what has already been done. So this is a new kind of thing… When I looked at the nearest existing tools I found VLE features – a database view of all assignments for a particular student, for learner and tutor to see: a simple clickable list that is easy to do and does help. Another type is a tutoring or assignment management system – capturing timetables of assignments, tutorials etc. These are from the tutor’s perspective; some show feedback as well. And then we have assignment services – including Turnitin – about plagiarism, but also the management of the logistics of assignments, feedback etc.

So, using those kinds of tools you can see feedback as just another thing that gets put in the learning records store pot, in some ways. But feedback can be quite messy: it is hard to disentangle in-line feedback from the document itself, and teachers approach feedback differently… though pedagogically the qualitative formative feedback that appears in these messy ways can be hugely valuable. Also, these online assessment management tools can be helpful for mapping and developing learning outcomes and rubrics – connecting those to the assignment, you can gain some really interesting data… There is also the potential for Computer Aided Assessment feedback – sophisticated automated data on tests and assignments, which works well in some subjects. And possibly one of the most interesting learning analytics data sources is engagement with feedback. A concern from academic staff is that you can give rich feedback, but if the students don’t use it, how useful is it really? So capturing that could be useful…

So, having identified those sources, how do we present such a holistic view? One tool presents this as an activity stream – like Twitter and Facebook – with feedback part of that chronological list of assignments… We know that that could help. Also an expanding learning outcomes rubric – click it to see the feedback connected to it; would that be helpful? We could also do text extraction, something like Wordle, but would that help? Another thing we might see is clickable grades – to understand what a grade means… And finally, should we combine the feedback hub with analytics data visualisations?
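As a sketch of what the activity-stream option might look like as a data structure – the field names and example entries are my assumptions, not the Feedback Hub’s actual design:

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class FeedbackItem:
        """One entry in a programme-wide feedback stream (fields are assumed)."""
        module: str
        assignment: str
        released: date
        feedback: str

    stream = [
        FeedbackItem("EDU101", "Essay 1", date(2014, 10, 3), "Tighten the argument."),
        FeedbackItem("EDU102", "Essay 2", date(2015, 2, 14), "Much clearer structure."),
    ]

    # An activity stream is just this list in reverse-chronological order,
    # crossing module and year boundaries.
    for item in sorted(stream, key=lambda i: i.released, reverse=True):
        print(f"{item.released} {item.module}/{item.assignment}: {item.feedback}")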

Both learning analytics and feedback track learning progress over time, and try to predict the future. Feedback related data can be a useful learning analytics data source.

Q&A

Q – Me) Adoption and issues of different courses doing different things? Student expectations and added feedback?

A) This is an emerging area… IET in London/University of London have been trialing this stuff… they have opened that box… Academic practice can make people very cautious…

Comment) It might also address the perennial student question of wanting better quality feedback… It might address a deficit in student satisfaction.

A) Having a coordinated approach to feedback… From a pedagogical point of view that would help. But another issue there is that of formative feedback, people use these tools in formative ways as well. There are points of feedback before a submission that could be very valuable, but the workload is quite spectacular as well. So balancing that could be quite an interesting thing.

Jisc work on Analytics – update on progress to date – Paul Bailey, Jisc and Niall Sclater

Paul: We are going to give you a bit of an update on where we are on the Learning Analytics project, and then after that we’ll have some short talks and then will break out into smaller groups to digest what we’ve talked about today.

The priorities we have for this project are: (1) basic learning analytics solution, an interventions tool and a student tool; (2) code of practice for learning analytics; and (3) learning analytics support and network.

We are a two year project, with the clock ticking from May 2015. We have started by identifying suppliers to initiate contracts and develop products; then institutions will be invited to participate in the discovery stage or pilots (June-Sept 2015). In Year 1 (Sept 2015-2016) we will run that discovery stage (10-20 institutions) and pilots (10+ institutions); institutions move from discovery to pilot. Year 2 will be about learning from and embedding that work. And for those of you who have worked with us in the past, the model is a bit different: rather than funding you and then learning from that, we will be providing you with support and some consultancy, and learning from this as you go.

Michael Webb: So… we have a diagram of the process here… We have procured a learning records warehouse (the preferred supplier there is H2P). The idea is that VLEs, Student Information Systems and Library Systems feed into that. There was talk today of Blackboard being hard to get data out of; we do have Blackboard on board.

Diagram of the Jisc Basic Learning Analytics Solution presented by Paul Bailey and Michael Webb

Paul: Tribal are one of the solutions – pretty much off-the-shelf stuff, with various components – and we hope to roll it out to about 15 institutions in the first year. The second option will be the open solution, which is partly developed but needs further work. So the option will be to engage with either one of those solutions, or perhaps with both.

The learning analytics processors will feed the staff dashboards, into a student consent service, and both of those will connect to the alert and intervention system. And there will be a Student App as well.

Michael: The idea is that all of the components are independent so you can buy one, or all of them, or the relevant parts of the service for you.

Paul: The student consent service is something we will develop to allow students to say what kinds of information can or cannot be shared, of the available data from those systems that hold data on them. The alert and intervention system is an area that should grow quite a bit…
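A toy sketch of what a consent record and check might look like – the Jisc service’s real schema isn’t described in the talk, so everything here is an assumption:

    # Hypothetical consent record; the real service's schema isn't described.
    consent = {
        "student_id": "s1234567",
        "vle_activity": True,     # may feed the analytics processor
        "library_usage": True,
        "ip_location": False,     # withheld by the student
    }

    def may_use(consent_record, data_source):
        """Default to False: data with no recorded consent is never shared."""
        return consent_record.get(data_source, False)

    assert may_use(consent, "vle_activity")
    assert not may_use(consent, "ip_location")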

So, the main components are the learning records warehouse, the learning analytics processor – for procurement purposes the staff dashboard is part of that – and the student app. And once that learning records warehouse is there, you could build onto it, use your own system, use Tableau, etc.

Just to talk about the Discovery Phase – we hope to start that quite soon. The invitation will come out through the Jisc Analytics email list – so if you want to be involved, join that list. We are also setting up a questionnaire to collect readiness information and for institutions to express interest. Then in the discovery process (June/July onward) institutions will select a preferred approach for the discovery phase. This will be open to around 20 institutions. We have three organisations involved here: Blackboard; a company called DTP Solution Path (as used by Nottingham Trent); and UniCom. For the pilot (September(ish) onward) institutions will select a solution preference (in Year 1, 15 proprietary – Tribal – and 15 open).

Niall: The code of practice is now a document of just more than two pages around complex legal and ethical issues. These can be blockers… so this is an attempt to have an overview document to help institutions overcome those issues. We have a number of institutions who will be trialling this. It’s at draft stage right now, with an advisory group to suggest revisions. It is likely to be launched by Jisc in June. Any additional issues are being reflected in a related set of online guidance documents.

The Effective Learning Analytics project can be found at: http://www.jisc.ac.uk/rd/projects/

The next network meeting is on 24th June at Nottingham Trent University. At that meeting we are looking to fund some small research-type projects – there is an Ideascale page for that, with about five ideas in the mix at the moment. Do add ideas (between now and Christmas) and do vote on them. There will be pitches there for the ones to take forward. And if you want funding to go to you as a sole trader rather than to a large institution, that can also happen.

Q&A

Q) Will the open solution be shared on something like GitHub so that people can join in?

A) Yes.

Comment – Michael: Earlier today people talked about data that is already available; that’s in the discovery phase, when people will be on site for a day or up to a week in some cases. Also earlier on there was talk about data tracking, IP addresses etc., and the student consent system we have included is to get student buy-in for that process, so that you are legally covered for what you do as well. And there is a lot of focus on flagging issues, and intervention. The intervention tool is a really important part of this process, as you’ll have seen from our diagram.

For more information on the project see: http://analytics.jiscinvolve.org/wp/

Open Forum – input from participants, 15 min lightning talks.

Assessment and Learning Analytics – Prof Blazenka Divjak, University of Zagreb (currently visiting University of Edinburgh)

I have a background in working with a student body of 80,000 students, and in the use of learning analytics. The main challenge I have found has been the management and cleansing of data. If you want to make decisions, the data behind learning analytics are not always suitable or in an appropriate state for this sort of use.

But I wanted to talk today about assessment. What underpins effective teaching? Well, this relates to the subject, the teaching methods, the way in which students develop and learn (Calderhead, 1996), and awareness of the relationship between teaching and learning. Assessment is part of understanding that.

So I will talk to two case studies across courses using the same blended approach with open source tools (Moodle and connected tools).

One of these examples is Discrete Math with Graph Theory, a course on the Master of Informatics programme with around 120 students and 3 teachers. This uses authentic problem posing and problem solving. We have assessment criteria and weighted rubrics (using the AHP method). Here learning analytics are used for identification of performance based on criteria. We also look at differences between groups (gender, previous study, etc.), and at the correlation of authentic problem solving with other elements of assessment – hugely important for future professional careers but not always what happens.
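
For readers unfamiliar with the AHP (Analytic Hierarchy Process) method mentioned above: rubric weights can be derived from pairwise comparisons of criteria, taking the principal eigenvector of the comparison matrix. A minimal sketch with invented criteria and judgements (not the course's actual rubric):

```python
# Minimal AHP sketch: criteria are compared pairwise on Saaty's 1-9
# scale; weights come from the principal eigenvector of the matrix.
# The criteria and judgements below are illustrative only.
import numpy as np

criteria = ["problem posing", "problem solving", "presentation"]
# pairwise[i][j] = how much more important criterion i is than j
pairwise = np.array([
    [1,   1/2, 3],
    [2,   1,   4],
    [1/3, 1/4, 1],
])

eigvals, eigvecs = np.linalg.eig(pairwise)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()  # normalise to sum to 1

for criterion, weight in zip(criteria, weights):
    print(f"{criterion}: {weight:.2f}")
```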

The other programme, Project Management for the Master of Entrepreneurship programme, has 60 students and 4 teachers. In this case project teams work on authentic tasks, with assessment criteria and weighted rubrics plus integrated feedback. The course uses self-assessment, peer-assessment, and teacher assessment. Here learning analytics are being used to assess the consistency, validity, and reliability of peer-assessment. Metrics here can perhaps include the geometry of learning analytics.
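
One simple way to check peer-assessment reliability of the kind described – sketched here with made-up marks, not the programme's data – is to correlate each student's mean peer mark with the teacher's mark:

```python
# Minimal sketch: peer-teacher agreement as a reliability check.
# All marks below are invented for illustration.
import numpy as np

peer_marks = np.array([
    [72, 68, 75],   # marks three peers gave student A
    [55, 60, 58],   # ... student B
    [80, 77, 83],   # ... student C
    [65, 62, 70],   # ... student D
])
teacher_marks = np.array([70, 57, 81, 66])

mean_peer = peer_marks.mean(axis=1)
r = np.corrcoef(mean_peer, teacher_marks)[0, 1]
print(f"peer-teacher correlation: r = {r:.2f}")
```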

Looking at a graphic analysis of one of these courses shows how students are performing against criteria – for instance they are better at solving problems than posing problems. Students can also benchmark themselves against the group, and compare how they are doing.

The impact of student dashboards – Paula Smith, UoE

I’m going to talk to you about an online surgery course – the theory not the practical side of surgery I might add. The MSc in Surgical Sciences has been running since 2007 and is the largest of the medical distance learning programmes.

The concept of learning analytics may be relatively new, but we have long been interested in student engagement and participation, and how that can be tracked and acknowledged, as it is part of what motivates students to engage. So I am going to talk about how we use learning analytics to make interventions, but also about action analytics – making changes as well as interventions.

Before the project I will talk about, students were tracked via an MCQ system – students would see a progress bar but staff could see more detail. At the end of every year we would gather that data and present a comparative picture, so that students could see how they were performing compared to peers.

Our programmes all use bespoke platforms, and that meant we could work with the developers to design measures of student engagement – for example, number of posts: a crude way to motivate students. That team also created activity patterns so we could understand the busier times – and it is a 24/7 programme; all of our students work full time in surgical teams, so this course is an add-on to that. We never felt a need to make this view available to students… it is a measure of activity, but how does that relate to learning? We need more tangible metrics.

So, in March last year I started a one-day-a-week secondment with Wilma Alexander and Mark Wetton at IS. That secondment had the objectives of creating a student "dashboard" which would allow students to monitor their progress in relation to peers; using the dashboard to identify at-risk students for early interventions; and then evaluating what (if any) impact that intervention had.

So, we did see a correlation between in-course assessment and examination marks. The exam is worth 75-80% (was 80%, now 75%) in the first year – a heavily weighted component. You can do well in the exam, and get a distinction, with no in-course work during the year. The in-course work is not compulsory, but we want students to see the advantage of in-course assessments. For the predictive modelling, regression analysis revealed that only two components had any bearing on end-of-year marks: discussion board ratings and exam performance (year 1), or exam performance alone (year 2). So, with that in mind, we moved away from predictive models and decided to build a dashboard for students presenting a snapshot of their progress against others'. And we wanted this to be simple to understand…
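
A sketch of the kind of regression analysis described, assuming a hypothetical CSV export of component marks (the file name and column names are invented):

```python
# Minimal sketch: which in-course components predict end-of-year marks?
# The CSV and its columns are hypothetical, for illustration only.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("year1_marks.csv")  # hypothetical export

X = df[["discussion_rating", "essay_mark", "mcq_mark"]]
y = df["end_of_year_mark"]

model = sm.OLS(y, sm.add_constant(X)).fit()
print(model.summary())  # inspect coefficients and p-values per component
```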

So, here it is… we are using Tableau to generate this. Here the individual student can see their own performance in yellow/orange and compare it to the wider group (blue). The average is used to give a marker… If the average is good (in this example an essay has an average mark of 65%) that's fine; if the average is poor (the discussion boards, which are low weighted, have an average of under 40% – a fail at MSc level) that may be more problematic. So that data is provided with caveats.
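
The underlying comparison is straightforward to reproduce outside Tableau; a minimal sketch with invented marks, mimicking the yellow/orange (student) versus blue (cohort) layout described:

```python
# Minimal sketch: one student's marks against the cohort average per
# assessment component. All values are invented for illustration.
import matplotlib.pyplot as plt
import numpy as np

components = ["Essay", "MCQs", "Discussion board", "Exam"]
student = [68, 72, 35, 70]
cohort_avg = [65, 70, 38, 66]

x = np.arange(len(components))
plt.bar(x - 0.2, student, width=0.4, label="You", color="orange")
plt.bar(x + 0.2, cohort_avg, width=0.4, label="Cohort average",
        color="steelblue")
plt.axhline(50, linestyle="--", color="grey", label="MSc pass mark")
plt.xticks(x, components)
plt.ylabel("Mark (%)")
plt.legend()
plt.show()
```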

Paula Smith shows visualisations created using Tableau

This interface has been released – although my intervention is just an email which points to the dashboard and comments on performance. We have started evaluating it: the majority think it is helpful (either somewhat, or a lot). But worryingly a few have commented "no, unhelpful", and we don't know the reasons for that. We have had positive comments on the whole, though. We also asked about extra material for one part of the course. And we asked students how the data makes them feel… although the majority answered 'interested', 'encouraged', and 'motivated', one commented that they were apathetic about it – and in fact we only had a 15% response rate for this survey, which suggests that apathy is widely felt.

Most students felt the dashboard provided feedback, which was useful. And the majority of students felt they would use the dashboard – mainly monthly or thereabouts.

I will be looking further at the data on student achievement and evaluating it over this summer; it should be written up at the end of the year. But I wanted to close with a quote from Li Yuan, at CETIS: "data, by itself, does not mean anything and it depends on human interpretation and intervention".

Learning Analytics – Daley Davis, Altis Consulting (London) 

We are a consulting company, well established in Australia, so I thought it would be relevant to talk about what we do there on learning analytics. Australia is ahead on learning analytics, which may well be because changes to fees and funding brought in in 2006 mean universities there view students differently. They are particularly focused on retention. I will talk about work we did with UNE (University of New England), a university with mainly online students and around 20,000 students in total. They wanted to reduce student attrition, so we worked with them to set up a student early alert system for identifying students at risk of disengaging. It used triggers of student interaction as predictors. This work cut attrition from 18% to 12%, saving time and money for the organisation.

The way this worked was that students had an automated "wellness" engine, with data aggregated at school and head-of-school levels. And what happened was that staff were ringing students every day – finding out about problems with internet connections, issues at home, etc. Some of these were easily fixed or understood.

The system picked up data from their student record system, their student portal, and a system called "e-motion" which asks students to indicate how they are feeling every day – four ratings and also a free text box (which we also mined).

Data was mined with weightings: a student having previously failed a course, or being very unhappy, were both weighted much more heavily, as was a student not engaging for 40 days or more (other trigger levels were weighted more lightly).
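
A minimal sketch of a weighted trigger model of this kind – the trigger names, weights, and threshold are illustrative, not UNE's actual values:

```python
# Minimal sketch: each trigger contributes to a risk score, with
# heavier weights for previous failure, strong negative mood, and
# long disengagement. All weights here are illustrative.

WEIGHTS = {
    "previously_failed_course": 30,
    "very_unhappy_rating": 25,
    "inactive_40_days": 25,
    "inactive_14_days": 10,
    "missed_assessment": 10,
}

def risk_score(student):
    """Sum the weights of all triggers present for this student."""
    return sum(w for trigger, w in WEIGHTS.items() if student.get(trigger))

student = {"previously_failed_course": True, "inactive_14_days": True}
print(risk_score(student))  # 40 – e.g. flag for a phone call above 50
```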

Daley Davis shows the weightings used in a Student Early Alert System at UNE

Universities are looking at what they already have and coming up with a technical roadmap. But you need to start with the questions you want to answer… What do your students want? What are your KPIs? And how can you measure those KPIs? So, if you are embarking on this process, I would start with a three-year plan toward your ideal situation, so that you can then make your one-year or shorter-term plans in the direction of making that happen…

Niall: What I want you to do just now is to discuss the burning issues… and come up with a top three…

And [after coffee and KitKats] we are back to share our burning issues from all groups…

Group 1:

  • Making sure we start with the questions first – don't start with the framework
  • Data protection and when you should seek consent
  • When to intervene – triage

Group 2:

  • How to decide which questions to ask – and which questions and data are important anyway?
  • Implementing analytics – institutional versus course level analytics? Both have strengths, both have risks/issues
  • And what metrics do you use, what are reliable…

Group 3:

  • Institutional readiness for making use of data
  • Staff readiness for making use of data
  • Making meaning from analytics… and how do we support and improve learning without always working on the basis of a deficit model.

Group 4:

  • Different issues for different cohorts – humanities versus medics in terms of aspirations and what they consider appropriate, e.g. for peer reviews; and undergrads/younger students versus, say, online distance postgrads already in their careers
  • Social media – ethics of using Facebook etc. in learning analytics, and issue of other discussions beyond institution
  • Can’t not interpret data just because there’s an issue you don’t want to deal with.

Group 5:

  • Using learning analytics at either end of the lifecycle
  • Ethics is a big problem – analytics might be used to recruit likely-successful people, or to stream students/incentivise them into certain courses (both already happening in the US)
  • Lack of sponsorship from senior management
  • Essex found through its last three student surveys that students do want analytics.

That issue of recruitment is a real ethical issue. It is something that arises at the Open University: they have an open access policy, so to deny entrance because of likely drop-out or likely performance would be an issue there… How did you resolve that?

Kevin, OU) We haven't exactly cracked it. We are mainly using learning analytics to channel students into the right path for them – which may be about helping them select the first courses to take, or whether to start with one of our open courses on FutureLearn, etc.

Niall: Most universities already have entrance qualifications… A-Level or Higher or whatever… ethically, how does that work?

Kevin, OU) I understand that a lot of learning analytics is being applied in UCAS processes… they can assess the markers of success etc.

Comment, Wilma) I think the thing about learning analytics is that predictive models can't ethically be applied to an individual…

Comment, Avril) But then there is also quite a lot of evidence that entry grades don’t necessarily predict performance.

Conclusions from the day and timetable for future events – Niall Sclater

Our next meeting will be in June in Nottingham and I hope we’ll see you then. We’ll have a speaker, via Skype, who works on learning analytics for Blackboard.

And with that, we are done with a really interesting day.