Jul 07 2016

On 27th June I attended a lunchtime seminar, hosted by the University of Edinburgh Centre for Research in Digital Education, with Professor Catherine Hasse of Aarhus University.

Catherine is opening with a still from Ex Machina (2015, dir. Alex Garland). The title of my talk is the difference between human and posthuman learning. I’ll talk for a while but I’ve moved a bit from my title… My studies in posthuman learning have moved me to more of a posthumanistic learning… Today human beings are capable of many things – we can transform ourselves, and ourselves in our environment. We have to think about that and discuss that, to take account of that in learning.

I come from the centre for Future Technology, Culture and Learning, Aarhus University, Denmark. We are hugely interdisciplinary as a team. We discuss and research what is learning under these new conditions, and to consider the implications for education. I’ll talk less about education today, more about the type of learning taking place and the ways we can address that.

My own background is in anthropology of education in Denmark, specifically looking at physicists. In 2015 we got a big grant to work on “The Technucation Project” and we looked at the anthropology of education in Denmark in nurses and teachers – and the types of technological literacy they require for their work. My work (in English) has been about “Mattering” – the learning changes that matter to you. The learning theories I am interested in acknowledge cultural differences in learning, something we have to take account of. What it is to be human is already transformed. Posthumanistic learning names the new conceptualisations and material conditions that change what it was to be human. It was, and is, ultra-human to be learners.

So… I have become interested in robots… They are coming into our lives. They are not just tools. Human beings encounter tools that they haven’t asked for. You will be aware of predictions that over a third of jobs in the US may be taken over by automated processes and robots in the next 20 years. That comes at the same time as there is pressure on the human body to become different, at the point at which our material conditions are changing very rapidly. A lot of theorists are picking up on this moment of change, and engaging with the idea of what it is to be human – including those in Science and Technology Studies, and feminist critique. Some anthropologists suggest that it is not geography but humans that should shape our conceptions of the world (Anthropos – Anthropocene); others differ and conceive of the Capitalocene. When we talk about the posthuman, a lot of the theories acknowledge that we can’t think of the human in the same way anymore. Kirksey & Helmreich (2010) talk of “natural-cultural hybrids”, and we see everything from heart valves to sensors, to iris scanning… We are seeing robots, cyborgs, amalgamations, including how our thinking feeds into systems – like the stock markets (especially today!). The human is de-centered in this amalgamation but is still there. And we may yet get to this creature from Ex Machina, the complex sentient robot/cyborg.

We see posthuman learning in the uncanny valley… gradually we will move from robots that feel far away, to those with human tissues, with something more human and blended. The new materialism and robotics together challenge the conception of the human. When we talk of learning we talk about how humans learn, not what follows when bodies are transformed by other (machine) bodies. And here we have to be aware that in feminism people like Rosi Braidotti(?) have been happy with the discarding of the human: for them it was always a narrative, it was never really there. The feminist critique is that the “human” was really Vitruvian man… But they also critique the idea that the posthuman is a continuation of individual goal-directed and rational self-enhancing (white male) humans. And that questions the posthuman…

There are actually two ways to think of the posthuman. One is posthuman learning as something that does away with useless, biological bodies (Kurzweil 2005) – and we see transhumanists like Vernor Vinge, Hans Moravec and Natasha Vita-More in this space that sees us heading towards the singularity. The alternative is a posthumanistic approach, which is about cultural transformations of boundaries in human-material assemblages, recognising that we have never been isolated human beings, we’ve always been part of our surroundings. That is another way to see the posthuman – the case made (Hayles 1999) that we have always been posthuman. We also have, on the other hand, a Spinozist approach, which asks how we, if we understand ourselves as de-centered, are able to see ourselves as agents. In other words we are not separate from the culture; we are not of nature, not of culture, but naturecultural (Hayles; Haraway).

But at the same time, if it is true that human beings can literally shape the crust of the earth, we are now witnessing anthropomorphism on steroids (Latour, 2011 – Waiting for Gaia [PDF]). The Anthropocene perspective is that, if human impact on Earth can be translated into human responsibility for the earth, the concept may help stimulate appropriate societal responses and/or invoke appropriate planetary stewardship (Head 2014); the Capitalocene (see Jason Moore) talks about moving away from Cartesian dualism in global environmental change – the alternative implies a shift from humanity and nature to humanity in nature; we have to counter capitalism in nature.

So from the human to the posthuman – I have argued that this is a way we can go with our theories… There are two ways to understand that: singularist posthumanism or Spinozist posthumanism. And I think we need to take a posthumanistic stance with learning – taking account of learning in technological naturecultures.

My own take here… We talk about intra-species differentiations. This nature is not nature as resource but rather nature as matrices – a nature that operates not only outside and inside our bodies (from global climate to the microbiome) but also through our bodies, including embodied minds. We do create intra-species differentiation, where learning changes what matters to you and others, and what matters changes learning. To create an ecologically responsible ultra-sociality we need to see ourselves as a species of normative learners in cultural organisations.

So, from my own experience: after studying physicists as an anthropologist I no longer saw the night sky the same way. Before, they were stars and star constellations. After that work I saw them as thousands of potential suns – and perhaps planets – and that wasn’t a wider discussion at that time.

I see it as a human thing to be learners. And we are ultra-social learners – that is a characteristic of being human. Collective learning is essentially what has made us culturally diverse. We have learning theories that are relevant for cultural diversity. We have to think of learning in a cultural way – mediational approaches in collective activity. Vygotsky takes the idea of learners as social learners before we become personal learners, and that is about mediation – not natureculture but cultureculture (Moll 2000). That’s my take on it. So, we can re-centre human beings… Humans are not the centre of the universe, or of the environment. But we can be at the centre and think about what we want to be, what we want to become.

I was thinking of coming in with a critique of MOOCs, particularly of those taking a Capitalocene position. But I think we need to think of social learning before we look at individual learning (Vygotsky 1981). And we are always materially based. So, how do we learn to be engaged collectively? What does it matter – for MOOCs for instance – if we each take part from very different environments and contexts, when that environment has a significant impact? We can talk about those environments and what impact they have.

You can buy robots now that can be programmed – essentially sex robots like “Roxxxy” – and are programmed by reactions to our actions, emotions etc. If we learn from those actions and emotions, we may relearn and be changed in our own actions and emotions. We are seeing a separation of tool-creation from user-demand in the Capitalocene. The introduction of robots in workplaces is often not replacing the work that workers actually want support with. The seal robots used to calm dementia patients down cover a role that many carers actually enjoyed in their work, the human contact and support. But those introducing them spoke of efficiency, the idea being to make employees superfluous, described as “simply an attempt to remove some of the most demeaning hard tasks from the work with old people so the work time can be used for care and attention” (Hasse 2013).

These alternative relations with machines are things we always react to; humans always stretch themselves to meet the challenge or engagement at hand. An inferentialist approach (Derry 2013) acknowledges many roads to knowledge, but the materiality of thinking reflects that we live in a world of not just cause but reason. We don’t live in just a representationalist (Bakker and Derry 2011) paradigm; it is much more complex. Material wealth will teach us new things… But maybe these machines will encourage us to think we should learn more in a representational than an inferentialist way. We have to challenge the robotic space of reasons. I would recommend Jan Derry’s work on Vygotsky in this area.

For me robot representationalism has the capacity to make convincing representations… You can give and take answers but you can’t argue in the space of reasons… They cannot reason from this representation. Representational content is not articulated by determinate negation and complex concept formation. Algorithmic learning has potential and limitations, and is based on representationalism, not concept formation. I think we have to take a position on posthumanistic learning: with collectivity as a normative space of reasons; acknowledging mattering matter in concept formation; acknowledging human inferentialism; acknowledging transformation in environment…

Discussion/Q&A

Q1) Can I ask about causes and reasons… My background is psychology and I could argue that we are more automated than we think we are, that reasons come later…

A1) Inferentialism is challenging the idea of giving and taking reasons as part of a normative space. It’s not anything goes… It’s sort of narrowing it down: humans come into being, in terms of learning and thinking, in a normative space that is already there. Wilfrid Sellars says there is no “bare given” – we are in a normative space, it’s not nature doing this… I have some problems with the term dialectical… But it is a kind of dialectic process. If you give and take reasons, it’s not anything goes. I think Jan Derry has a better phrasing for this. But that is the basic sense. And it comes for me from analytical philosophy – which I’m not a huge fan of – but they are asking important questions on what it is to be human, and what it is to learn.

Q2) Interesting to hear you talk about Jan Derry. She talks about technology perhaps obscuring some of the reasoning process and I was wondering how representational things fitted in?

A2) Not in the book I mentioned, but she has been working on this type of area at University of London. It is part of the idea of not needing to learn representational knowledge, which is built into technological systems – but for inferentialism we need really good teachers. She has examples about learning about the bible: she followed a school class who look at the bible, understand the 10 commandments, and are then asked to write their own 10 commandments on whatever topic… That’s a very narrow form of reasoning… It is engaging but it is limited.

Q3) An ethics issue… If we could devise robots or machines, AI, that could think inferentially, should we?

A3) A challenge for me – we don’t have enough technical people. My understanding is that it’s virtually impossible to do that. You have claims, but the capacities of AI systems so far are so limited in terms of function. I think that “theory of mind” is so problematic. These claims deteriorate what it means to be human, and narrow what it means to be our species. I think algorithmic learning is representational… I may be wrong though… If we can… There are political issues. Why make machines that are one-to-one with human beings? Maybe to be slaves, to do dirty work. But if they can think inferentially, should they not have ethical rights? In a Spinozist approach we have a responsibility to think about those ethical issues.

Q4) You use the word robot, and that term is being used for something very embodied and physical… But what about algorithmic agency, much less embodied and much less visible – you mentioned the stock market – and how that fits in?

A4) In a way robots are a novelty, a way to demonstrate that. A chatbot is also a robot; robot covers a lot of automated processes. One of the things that came out of AI at one point was that AI couldn’t learn without bodies… That for deep learning there needs to be some sort of bodily engagement, to make bodily mistakes. But then through encounters like Roxxxy and others they become very much better… As humans we stretch to engage with these robots… We take an answer for an answer, not just an algorithm, and that might change how we learn.

Q4) So the robot is a point of engaging for machine learning… A provocation.

A4) I think roboticists see this as being an easy way to make this happen. But everything happens so quickly… Chips in bodies etc. But you can also have robots moving in space, engaging with chips.

Q5) Is there something here about artificial life, rather than artificial intelligence – that the robot provokes that…

A5) That is what a lot of roboticists work at, trying to create artificial life… There is a lot of work we haven’t seen yet. They are working on learning algorithms in computer programming now that evolve with the process, a form of artificial life. They hope to create robots that, if they malfunction, can self-repair so that the next generation is better. What we asked at a conference in Prague recently, with roboticists, was “what do you mean by better?” – and they simply couldn’t answer that, which was really interesting… I do think they are working on artificial life as well. And maybe there is too little connection between those of us in education, and those that create these things.

Q6) I was approached by robotics folks about teaching robots to learn drawing with charcoal, largely because the robotic hand had enough sensitivity to do something quite complex – to teach charcoal drawing and representation… The teacher gesticulates, uses metaphor, describes things… I teach drawing and representational drawing… There is no right answer there, which is tough for robotics… What is the equivalent cyborg/dual space in learning? Drawing tools are cyborg-esque in terms of digital and drawing tools… But also that idea of culture… You can manipulate tools, awareness of function and then the hack, and the complexity of that hack… I suppose lots of things were ringing true but I couldn’t quite stick them in to what I’m trying to get at…

A6) Some of this is maybe tied to Schuman Enhancement Theory – the idea of a perfect cyborg drawing?

Q6) No, they were interested in improving computer learning, and language, but for me… The idea of human creativity and hacking… You could pack a robot with the history of art, and representation, so much information… Could do a lot… But is that better art? Or better design? A conversation we have to have!

A6) I tend to look at the dark side of the coin in a way… Not because I am a techno-determinist… I do love gadgets, technology enhances our life, we can be playful… But in the Capitalocene… There is much more focus on this. The creative side of technology is what many people are working on… Fantastic things are coming up, crossovers in art… New things can be created… What I see in nursing and teaching learning contexts is how people avoid engaging… So lifting robots are here, but nursing staff aren’t trained properly and they avoid them… Creativity goes many ways… I’m seeing this from quite a particular position, and that is partly a position of warning. These technologies may be creative, and they may then make us less and less creative… That’s a question we have to ask. Physicists, who have to be creative, are always so tied to the materiality, the machines and technologies in their working environments. I’ve also seen some of these drawing programmes… It is amazing what you can draw with these tools… But you need purpose, awareness of what those changes mean… Tools are never innocent. We have to analyse what tools are doing to us.

Jun 15 2016

Today I’m at the University of Edinburgh Principal’s Teaching Award Scheme Forum 2016: Rethinking Learning and Teaching Together, an event that brings together teaching staff, learning technologists and education researchers to share experience and be inspired to try new things and to embed best practice in their teaching activities.

I’m here partly as my colleague Louise Connelly (Vet School, formerly of IAD) will be presenting our PTAS-funded Managing Your Digital Footprint project this afternoon. We’ll be reporting back on the research, on the campaign, and on upcoming Digital Footprint work including our forthcoming Digital Footprint MOOC (more information to follow) and our recently funded (again by PTAS) project: “A Live Pulse: YikYak for Understanding Teaching, Learning and Assessment at Edinburgh”.

As usual, this is a liveblog so corrections, comments, etc. welcome. 

Velda McCune, Deputy Director of the IAD who heads up the learning and teaching team, is introducing today:

Welcome, it’s great to see you all here today. Many of you will already know about the Principal’s Teaching Award Scheme. We have funding of around £100k from the Development fund every year, since 2007, in order to look at teaching and learning – changing behaviours, understanding how students learn, investigating new education tools and technologies. We are very lucky to have this funding available. We have had over 300 members of staff involved and, increasingly, we have students as partners in PTAS projects. If you haven’t already put a bid in we have rounds coming up in September and March. And we try to encourage people, and will give you feedback and support and you can resubmit after that too. We also have small PTAS grants as well for those who haven’t applied before and want to try it out.

I am very excited to welcome our opening keynote, Paul Ashwin of Lancaster University, to kick off what I think will be a really interesting day!

Why would going to university change anyone? The challenges of capturing the transformative power of undergraduate degrees in comparisons of quality  – Professor Paul Ashwin

What I’m going to talk about is this idea of undergraduate degrees being transformative, and how, as we move towards greater analytics, we might measure that. And whilst metrics are flawed, we can’t just ignore these. This presentation is heavily informed by Lee Shulman’s work on Pedagogical Content Knowledge, which always sees teaching in context, and in the context of particular students and settings.

People often talk about the transformative nature of what their students experience. David Watson was, for a long time, the President of the Society for Research into Higher Education (?) and in his presidential lectures he would talk about the need to be as hard on ourselves as we would be on others, on policy makers, on decision makers… He said that if we are talking about education as transformational, we have to ask ourselves how and why this transformation takes place; whether it is a planned transformation; whether higher education is a necessary and/or sufficient condition for such transformations; whether all forms of higher education result in this transformation. We all think of transformation as important… But I haven’t really evidenced that view…

The Yerevan Communique (May 2015) talks about wanting to achieve, by 2020, a European Higher Education Area where there are common goals, where there is automatic recognition of qualifications, and where students and graduates can move easily through – what I would characterise as where Bologna begins. The Communique talks about higher education contributing effectively to build inclusive societies, founded on democratic values and human rights, where educational opportunities are part of European Citizenship. And it ends in a statement that should be a “wow!” moment, valuing teaching and learning. But for me there is a tension: the comparability of undergraduate degrees is in conflict with the idea of the transformational potential of undergraduate degrees…

Now, critique is too easy, we have to suggest alternative ways to approach these things. We need to suggest alternatives, to explain the importance of transformation – if that’s what we value – and I’ll be talking a bit about what I think is important.

Working with colleagues at Bath and Nottingham I have been working on a project, the Pedagogic Quality and Inequality Project, looking at sociology students and the idea of transformation at 2 top-ranked (for sociology) and 2 bottom-ranked (for sociology) universities, gathering data and information on the students’ experience and change. We found that league tables told you nothing about the actual quality of experience. We found that the transformational nature of undergraduate degrees lies in changes in students’ sense of self through their engagement with disciplinary knowledge – students relating their personal projects to their disciplines and the world, and seeing themselves implicated in knowledge. But it doesn’t always happen – it requires students to be intellectually engaged with their courses to be transformed by them.

To quote a student: “There is no destination with this discipline… There is always something further and there is no point where you can stop and say “I understood, I am a sociologist”… The thing is sociology makes you aware of every decision you make: how that would impact on my life and everything else…” And we found the students all reflecting that this idea of transformation was complex – there were gains but also losses. Now you could say that this is just the nature of sociology…

We looked at a range of disciplines, studies of them, and also how we would define transformation in several ways: the least inclusive account; the “watershed” account – the institutional type of view; and the most inclusive account. Mathematics has the richest studies in this area (Wood et al 2012), where the least inclusive account is “numbers”, watershed is “models”, and the most inclusive is “approach to life”. Similarly accountancy moves from routine work to moral work; law from content to extension of self; music from instrument to communicating; geography from general world to interactions; geoscience from composition of the earth to relations between earth and society. Clearly these are not all the same direction, but they are accents and flavours of the same thing. We are going to do a comparison next year on chemistry and chemical engineering, in the UK and South Africa, and actually this work points at what is particular to Higher Education being about engaging with a system of knowledge. Now, my colleague Monica McLean would ask why that’s limited to Higher Education – couldn’t it apply to all education? And that’s valid, but I’m going to ignore it just for now!

Another student commented on transformations of all types – for example from wearing a tracksuit to lectures, to no longer presenting themselves this way. Now that has nothing to do with the curriculum; this is about other areas of life. This student almost dropped out, but the Afro-Caribbean society supported and enabled her to continue and progress through her degree. I have worked in HE and FE and the way students talk about that transformation is pretty similar.

So, why would going to university change anyone? It’s about exposure to a system of knowledge changing your view of self, and of the world. Many years ago an academic asked what the point of going to university was, given that much of the information students learn will be out of date. And the counter-argument there is that it’s about engagement with seeing different perspectives – to see the world as a sociologist, to see the world as a geographer, etc.

So, to come back to this tension around the comparability of undergraduate degrees, and the transformational potential of undergraduate degrees. If we are about transformation, how do we measure it? What are the metrics for this? I’m not suggesting those will particularly be helpful… But we can’t leave metrics to what is easy to gather, we have to also look at what is important.

So if we think of the first area, comparability, we tend to use rankings. National and international higher education rankings are a dominant way of comparing institutions’ contributions to student success. All universities have a set of figures that present them well. They have huge power as they travel across a number of contexts and audiences – vice chancellors, students, departmental staff. A ranking moves context; it’s portable and durable. It’s nonsense, but the strength of these metrics is hard to combat. They tend to involve unrelated and incomparable measures. Their stability reinforces privilege – higher status institutions tend to enrol a much greater proportion of privileged students. You can have some unexpected outcomes, but you have to have Oxford, Cambridge, Edinburgh, UCL, Imperial all near the top or your league table is rubbish… Because we already know they are the good universities… Or at least those rankings reinforce the privilege that already exists, the expectations that are set. They tell us nothing about transformation of students. But are skillful performances shaped by generic skills, or by students’ understanding of a particular task and their interactions with other people and things?

Now the OECD has put together a ranking concept on graduate outcomes, AHELO, which uses tests for e.g. physics and engineering – not surprising choices as they have quite international consistency, they are measurable. And then they look at generic tests – e.g. a deformed fish is found in a lake; using various press releases and science reports, write a memo for policy makers. Is that generic? In what way? Students doing these tests are volunteers, which may not be at all representative. Are the skills generic? Education is about applying a way of thinking in an unstructured space, in a space without context. Now, the students are given context in these tests, so it’s not a generic test. But we must be careful about what we measure, as what we measure can become an index of quality or success, whether or not that is actually what we’d want to mark up as success. We have strategic students who want to know what counts… And that’s ok as long as the assessment is appropriately designed and set up… The same is true of measures of success and metrics of quality in teaching and learning. That is why I am concerned by AHELO, but it keeps coming back again…

Now, I have no issue with the legitimate need for comparison, but I also have a need to understand what comparisons represent, how they distort. Are there ways to take account of students’ transformation in higher education?

I’ve been working, with Rachel Sweetman at University of Oslo, on some key characteristics of valid metrics of teaching quality. For us reliability is much much more important than availability. So, we need ways to assess teaching quality that:

  • are measures of the quality of teaching offered by institutions rather than measures of institutional prestige (e.g. entry grades)
  • require improvements in teaching practices in order to improve performance on the measures
  • as a whole form a coherent set of metrics rather than a set of disparate measures
  • are based on established research evidence about high quality teaching and learning in higher education
  • reflect the purposes of higher education.

We have to be very aware of Goodhart’s law: we must be wary of any measure that becomes a performance indicator.

I am not someone with a big issue with the National Student Survey – it is grounded in the right things – but the issue is that it is run each year, and the data is used in unhelpful, distorted ways: rather than acknowledging and working on feedback, it is distorting. Universities feel the need to label engagement as “feedback moments” as they assume a less good score means students just don’t understand when they have had a feedback moment.

Now, in England we have the prospect of the Teaching Excellence Framework English White Paper and Technical Consultation. I don’t think it’s that bad as a prospect. It will include students’ views of teaching, assessment and academic support from the National Student Survey, non-completion rates, measures over three years etc. It’s not bad. Some of these measures are about quality, and there is some coherence. But this work is not based on established research evidence… There was great work here at Edinburgh on students’ learning experiences in UK HE; none of that work is reflected in TEF. If you were being cynical you could think they have looked at available evidence and just selected the more robust metrics.

My big issue with Year 2 TEF metrics is how and why these metrics have been selected. You need a proper consultation on measures, rather than using the White Paper and Technical Consultation to do that. The Office for National Statistics looked at the measures and found them robust, but noted that the differences between institutions’ scores on the selected metrics tend to be small and not significant – not robust enough to inform future work, according to the ONS. It seems likely that peer review will end up being how we differentiate between institutions.

And there are real issues with TEF Future Metrics… This comes from a place of technical optimism – that if you just had the right measures you’d know… This ties learner information to tax records for the “Longitudinal Education Outcomes data set”, and adds “teaching intensity”. Teaching intensity is essentially contact hours… that’s game-able… And how on earth is that about transformation? It’s not a useful measure of that. Unused office hours aren’t useful, optional seminars aren’t useful… Keith Trigwell told me about a lecturer he knew who lectured a subject; each week fewer and fewer students came along. The last three lectures had no students there… He still gave them… That’s contact hours that count on paper but aren’t useful. That sort of measure seems to come more from ministerial dinner parties than from evidence.

But there are things that do matter… There is no mechanism outlined for a sector-wide discussion of the development of future metrics. What about expert teaching? What about students’ relations to knowledge? What about the first year experience – we know that is crucial for student outcomes? Now these measures may not be easy, but they matter. And then we see the Learning Gains project, which decided to work generically – but that also means you don’t understand students’ particular engagement with knowledge. In generic tests the description of what you can do ends up more important than what you actually do. You are asking for claims about what students can do, rather than having them perform those things. You can see why it is attractive, but it’s meaningless; it’s not a good measure of what Higher Education can do.

So, to finish, I’ve tried to put teaching at the centre of what we do. Teaching is a local achievement – it always shifts according to who the students are, what the setting is, and what the knowledge is. But that also always makes it hard to capture and measure. So what you probably need is a lot of different imperfect measures that can be compared and understood as a whole. However, if we don’t try, we allow distorting measures, which reinforce inequalities, to dominate. Sometimes the only thing worse than not being listened to by policy makers is being listened to. That’s when we see a Frankenstein’s Monster emerge, and that’s why we need to recognise the issues, to ensure we are part of the debate. If we don’t try to develop alternative measures we leave it open to others to define.

Q&A

Q1) I thought that was really interesting. In your discussion of transformation of undergraduate students I was wondering how that relates to less traditional students, particularly mature students, even those who’ve taken a year out, where those transitions into adulthood are going to be in a different place and perhaps where critical thinking etc. skills may be more developed/different.

A1) One of the studies I talked about was at London Metropolitan University, which has a large percentage of mature students… And actually there the interactions with knowledge really did prove transformative… Often students lived at home with family, whether young or mature students. That transformation was very high. And it was unrelated to achievements. So some came in who had quite profound challenges and they had transformation there. But you have to be really careful about not suggesting different measures for different students… That's dangerous… But that transformation was there. There is lots of research that's out there… But how do we transform that into something that has purchase… recognising there will be flaws and compromises, but ensuring that voice in the debate. That it isn't politicians owning that debate, that transformations of students and the real meaning of education are part of that.

Q2) I found the idea of transformation that you started with really interesting. I work in African studies and we work a lot on decolonial issues, and of the need to transform academia to be more representative. And I was concerned about the idea of transformation as a decolonial type issue, of being like us, of dressing like that… As much as we want to challenge students we also need to take on and be aware of the biases inherent in our own ways of doing things as British or Global academics.

A2) I think that's a really important question. My position is that students come into Higher Education for something. Students in South Africa – and I have several projects there – who have nowhere to live, have very little, who come into Higher Education to gain powerful knowledge. If we don't have access to a body of knowledge, that we can help students gain access to and to gain further knowledge, then why are we there? Why would students waste time talking to me if I don't have knowledge? The world exceeds our ability to know it, we have to simplify the world. What we offer undergraduates is powerful simplifications, to enable them to do things. That's why they come to us and why they see value. They bring their own biographies, contexts, settings. The project I talked about is based in the work of Basil Bernstein who argues that the knowledge we produce in primary research… But when we design curriculum it isn't that – we engage with colleagues, with peers, with industry… It is transformed, changed… And students also transform that knowledge, they relate it to their situation, to their own work. But we are only a valid part of that process if we have something to offer. And for us I would argue it's the access to a body of knowledge. I think if we only offer process, we are empty.

Q3) You talked about learning analytics, and the issues of AHELO, and the idea of if you see the analytics, you understand it all… And that concept not being true. But I would argue that when we look at teaching quality, and a focus on content and content giving, that positions us as gatekeepers and that is problematic.

A3) I don’t see knowledge as content. It is about ways of thinking… But it always has an object. One of the issues with the debate on teaching and learning in higher education is the loss of the idea of content and context. You don’t foreground the content, but you have to remember it is there, it is the vehicle through which students gain access to powerful ways of thinking.

Q4) I really enjoyed that and I think you may have answered my question.. But coming back to metrics you’ve very much stayed in the discipline-based silos and I just wondered how we can support students to move beyond those silos, how we measure that, and how to make that work.

A4) I'm more course than discipline focused. With the first year of TEF the idea of assessing quality across a whole institution is very problematic; it's programme level we need to look at. Inter-professional, interdisciplinary work is key… But one of the issues here is that it can be implied that that gives you more… I would argue that that gives you differently… It's another new way of seeing things. But I am nervous of institutions, funders etc. who want to see interdisciplinary work as key. Sometimes it is the right approach, but it depends on the problem at hand. All approaches are limited and flawed, we need to find the one that works for a given context. So, I sort of agree but worry about the evangelical position that can be taken on interdisciplinary work which is often actually multidisciplinary in nature – working with others, not genuinely working in an interdisciplinary way.

Q5) I think to date we focus on objective academic ideas of what is needed, without asking students what they need. You have also focused on the undergraduate sector, but how applicable to the post graduate sector?

A5) I would entirely agree with your comment. That's why pedagogic content matters so much. You have to understand your students first, as well as then also understanding this body of knowledge. It isn't about being student-centered but understanding students and context and that body of knowledge. In terms of your question I think there is a lot of applicability for PGT. For PhD students things are very different – you don't have a body of knowledge to share in the same way, that is much more about process. Our department is PhD-only and there, process is central. That process is quite different at that level… It's about contributing in an original way to that body of knowledge as its core purpose. That doesn't mean students at other levels can't contribute, it just isn't the core purpose in the same way.

Parallel Sessions from PTAS projects: Social Media – Enhancing Teaching & Building Community? – Sara Dorman, Gareth James, Luke March

Gareth: It was mentioned earlier that there is a difference between the smaller and larger projects funded under this scheme – and this was one of the smaller projects. Our project was looking at whether we could use social media to enhance teaching and community, not just in our programmes but in wider areas. And we particularly wanted to look at the use of Twitter and Facebook, to engage students in course material but also to strengthen relationships. So we decided to compare the use of Facebook by Luke March in Russian Politics courses, with the use of Twitter and Facebook in African Politics courses that Sara and I run.

So, why were we interested in this project? Social media is becoming a normal area of life for students, in academic practice and increasingly in teaching (Blair 2013; Graham 2014). Twitter increasingly used, Facebook well established. It isn’t clear what the lasting impact of social media would be but Twitter especially is heavily used by politicians, celebrities, by influential people in our fields. 2014 data shows 90% of 18-24 year olds regularly using social media. For lecturers social media can be an easy way to share a link as Twitter is a normal part of academic practice (e.g. the @EdinburghPIR channel is well used), keeping staff and students informed of events, discussion points, etc. Students have also expressed interest in more community, more engagement with the subject area. The NSS also shows some overall student dissatisfaction, particularly within politics. So social media may be a way to build community, but also to engage with the wider subject. And students have expressed preference for social media – such as Facebook groups – compared to formal spaces like Blackboard Learn discussion boards. So, for instance, we have a hashtag #APTD – the name of one of our courses – which staff and students can use to share and explore content, including (when you search through) articles, documents etc. shared since 2013.

So, what questions did we ask? Well we wanted to know:

  • Does social media facilitate student learning and enhance the learning experience?
  • Does social media enable students to stay informed?
  • Does it facilitate participation in debates?
  • Do they feel more included and valued as part of the subject area?
  • Is social media complementary to VLEs like Learn?
  • Which medium works best?
  • And what disadvantages might there be around using these tools?

We collected data through a short questionnaire about awareness, usage, usefulness. We designed just a few questions that were part of student evaluation forms. Students had quite a lot to say on these different areas.

So, our findings… Students all said they were aware of these tools. There were slightly higher levels of awareness among Facebook users, e.g. Russian Politics for both UG and PG students. Overall 80% said they were aware to some extent. When we looked at usage – meaning access of this space rather than necessarily meaningful engagement – we felt that usage of course materials on Twitter and Facebook does not equal engagement. Other studies have found students lurking more than posting/engaging directly. But, at least amongst our students (n=69), 70% used resources at least once. Daily usage was higher amongst Facebook users, i.e. Russian Politics. Twitter was more than twice as likely to never have been used.

We asked students how useful they found these spaces. Facebook was seen as more useful than Twitter. 60% found Facebook "very" or "somewhat useful". Only a third described Twitter as "somewhat useful" and none said "very useful". But there were clear differences between UG and PG students. UG students were generally more positive than PG students. They noted that it was useful and interesting to keep up with news and events, but not always easy to tie that back to the curriculum. Students used the word "interesting" a lot – for instance comparing historical to current events. More mixed responses included that there was plenty of material on Learn, so some didn't use Facebook or Twitter. Another commented that they wanted everything on Learn, in one place. One commented that they don't use Twitter, so don't want to follow the course there, and would prefer Facebook or Learn. Some commented that too many posts were shared – information overload. Students thought some articles were random, and couldn't tell what was good and what was not.

A lot of these issues were also raised in focus group discussions. Students do appreciate sharing resources and staying informed, but don't always see the connection to the course. They recognise the potential for debate and discussion but often it doesn't happen, and when it does they find it intimidating for that to be in a space with real academics and others; indeed they prefer discussion away from tutors and academics on the course too. Students found Facebook better for network building but also found the social vs academic distinction difficult. Learn was seen as academic and safe, but also too clunky to navigate and engage in discussions. Students were concerned others might feel excluded. Some also commented that not liking or commenting could be hurtful to some. One student commented "it was kind of more like the icing than the cake" – which I think really sums it up.

Students commented that there was too much noise to pick through. And "I didn't quite have the know-how to get something out of". "I felt a bit intimidated and wasn't sure if I should join in". Others commented on only using social media for social purposes – that it would be inappropriate to engage with academics there. Some saw Twitter as professional, Facebook as social.

So, some conclusions…

It seems that Facebook is more popular with students than Twitter, seen as better for building community. There were some differences between UG and PG students, with UG more interested. Generally there was less enthusiasm than anticipated. Students were interested in and aware of the benefits of joining in discussions but also wary of commenting too much in "public". This suggests that we need to "build community" in order for the "community building" tools to really work.

There is also an issue of lack of integration between Facebook, Twitter and Learn. Many of our findings reflect others', for instance Matt Graham in Dundee – who saw potential for HE humanities students. Facebook was more popular with his students than Twitter. He looked more at engagement and saw some students engaging more deeply with the wider African knowledge. But one outcome was that student engagement did not occur or sustain without some structure: particular tasks and small nudges, connection to Learning Outcomes, flagging clear benefits at the beginning, and students taking a lead in creating groups – which came out of our work too.

There are challenges here: inappropriate use, friending between staff and students for instance. Alastair Blair notes in an article that the utility of Twitter, despite the challenge, cannot be ignored. For academics thinking about impact it is important, but also for students it is important for alignment with wider subject area that moves beyond the classroom.

Our findings suggest that there is no need to rush into social media. But at the same time Sara and I still see benefits for areas like African Studies, which is fast moving and poorly covered in the mainstream media. But the idea of students wanting to be engaged in the real world was clearly not carried through. Maybe more support and encouragement is needed for students – and maybe for staff too. And it would be quite interesting to see if and how students' experiences of different politics and events – #indyref, #euref, etc. – differ. Colleagues are considering using social media in a course on the US presidential election, which might work out differently as students may be more confident discussing these. The department has also moved forward with more presence for staff and students, and also alumni.

Closing words from Matt Graham that encouraging students to question and engage more broadly with their subject is a key skill.

Q&A

Q1) What sort of support was in place, or guidelines, around that personal/academic identity thing?

A1) Actually none. We didn’t really realise this would happen. We know students don’t always engage in Learn. We didn’t really fully appreciate how intimidating students really found this. I don’t think we felt the need to give guidelines…

A1 – SD) We kind of had those channels before the course… It was organic rather than pedagogic…

Q1) We spoke to students who wanted more guidance especially for use in teaching and learning.

A1 – SD) We did put Twitter on the Learn page… to follow up… Maybe as academics we are the worst people to understand what students would do… We thought they would engage…

Q1) Will you develop guidelines for other courses…

A1) And a clearer explanation might encourage students to engage a bit more… Could be utility in doing some of that. University/institution wise there is cautious adoption and you see guidance issued for staff on using these things… But wouldn’t want overbearing guidance there.

Q1) We have some guidance under CC licence that you can use, available from Digital Footprints space.

Q2) Could you have a safer filtered space for students to engage. We do writing courses with international PG students and thought that might be useful to have social media available there… But maybe it will confuse them.

A2) There was a preference for a closed "safer" environment, talking only to students in their own cohort and class. I think Facebook is more suited to that sort of thing; Twitter is an open space. You can create a private Facebook group… One problem with Russian Politics was that they have a closed group… But it had previous cohorts and friends of staff in it…

A2 – SD) We were trying to include students in real academia… There are real tensions there over purpose and what students get out of it… The sense of not knowing… Some students might have security concerns but I think it was insecurity in academic knowledge. They didn't see themselves as co-producers. That needs addressing…

A2) Students being reluctant to engage isn't new, but we thought we might have more engagement in social media. Now this was the negative side, but actually there were positive things here – that wider awareness, even if one directional.

Q3) I just wanted to ask more about the confidence to participate and those comments that suggested that was a bigger issue – not just in social media – for these students, and similarly information seeking behaviour.

A3) There is work taking place in SPS around study skills, approaching your studies. There might be some room to introduce this stuff earlier on in school-wide or subject-wide courses… Especially if we are to use these tools. I completely agree that by the end of these studies you should have these skills – how to write properly, how to look for information… The other thing that comes to mind having heard our keynote this morning is the issue of transformative process. It's good to have high expectations of UG students, and they seem to rise to the occasion… But I think that we maybe need to understand the difference between UG and PG students… And in PG years they take that further more fully.

A3 – SD) UG are really big courses – which may be part of the issue. In PG they are much smaller… Some students are from Africa and may know more, some come in knowing very little… That may also play in…

Q4) On the UG/PG thing these spaces move quickly! Which tools you use will change quickly. And actually the type of thing you post really matters – sharing a news article is great, but how you discuss and create follow up afterwards – did you see that, the follow up, the creation, the response…

A4 – SD) Students did sometimes interact… But the people who would have done that with email/Learn were the same that used social media in that way.

A4) Facebook and Twitter are still new technologies… So perhaps students will become increasingly engaged and informed and up for engaging in these spaces. I'm still getting to grips with the etiquette of Twitter. There was more discussion on Facebook Groups than on Twitter… But it can also be very surface level learning… It complements what we are doing but there are challenges to overcome… And we have to think about whether that is worthwhile. Some real positives and real challenges.

Parallel Sessions from PTAS projects: Managing Your Digital Footprint (Research Strand) – Dr Louise Connelly 

This was one of the larger PTAS-funded projects. It is the "Research Strand" because it ran in parallel to the campaign, which was separately funded.

There is so much I could cover in this presentation so I’ve picked out some areas I think will be practical and applicable to your research. I’m going to start by explaining what we mean by “Digital Footprint” and then talk more about our approach and the impact of the work. Throughout the project and campaign we asked students for quotes and comments that we could share as part of the campaign – you’ll see these throughout the presentation but you can also use these yourself as they are all CC-BY.

The project wouldn’t have been possible without an amazing research team. I was PI for this project – based at IAD but I’m now at the Vet School. We also had Nicola Osborne (EDINA), Professor Sian Bayne (School of Education). We also had two research students – Phil Sheail in Semester 1 and Clare Sowton in Semester 2. But we also had a huge range of people across the Colleges and support services who were involved in the project.

So, I thought I’d show you a short video we made to introduce the project:

YouTube Preview Image

The idea of the video was to explain what we meant by a digital footprint. We clearly defined it because what we wanted to emphasise to students and staff – though students were the focus – was that your footprint is not just what you do but also what other people post about you, or leave behind about you. That can be quite scary to some so we wanted to address how you can have some control over that.

We ran a campaign with lots of resources and materials. You can find loads of materials on the website. That campaign is now a service based in the Institute for Academic Development. But I will be focusing on the research in this presentation. This all fitted together in a strategy. The campaign was to raise awareness and provide practical guidance; the research sought to gain an in-depth understanding of students' usage and produce resources for schools, and then to feed into learning and teaching on an ongoing basis. Key to the research was a survey we ran during the campaign, which was analysed by the research team.

In terms of the gap and scope of the campaign I'd like to take you back to the Number 8 bus… It was an idea that came out of myself and Nicola – and others – being asked regularly for advice and support. There was a real need here, but also a real digital skills gap. We also saw staff wanting to embed social media in the curriculum and needing support. The brainwave was that social media wasn't the campaign that was needed; it was about digital footprint and the wider issues. We also wanted to connect to current research. boyd (2014), who works on networked teens, talks of the benefits as well as the risks… as it is unclear how students are engaging with social/digital media and how they are curating their online profiles. We also wanted to look at the idea of eprofessionalism (Chester et al 2013), particularly in courses where students are treated as paraprofessionals – a student nurse, for instance, could be struck off before graduating because of social media behaviours, so there is a very real need to support and raise awareness amongst students.

Our overall research aim was to: work with students across current delivery modes (UG, PGT, ODL, PhD) in order to better understand how they 

In terms of our research objectives we wanted to: conduct research which generates a rich understanding; to develop a workshop template – and we ran 35 workshops for over 1000 students in that one year; to critically analyse social media guidelines – it was quite interesting that a lot of it was about why students shouldn't engage, with little on the benefits; to work in partnership with EUSA – important to engage around e.g. campaign days; to contribute to the wider research agenda; and to effectively disseminate project findings – we engaged with support services, e.g. we worked with Careers about their LinkedIn workshops, which weren't well attended despite students wanting help with professional presence; just rebranding the sessions was valuable. We asked students where they would seek support – many said the Advice Place rather than e.g. IS, so we spoke to them. We spoke to the Counselling service too about cyberbullying, revenge porn, sexting etc.

So we ran two surveys with a total of 1,457 responses. Nicola and I ran two lab-based focus groups. I interviewed 6 individuals over a range of interviews with ethnographic tracing. And we gathered documentary analysis of e.g. social media guidelines. We used mixed methods as we wanted this to be really robust.

Sian and Adam really informed our research methods but Nicola and I really led the publications around this work. We have had various publications and presentations, including presentations at the European Conference on Social Media and the Social Media for Higher Education Teaching and Learning conference. We are also working on a Twitter paper, and have other papers coming. Workshops with staff and students have happened and are ongoing, and the Digital Ambassador award (Careers and IS) includes Digital Footprint as a strand. We also created a lot of CC-BY resources – e.g. guidelines and images. Those are available for UoE colleagues, but also for the national and international community who have fed into and helped us develop those resources.

I’m going to focus on some of the findings…

The survey was on Bristol Online Survey. It was sent to around 1/3rd of all students, across all cohorts. The central surveys team did the ethics approval and issuing of surveys. Timing had to fit around other surveys – e.g. NSS etc. And we had relatively similar cohorts in both surveys; the second had more responses, but that was after the campaign had been running for a while.

So, two key messages from the surveys: (1) Ensure informed consent – crucial for students (also important for staff) – students need to understand the positive and negative implications of using these non-traditional, non-university social media spaces. In terms of what that means – well, guidance, some of the digital skills gap support etc. Also (2) Don't assume what students are using and how they are using it. Our data showed age differences in what was used, cohort differences (UG, PGT, ODL, PhD), lack of awareness e.g. of T&Cs, and benefits – some lovely anecdotal evidence, e.g. a UG informatics student approached by employers after sharing code on GitHub. Also the importance of not making assumptions around personal/educational/professional environments – this especially came out of interviews – and generally the implications of Digital Footprint. One student commented on being made to have a Twitter account for a course and not being happy about not having a choice in that (e.g. through embedding of tweets in Learn for instance).

Thinking about platforms…

Facebook is used by all cohorts but ODL less so (perhaps a geographic issue in part). Most were using it as a “personal space” and for study groups. Challenges included privacy management. Also issues of isolation if not all students were on Facebook.

Twitter is used mainly by PGT and PhD students, and most actively by 31-50 year olds. Lots of talk about how to use this effectively.

One of the surprises for us was that we thought most courses using social media would have guidelines in place for the use of social media in programme handbooks. But students reported them not being there, or not being aware of it. So we created example guidance which is on the website (CC-BY) and also an eprofessionalism guide (CC-BY) which you can also use in your own programme handbooks.

There were also tools we weren’t aware were in usage and that has led to a new YikYak research project which has just been funded by PTAS and will go ahead over the next year with Sian Bayne leading, myself, Nicola and Informatics. The ethnographic tracing and interviews gave us a much richer understanding of the survey data.

So, what next? We have been working with researchers in Ireland, Australia, New Zealand… EDINA has had some funding to develop an external facing consultancy service, providing training and support for NHS, schools, etc. We have the PTAS funded YikYak project. We have the Digital Footprint MOOC coming in August. The survey will be issued again in October. Lots going on, more to come!

We’ve done a lot and we’ve had loads of support and collaboration. We are really open to that collaboration and work in partnership. We will be continuing this project into the next year. I realise this is the tip of the iceberg but it should be food for thought.

Q&A 

Q1) We were interested in the staff capabilities

A1 – LC) We have run a lot of workshops for staff and research students, and done a series at the Vet School. There's a digital skills issue, and research, learning and teaching, and personal strands here.

A1 – NO) There were sessions and training for staff before… And much of the research into social media and digital footprint has been on very small cohorts in very specific areas.

Comment) I do sessions for academic staff in SPS, but I didn’t know about this project so I’ll certainly work that in.

A1 – LC) We did do a session for fourth year SPS students. I know business school are all over this as part of “Brand You”.

Q2) My background was in medicine, and when I was working in a hospital a scary colleague told junior doctors to delete their Facebook profiles! She was googling them. I saw an article in the Sun that badly misrepresented doctors – portraying them as living the "high life" because of something sunny in a photo.

A2 – LC) You need to be aware people may Google you… And be confident of your privacy and settings. And your professional body guidelines about what you have there. But there are grey areas there… We wanted to emphasise informed choice. You have the Right to be Forgotten law for instance. Many nursing students already knew restrictions but felt Facebook restrictions unfair… A recent article says there are 3.5 degrees of separation on Facebook – that can be risky… In teaching and learning this raises issues of who friends who, what you report… etc. The culture is we do use social media, and in many ways that’s positive.

A2 – NO) Medical bodies have very clear guidance… But just knowing that e.g. profile pictures are always public on Facebook, while you can control settings elsewhere… Knowing that means you can make informed decisions.

Q3) What is “Brand You”?

A3) Essentially it's about thinking of yourself as a brand, how your presences are used… And what is consistent – how you use your name, your profile images – and how you do that effectively. There is a book called "Brand You" which is about effective online presence.

Closing Keynote : Helen Walker, GreyBox Consulting and Bright Tribe Trust

I’m doing my Masters in Digital Education with University of Edinburgh, but my role is around edtech, and technology in schools, so I am going to share some of that work with you. So, to set the scene a wee video: Kids React to Technology: Old Computers:

YouTube Preview Image

Watching the kids try to turn on the machine it is clear that many of us are old enough to remember how to work late 1970s/early 1980s computers and their less than intuitive user experience.

So the gaps are maybe not that wide anymore… But there are still gaps. The gaps, for instance, between what students experience at home and what they can do at school – and that can be huge. There is also a real gap between EdTech promises and delivery – there are many practitioners who are energised about new technologies, and have high expectations. We also have to be aware of the reality of skills – and be very cautious of Prensky's (2001) idea of the "digital native" – and how intoxicating and inaccurate that can be.

There is also a real gap between industry and education. There is so much investment in technology, and promises of technology. Meanwhile we also see perspectives of some that computers do not benefit pupils. Worse, in September 2015 the OECD reported, and it was widely re-reported that computers do not improve pupil results, and may in fact disbenefit. That risks going back before technology, or technology being the icing on the cake… And then you read the report:

“Technology can amplify great teaching but great technology cannot replace poor teaching.”

Well of course. Technology has to be pedagogically justified. And that report also encourages students as co-creators. Now if you go to big education technology shows like BETT and SETT you see very big rich technology companies offering expensive technology solutions to quite poor schools.

That reflects the Education Endowment Fund Report (2012), which found that "it's the pedagogy, not technology", and that the technology is a catalyst for change. Glynis Cousins says that technology has to work dynamically with pedagogy.

Now, you have fabulous physical and digital resources here. There is the issue here of what schools have. Schools often have machines that are 9-10 years old, but students have much more sophisticated devices and equipment at home – even in poor homes. Their school experience of using old kit to type essays jars with that. And you do see schools trying to innovate with technology – iPads and such in particular… They bought them, they invested thousands… But they don't always use them because the boring, crucial wifi and infrastructure isn't there. It's boring and expensive but it's imperative. You need all that in order to use these shiny things…

And with that… Helen guides us to gogopp.com and the web app to ask us what a monkey with its hand in a jar, clutching a coin, has to do with change… We all respond… The adage is that if you wanted to catch a monkey you put an orange or some nuts in a jar; the monkey would grab them and not let go, so a hunter could simply capture it. I deal with a lot of monkeys… A lot of what I work towards is convincing them to let go of that coin, or nut, or orange, or Windows 7, to move on and change and learn.

Another question for us… What does a shot of baseball players in a field have to do with edtech? Well yes, "if you build it, they will come". A lot of people believe this is how you deal with edtech… Now, although a scheme funding technology for schools in England has come to an end, a lot of Free Schools still have this idea: that if you build something, magic will happen…

BTW this gogopp tool is a nice fun free tool – great for small groups…

So, I do a lot of "change management consultation" – it's not a great phrase but a lot of what it's about is pretty straightforward. Many schools don't know what they've got – we audit the kit, the software, the skills. We work on a strategy, then a plan, then a budget. And then we look at changes that make sense… Small-scale pathfinder projects, student-led work – with students in positions of responsibility. We have a lot of TeachMeet sessions – a forum of 45 minutes or so where staff who've worked on pathfinder projects get 2 to 5 minutes to share their experience – a way to drop golden nuggets into the day (much more effective than inset days!). And I do a lot of work with departmental heads to ensure software and hardware align with needs.

When there is the right strategy and the right pedagogical approach, brilliant things can happen. For instance…

Abdul Chohan, now principal of Bolton Academy, transformed his school with iPads – giving them out and asking pupils what to do with them. He works with Apple now…

David Mitchell (no, not that one), Deputy Headteacher in the Northwest, started a project called QuadBlogging for his Year 6 students (P7 in Scotland), whereby four organisations are involved – two schools and two other institutions, like MIT, like the Government – big organisations. Students get real-life, real-world feedback on their writing. They saw significant increases in their writing quality. That is a great benefit of educational technology – your audience can be as big or small as you want. It's a nice safe contained forum for children's writing.

Simon Blower had an idea called "Lend Me Your Writing" and crowdfunded Pobble – a site where teachers can share examples of student work.

So those are three examples of pedagogically-driven technology projects and changes.

And now we are going to enter Kahoot.it…

The first question is about a free VLE – Edmodo… It's free except for analytics, which is a paid-for option.

Next up… This is a free behaviour management tool. The "Class Story" function has recently been added… That's Class Dojo.

Next… A wealth of free online courses, primarily aimed at science, maths and computing… Khan Academy. A really famous resource now. It came about when Salman Khan was asked by a cousin for maths homework help… He made YouTube videos… Very popular, and now a global organisation with a real range of videos from teachers. No adverts. Again free…

And next… an adaptive learning platform with origins in the "School of One" in NYC. That's Knewton. School of One is an interesting school which has done away with the traditional one-to-many classroom… They use Knewton, which suggests the next class, module, task, etc. This is an "Intelligent Tutoring System", which I am skeptical of, but there is a lot of interest from publishers etc. It is all around personalised learning… But it is all data driven… I have issues with thinking of kids as data-producing units.
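To make the "suggests the next class, module, task" idea concrete, here is a minimal sketch of adaptive sequencing. Knewton's actual algorithms are proprietary; the function names, the simple mastery-update rule, and the threshold below are all illustrative assumptions, not Knewton's API.

```python
# Illustrative sketch of adaptive sequencing: track a 0-1 mastery estimate
# per topic, nudge it after each answer, and suggest the weakest unmastered
# topic next. All names and numbers here are assumptions for illustration.

def update_mastery(mastery, correct, rate=0.3):
    """Move the mastery estimate toward 1 on a correct answer, toward 0 on a miss."""
    target = 1.0 if correct else 0.0
    return mastery + rate * (target - mastery)

def next_task(mastery_by_topic, threshold=0.8):
    """Suggest the weakest topic still below the mastery threshold."""
    open_topics = {t: m for t, m in mastery_by_topic.items() if m < threshold}
    if not open_topics:
        return None  # everything mastered - move the student on
    return min(open_topics, key=open_topics.get)

mastery = {"fractions": 0.9, "decimals": 0.4, "percentages": 0.6}
print(next_task(mastery))  # the weakest unmastered topic
mastery["decimals"] = update_mastery(mastery["decimals"], correct=True)
```

Even this toy version shows why the speaker's "kids as data-producing units" worry arises: the whole loop runs on captured answer data, with no channel for students to report on their own learning.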

Next question… An Office 365 tool that allows for the creation of individual and class digital notebooks – OneNote. It's a killer app that Microsoft invests in a lot.

And Patrick is our Kahoot winner (I'm second!)! Now, I use Kahoot in training sessions… It's fun once… Unless everyone uses it throughout the day. It's important that students don't just experience the same thing again and again, that you work as a learning community to make sure that you are using tools in a way that stays interesting, that varies, etc.

So, what’s happening now in schools?

  • Mobility: BYOD, contribution, cross-platform agility
  • Office365/Google/iCloud
  • VLE/LMS – PLE/PLN – for staff and students
  • Data and tracking

So with mobility we see a growth in Bring Your Own Device… That brings a whole range of issues around esafety and around infrastructure. It's not just students' own devices, but also, increasingly, a kind of hire-purchase scheme for students and parents. That's a financial pressure – schools are financially pressured and this is just a practical issue. One issue that comes up repeatedly is cross-platform agility – phones, tablets, laptops. And there is discussion of bringing in keyboards, mice, and traditional set-ups… Keyboard skills are being seen as important again in the primary sector. The benefit of mobile devices is collaboration, the idea of the main screen allowing everyone to be part of the classroom… You don't need expensive software; you can use, for example, cheap Reflector mirroring software.

Apps… Some are brilliant, some are dreadful… Management of apps and mobile device management has become a huge industry… Working with technicians to support getting apps onto devices… How do you do volume purchasing? And a lot of apps are one- or two-hit propositions… You don't want the same app every week for one task… There is a trade-off between how useful an app is and the staff time needed to get it in place. We also have the issue of the student journey. Tools like Socrative and Nearpod let you push information to devices.

But we are going to look at/try Plickers now… What that does is use one device – the teacher's mobile app – plus printed codes I can make up (we've all been given one today) that can be laminated and handed out at the beginning of the year… We hold up our card with our chosen answer at the top… And the teacher's device is walked around to scan the room for the answers – a nice job for a student to do… So you can then see the responses… And the answer… I can see who got it wrong, and who got it right. I can see the graph of that….

We have a few easy questions to test this: 2+2 = (pick your answer); and how did you get here today? (mostly on foot!).

The idea is that it's a way to get higher-order questioning into a session; otherwise you just hear from the kids who put their hands up all the time. So that's Plickers… Yes, they all have silly names. I used to live in Iceland, where a committee meets to agree new names – the Icelandic word for computer means "witchcraft machine".

So, thinking about Office365/Google/iCloud… We are seeing a video about a school where pupils help promote, manage, code, and support the use of Office365 in the school. And how that's a way to get people into technology. These are students at Wymondham High in Norfolk – all real students. That school has adopted Office365. Both Office365 and Google offer educational environments. One of the reasons that schools err towards Office365 is the five free copies that students get – which cover the several locations and machines they may use at home.

OneNote is great – you can drag and drop documents… you can annotate… I use it with readings, with feedback from tutors. Why it's useful for students is the facility to create Class Notebooks, where you add classes and notebooks. You can set up a content library that students can access and use. You can also view all of the students' notebooks in real time. In schools I work in we no longer have planners; instead we have a shared class notebook – then colleagues can see and understand planning.

Other new functionality is "Classroom", where you can assign classes and assignments… It's a new thing that brings some VLE functionality, but is limited in that grades run 0-100. And you can set up forms as well – again in preview right now, but coming. Feedback goes into a CSV file in Excel.

The other thing that is new is Planner – a project planning tool to assign tasks, share documents, set up groups.

So, Office 365 is certainly the tool most secondary schools I work with use.

The other thing that is happening in schools right now is the increasing use of data dashboards and tracking tools – especially in secondary schools – and that is concerning, as it's fairly uncritical. There is a tool called Office Mix which lets you create tracked content in PowerPoint… Not sure if you have access here, but you can use it at home.

Other data-in-schools tools include Power BI… Schools are using these for e.g. attainment outcomes. There is a free schools version of this tool (it used to be too expensive). My concern is that it is not looking at what has impact in terms of teaching and learning. It's focused on the summative, not the actual teaching and learning, not on students reporting back to teachers on their own learning. Hattie's work on self-reported grades tells us that students can set expectations and goals, and understand rubrics for self-assessment. There is rich and interesting work to be done on using data in meaningful ways.

In terms of what's coming… This was supposed to be by 2025, then 2020, maybe sooner… The Education Technology Action Group suggests online learning as an entitlement, better measures of performance, new and emerging teaching and learning, wearables, etc.

Emerging EdTech includes Augmented Reality. It's a big thing I do… It's easy, but it excites students… It's a digital overlay on reality… So my two-year-old goddaughter has a colouring-in book that is augmented reality – you can see a 3D virtual dinosaur coloured as per your image. And she asked her dad to send me a picture of her with a dinosaur. Other fun stuff… But where is the learning outcome here? Well, there is a tool called Aurasma… Another free tool… You create a new Aura trigger image – it can be anything – and you choose your overlay… So I said I wanted the words on the paper converted into French. It's dead easy! We get small kids into this and can put loads of hidden AR content around the classroom; you can do it on t-shirts – to show the inner workings of the body, for instance. We've had Year 11s bring Year 7 textbooks to life for them – learning at both ends of the spectrum.

The last thing I want to talk about is micro:bit. This is about coding. In England and Wales coding is now a compulsory part of the curriculum. All Year 7 students are being issued a micro:bit, and students are now doing all sorts of creative things. The Young Rewired State project runs every summer, with participants coming to London to have their code assessed – the winners were 5 and 6 year olds. So pupils will come to you with knowledge of coding – but they aren't digital natives, no matter what anyone tells you!

Q&A

Q1 – Me) I wanted to ask about equality of access… How do you ensure students have the devices or internet access at home that they need to participate in these activities and tools – like the Office365 usage at home for instance. In the RSE Digital Participation Inquiry we found that the reality of internet connectivity in homes really didn’t match up to what students will self-report about their own access to technology or internet connections, there is such baggage associated with not having internet access to access to the latest technologies and tools… So I was wondering how you deal with that, or if you have any comments on that.

A1) With the contribution schemes that schools have for devices… Parents contribute what they can, school covers the rest… So that can be 50p or £1 per month, it doesn’t need to be a lot. Also pupil premium money can be used for this. But, yes, parental engagement is important… Many students have 3G access not fixed internet for instance and that has cost implications… some can use dongles supplied by schools but just supporting students like this can cost 15k/yr to support for a small to medium sized cohort. There is some interesting stuff taking place in new build schools though… So for instance Gaia in Wales are a technology company doing a lot of the new build hardware/software set up… In many of those schools there is community wifi access… a way around that issue of connectivity… But that’s a hard thing to solve.

Q1 – Me) There was a proposal some years ago from Gordon Brown’s government, for all school aged children to have government supported internet access at home but that has long since been dropped.

Q2) My fear with technologies is that if I learn one, it's already out of date. And also there are learners who are not motivated to engage with tools they haven't used before… I enjoyed these tools, they're natty…

A2) Those are my "sweet shop" tools… Actually Office365/Google, or things like Moodle, are the bread-and-butter tools. These are fun one-off apps… They are pick-up-and-go stuff… but it's getting the big tools working well that matters. Ignore the sweets if you need or want to… The big stuff matters.

And with that Velda is closing with great thanks to our speakers today, to colleagues in IAD, and to Daphne Loads and colleagues. Please do share your feedback and ideas, especially for the next forum!

May 12 2016
 
Participants networking over lunch at eLearning@ed

Last week I was delighted to be part of the team organising the annual eLearning@ed Conference 2016. The event is one of multiple events and activities run by and for the eLearning@ed Forum, a community of learning technologists, academics, and those working with learning technologies across the University of Edinburgh. I have been Convener of the group since last summer so this was my first conference in this role – usually I’m along as a punter. So, this liveblog is a little later than usual as I was rather busy on the day…

Before going into my notes I do also want to say a huge thank you to all who spoke at the event, all who attended, and an extra special thank you to the eLearning@ed Committee and Vlad, our support at IAD. I was really pleased with how the event went – and feedback has been good – and that is a testament to the wonderful community I have the privilege of working with all year round here at Edinburgh.

Note: Although I have had a chance to edit these notes they were taken live so just let me know if you spot any errors and I will be very happy to make any corrections. 

The day opened with a brief introduction from me. Obviously I didn’t blog this but it was a mixture of practical information, enthusiasm for our programme, and an introduction to our first speaker, Melissa Highton:

Connecting ISG projects for learning and teaching – Melissa Highton (@honeybhighton), Director: Learning, Teaching and Web (LTW), Information Services.

Today is about making connections. And I wanted to make some connections on work that we have been doing.

I was here last year and the year before, sharing updates on what we've been doing. It's been a very good year for LTW. It has been a very busy year for open, inspired by some of the student work seen last year. We have open.ed launched, the new open educational resources policies, we have had the OER conference, we have open media, and we have had some very bold moves by the library – a move to make digital images from the library open by default. That offers opportunities for others, and for us.


Extract from the Online Learning Consortium’s 2016 Infographic (image copyright OLC 2016)

There is evidence from the US (referencing the EdTech: a Catalyst for Success section of the Online Learning Consortium 2016 Infographic), with students reporting increased engagement with course materials, with professors, and with fellow students. And there is also a strong interest in digital video. MediaHopper launches fully very soon, and we are taking a case to the Knowledge Strategy Committee and the Learning and Teaching Committee to invest further in lecture capture, which is heavily used and demanded. And we need to look at how we can use that content, how it is being used. One of the things that struck me at LAK was the amount of research being done on the use of audio-visual material, looking at how students learn from video, how videos are used, how they are viewed. Analytics around effective video for learning is quite interesting – and we'll be able to do much more with that when we have these better systems in place. And I've included an image of Grace Hopper, who we named MediaHopper after.


Melissa Highton speaking at eLearning@ed 2016

Talking of Learning Analytics I’m a great fan of the idea that if a thing is worth doing, it’s worth doing a 2×2 matrix. So this is the Learning Analytics Map of Activities, Research and Roll-out (LAMARR – a great mix of Hollywood screen icon, and the inventor of wifi!), and there are a whole range of activities taking place around the university in this area at the moment, and a huge amount of work in the wider sector.

We are also the only university in the UK with a Wikimedian in Residence. Wikipedia is a place entirely curated by those with an interest in the world, and there is a real digital literacy skill for our students, and for us, in understanding how information is created and contested online, and how it becomes part of the internet – that's something worth thinking about for our students. I have a picture here of Sophia Jex-Blake; she was part of the inspiration for our first Wikipedia edit-a-thon, on women in science. Our Wikimedian is with us for just one year, so do make use of him. He's already worked on lots of events; he's very busy, but do talk to him about a possible event, about the work being done, or about work you want to do.

Here for longer than one year we have Lynda.com, an online collection of training videos which the University has signed up to for three years, and which will be available through your University login. Do go and explore it now – you will have University of Edinburgh access from September. The content in there can be curated into playlists, via Learn, etc.

So, Wikipedia for a year, Lynda.com for three years, MediaHopper here now, and open increasingly here.

Highlights from recent conferences held in Edinburgh, chaired by Marshall Dozier

Marshall: Conferences are such an opportunity to make a connection between each other, with the wider community, and we hope to fold those three big conferences that have been taking place back into our own practice.

OER16 Open Culture Conference – Lorna Campbell (@lornamcampbell), Open Education Resources Liaison for Open Scotland, LTW.

This was the 7th OER conference, and the first to take place in Edinburgh. It was chaired by myself and Melissa Highton. Themes included: the strategic advantage of open, creating a culture of openness, and the reputational challenges of "open-washing"; converging and competing cultures of open knowledge, open source, open content, open practice, open data and open access; hacking, making and sharing; openness and public engagement; and innovative practices in cultural heritage contexts, which I was particularly pleased to see us get good engagement from.

There was originally a sense that OER would die out, but actually it is just getting bigger and bigger. This year's OER conference was the biggest yet, and that's because of support and investment from those who, like the University of Edinburgh, see real value in openness. We had participants from across the world – 29 countries – despite this being essentially a UK-based conference. And we had around a 50/50 gender split – no all-male panels here. There is no external funding around open education right now, so we had to charge, but we did ensure free and open online participation for all – keynotes live-streamed to the ALT channel; Radio #EDUtalk @ OER16, with live streaming of keynotes and interviews with participants and speakers from the conference (those recordings are hugely recommended); and a busy and active Twitter channel. We had a strong Wikimedia presence at OER16, with editing training, demonstrations, and an "ask a Wikimedian" drop-in clinic, and people found real value in that.


Lorna Campbell speaking about OER16 at eLearning@ed 2016

We also had a wide range of keynotes and I’m just going to give a flavour of these. Our first was Catherine Cronin, National University of Ireland, Galway, who explored different definitions of openness, looking at issues of context and who may be excluded. We all negotiate risk when we are sharing, but negotiating that is important for hope, equality, and justice.

In the year of the 400th anniversary of Shakespeare's death we were delighted to have Shakespeare scholar Emma Smith, who had a fantastic title: Free Willy: Shakespeare & OER. In her talk she suggested teaching is an open practice now, that "you have to get over yourself and let people see what you are doing".

John Scally’s keynote talked about the National Library of Scotland’s bold open policy. The NLS’ road to openness has been tricky, with tensions around preservation and access. John argued that the library has to move towards equality, and that open was a big part of that.

Edupunk Jim Groom of Reclaim Hosting has quite a reputation in the sector, and he was giving his very first keynote in the UK. Jim turned our attention from open shared resources towards open tech infrastructure – working at individual scale, but making use of cloud, networked resources, which he sees as central to sustainable OER practice.

The final keynote was from Melissa Highton, with her talk Open with Care. She outlined the vision and policy of UoE. One idea introduced by Melissa was “technical and copyright debt”, the costs of not doing licensing, etc. correctly in the first place. IT Directors and CIOs need to be persuaded of the need for investment in OER.

It is difficult to summarise such a diverse conference, but there is growing awareness that openness is a key aspect that underpins good practice. I wanted to point to Stuart Allen's blog. Stuart is a student on the MSc in Digital Education, and he did a wonderful summary of the conference.

Next year's conference has the theme of Open and Politics and will be co-chaired by Josie Fraser and Alek Tarkowski, chair of CC in Poland (our first international co-chair).

Learning@Scale 2016 – Amy Woodgate, Project Manager – Distance Education Initiative (DEI) & MOOCs, LTW.

I am coming at this from a different perspective here, as participant rather than organiser. This conference is about the intersection between informatics approaches and education. I was interested in the degree to which it was informed by informatics, and it really seems to flag a need to interrogate what we do in terms of learning analytics and educational approach. So my presentation is a kind of proposal…

We have understood pedagogy for hundreds of years, we have been doing a huge amount of work on digital pedagogy, and the MSc in Digital Education is leading in this area. We have environments for learning, and we have environments at scale, including MOOCs, which were very evident at L@S. At the University of Edinburgh we have lots of digitally based learning environments: ODL, MOOCs, and the emergence of UG credit-bearing online courses. But there is much more opportunity to connect these things for research and application – bringing together pedagogy and environments at scale.

The final keynote at L@S was from Ken Koedinger, of Carnegie Mellon University. He suggested that every learning space should be a learning lab. We shouldn't just apply theory, but make building, doing, providing an evidence base, and thinking part of our practice. He talked about collecting data, testing that data, understanding how to use data for continuous improvement. We are a research-led institution; we have amazing opportunities to blend those things. But perhaps we haven't yet fully embraced that Design, Deploy, Data, Repeat model. And my hope is that we can do something more together. We've done MOOCs for four years now, and there are so many opportunities to use the data, to get messy in the space… We haven't been doing that, but then no-one has. What was hard about the conference for me was that lots of it was about descriptive stats – we can see that people have clicked a video, but that isn't connected back to anything else. And what was interesting to me was the articulation into physical environments – picking up your pen many times is not meaningful. So many Learning Analytics data sources are what we can capture, not necessarily what is meaningful.

The keynote had us answer some questions about knowing when students are learning. You can see when people view or like a video, but there is a very low correlation between liking and learning… And for me that was the most important point of the session. That was really the huge gap: we need more proactive research and engagement, for meaningful measures of learning – not just what we can measure.
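The "liking is not learning" point can be made concrete by correlating an engagement signal with an outcome measure. The snippet below is a sketch only: the numbers are invented for illustration, not data from the keynote.

```python
# Pearson correlation between a simple engagement signal (liked a video?)
# and an outcome measure (end-of-course score). The data is invented purely
# to illustrate the calculation; it is not from the L@S keynote.
from math import sqrt

def pearson(xs, ys):
    """Pearson product-moment correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

liked_video = [1, 1, 0, 1, 0, 1, 0, 1]            # did the student like the video?
exam_score = [55, 90, 60, 52, 88, 61, 70, 58]     # end-of-course score
print(round(pearson(liked_video, exam_score), 2))
```

A coefficient near zero (or negative) on real course data would be exactly the gap the keynote described: the captured signal says little about what was actually learned.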

Mike Sharples, OU, was also a keynote at L@S, and he talked about learning at scale, how we can bring pedagogy into those spaces, and the intersection of diversity, opportunity and availability. One of the things FutureLearn is exploring is the notion of citizen inquiry – people bring their own research initiatives (as students) and, almost like Kickstarter, engage the community in those projects. It will be interesting to see what happens, but there is an interesting question of how we utilise the masses, the scale of these spaces. We need you as the community working with us to start questioning how we can get more out of these spaces. Mike's idea was that we have to rethink our notion of effective pedagogy, and that ensuring it is sustainable is a key idea.

Working backwards then, there were many, many papers submitted, and not all were accepted, but you can view the videos of keynotes on Media Hopper, and there were posters for those not able to present as well. The winner of the best paper award was "A Civic Mission of MOOCs" – which found that there was a true diversity of people engaged in political MOOCs, and they weren't all trolly; there was a sense of "respectful disagreement". There are a lot of papers that we can look at, but none of these findings can be applied without critical reflection – though there is much that can be done there.

It was interesting to hear Lorna's comments about gender balance. At L@S there were great female speakers, but they were only 15% of the whole. That reflected the computer science angle and bias of the event, and it felt like there was a need for the humanities to be there – I think that's an aspiration for the next one: to submit more papers, and get those voices as part of the event.

Although this is perhaps a slightly messy summary of the event, I wanted to leave you with the idea that we should be using what we do here at Edinburgh, with what we have available here, to put out a really exciting, diverse range of work to present at next year's L@S!

So, what do people think about that idea of hacking up our learning spaces more? Thinking more about integrating data analysis etc, and having more of a community of practice around online pedagogies for learning@scale.

Amy Woodgate speaking about Learning@Scale 2016

Amy Woodgate speaking about Learning@Scale at elearning@ed 2016

Q&A

Q1) I think that issue of measuring what we can measure is a real issue right now. My question here is about adapting our approach for international students – they come in and pay huge fees, and there are employers pushing for MOOCs instead… But then we still want that income… So how does that all work together?

A1) I don’t think learning at scale is the only way to do teaching and learning, but it is an important resource, and offers new and interesting ways of learning. I don’t feel that it would compromise that issue of international students. International students are our students, we are an international community on campus, embracing that diversity is important. It’s not about getting rid of the teacher… There is so much you can do with pedagogies online that are so exciting, so immersive… And there is more we can get out of this in the future. I find it quite awkward to address your point though… MOOCs are an experimentation space I think, for bringing back into core. That works for some things, and some types of content really work at scale – adaptive learning processes for instance – lots of work up front for students then to navigate through. But what do others think about using MOOCs on campus…

Comment, Tim) I think for me we can measure things, but that idea of how those actions actually relate to the things that are not measured… No matter how good your VLE, people will do things beyond it. And we have to figure out how we connect and understand how they connect.

Q2, Ruby) Thank you very much for that. I was just a little bit worried… I know we have to move away from simplistic descriptions of "this measure means this thing". But on one slide there was an implication that learning can be measured through testing. And I don't think that is necessarily true or helpful. Liking CAN be learning. And there is a lot of complexity around test scores.

A2) Yes, that chart was showing that viewing a particular video hadn't resulted in better learning uptake at the end of the course… But absolutely, we do need to look at these things carefully…

Q3) At the recent BlackBoard conference there was the discussion of credit bearing MOOCs, is there any plan to do that now?

A3) This is something we could do, of course – taking a MOOC into a credit-bearing UG course, where the MOOC is about content. What becomes quite exciting is moving out and, say, doing the kind of thing the MSc DE did with eLearning and Digital Cultures – making connections between the credit-bearing module and the MOOC in interesting and enriching ways. The future isn't pushing students over to the MOOC, but taking learning from one space to another, and seeing how that can blend. There are some interesting conversations around credit alliances, like a virtual Erasmus, around credit like summer school credit. But then we fall back on universities wanting to do exams – though we have a strong track record of online MScs not relying on written exams; not all institutions are as progressive right now.

Q4, Nigel) I’m in Informatics, and am involved in getting introductory machine learning course online, and one of the challenges I’m facing is understanding how students are engaging, how much. I can ask them what they liked… But it doesn’t tell me much. That’s one issue. But connecting up what’s known about digital learning and how you evaluate learning in the VLEs is good… The other thing is that there is a lot of data I’d like to get out of the VLE and which to my knowledge we can’t access that data… And we as data scientists don’t have access.

Comment, Anne-Marie Scott) We are still learning how to do that best but we do collect data and we are keen to see what we can do. Dragan will talk more about Learning Analytics but there is also a UoE group that you could get involved with.

Q5, Paul) That was fascinating, and I wish I’d been able to make it along… I was a bit puzzled about how we can use this stuff… It seems to me that we imagine almost a single student body out there… In any programme we have enthusiastic students desperate to learn, no matter what; in the middle we have the quite interested, may need more to stay engaged; and then there are people just there for the certificate who just want it easy. If we imagine we have to hit all of the audiences in one approach it won’t work. We are keen to have those super keen students. In medicine we have patient groups with no medical background or educational background, so motivated to learn about their own conditions… But then in other courses, we see students who want the certificate… I think that enormous spectrum give us enormous challenges.

A5) There is an interesting pilot in GeoSciences on Adaptive Learning, trying to address both the interested and the struggling students. Maths and Physics do a lot with additional resources from external sites – e.g. MOOCs – in a curated list from academics, that augment the core for those that want to learn more, while other students can stick to the basics. There was an interesting paper on cheating in MOOCs, which did analysis on multiple accounts and IP addresses, and toggling between accounts… They identified “harvester” and “master” account pairs by looking at clusters… Master accounts showed perfect learning; harvesting accounts were poorer; then there were the ones in the middle… The middle is the key part… That’s where energy should be in the MOOC.
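The harvester/master analysis mentioned is, in outline, a pairing problem over account activity logs. A minimal sketch of that idea – the data shapes, function names and thresholds here are invented for illustration, not taken from the paper:

```python
# Hypothetical sketch of multiple-account ("harvester/master") detection:
# pair up accounts that share IP addresses, then flag pairs where one
# account scores highly while the other mostly probes for answers.
from collections import defaultdict
from itertools import combinations

def find_suspect_pairs(events, scores, min_shared_ips=3, score_gap=0.5):
    """events: iterable of (account, ip); scores: {account: fraction correct}."""
    ips_by_account = defaultdict(set)
    for account, ip in events:
        ips_by_account[account].add(ip)
    pairs = []
    for a, b in combinations(ips_by_account, 2):
        shared = ips_by_account[a] & ips_by_account[b]
        if len(shared) < min_shared_ips:
            continue  # not enough co-located activity to be interesting
        hi, lo = (a, b) if scores.get(a, 0) >= scores.get(b, 0) else (b, a)
        if scores.get(hi, 0) - scores.get(lo, 0) >= score_gap:
            pairs.append((hi, lo, len(shared)))  # (master, harvester, evidence)
    return pairs
```

A real analysis would also look at the timing of answer submissions (the “toggling” between accounts), but the shared-IP, divergent-score signal is the core of it.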

Q6) I was intrigued by big data asset work, and getting more involved… What are tensions with making data openly available… Is it competition with other universities…

A6) That’s part of a project on Learning Analytics data policy that Dragan and Jeff Haywood have been leading… MOOCs include personally identifiable data; you can strip it, but that requires work. The University has a desire to share data, but we’re not there yet with an easy-to-access framework for engaging with the data. To be part of that, it’s part of the bigger Learning Analytics process.

LAK’16 Learning Analytics & Knowledge Conference – Professor Dragan Gasevic (@dgasevic), Chair in Learning Analytics and Informatics, Moray House School of Education & School of Informatics

The Learning Analytics and Knowledge Conference, LAK’16, took place in Edinburgh last week. It was in its sixth edition. It started in Canada as a response to several groups of people looking at data collected in different types of digital environments, and also the possibility to merge data from physical spaces, instruments, etc. It attracted a diverse range of people from educational research, machine learning, psychology, sociology, policy making etc. In terms of organisation we had wonderful support from the wonderful Grace Lynch and two of my PhD students, who did a huge amount. I also had some wonderful support from Sian Bayne and Jeff Haywood in getting this set up! They helped connect us to others, within the University and throughout the conference. But there are many others I’d like to thank, including Amy and her team who streamed all four parallel sessions throughout the conference.

In terms of programme the conference has a research stream and a practitioner stream. Our chairs help ensure we have a great programme – we have three chairs for each stream. They helped us ensure we had a good diversity of papers and audiences, and vendors. We have those streams to attract papers, but we deliberately mix them – practice and research sessions are combined and share sessions… And we did break all records this time. This was only the second conference outside North America, and most of our participants are based there, but we had almost double the submissions this year. These issues are increasingly important, and the conference is an opportunity to critically reflect on them. Many of our papers were very high in quality, and we had a great set of workshops proposed – selecting those was a big challenge and only 50% made it in… For non computer scientists the acceptance ratio maybe isn’t a big deal… But for computer scientists it is a crucial thing. Here we accepted about 30% of papers… Short papers were particularly competitive – this is because the field is maturing, and people want to see more mature work.

Dragan Gasevic speaking about LAK'16 at eLearning@ed 2016.

We had participants from 35 countries, across our 470 participants – 140 from the US, 120 from the UK, and then 40 from Australia. Per capita Australia was very well represented. But one thing that is a little disappointing is that other European countries only had 3 or 4 people along each; that tells us something about institutional adoption of learning analytics, and research there. There is impressive learning analytics work taking place in China right now, but little from Africa. In South America there is one hub of activity that is very good.

The workshops addressed a wide range of topics:

  • Learning design and feedback at scale.
  • Learning analytics for workplace and professional learning – definitely a theme, with lots of data being collected but often private and business-confidential work; that’s a tension (the EU sees analytics as public data).
  • Learning analytics across physical and digital spaces – using broader data and avoiding the “streetlight effect”.
  • Temporal learning analytics – trying to see how learning processes unfold… Students are not static black boxes… They change decisions, study strategies and approaches based on feedback etc.
  • An interesting workshop on IMS Caliper.
  • A huge theme and workshop on ethical and privacy issues, and another on learning analytics for learners.
  • A focus on video, and on smart environments.
  • Opportunities for educational researchers to engage with data – through data mining skills sessions to open conversations with informaticians.
  • A “Failathon” – to try ideas, and talk about failed ideas.

We also had a hackathon with Jisc/Apereo… They issued an Edinburgh Statement for learning analytics interoperability. Do take a look, and add your name, to address the critical points…

I just want to highlight a few keynotes: Professor Mireille Hildebrandt talked about the law and learning as a machine, around privacy, data and bringing in issues including the right to be forgotten. The other keynote I wanted to talk about was Professor Paul A. Kirschner on learning analytics and policy – a great talk. And the final keynote was Robert Mislevy, who talked about the psychometric front of learning analytics.

Finally two more highlights, we picked two papers out as the best:

  • Privacy and analytics – it’s a DELICATE issue. A checklist for trusted learning analytics – Hendrik Drachsler and Wolfgang Greller.
  • When should we stop? Towards Universal approach – details of speakers TBC

More information on the website. And we have more meetings coming up – we had meetings around the conference… And have more coming up with a meeting with QAA on Monday, session with Blackboard on Tuesday, and public panel with George Siemens & Mark Milliron the same day.

Q&A

Q1) Higher Education is teaching, learning and research… This is all Learning Analytics… So do we have Teaching Analytics?

A1) Great point… Learning analytics is about learning, we shouldn’t be distracted by toys. We have to think about our methods, our teaching knowledge and research. Learning analytics with pretty charts isn’t necessarily helpful – sometimes even detrimental – to learners. We have to look at instructional designs, to support our instructors, to use learning analytics to understand the cues we get in physical environments. One size does not fit all!

Marshall) I set a challenge for next year – apply learning analytics to the conference itself!

Student-centred learning session, chaired by Ruby Rennie

EUSA: Using eLearning Tools to Support and Engage Record Numbers of Reps – Tanya Lubicz-Nawrocka (@TanyaLubiczNaw), Academic Engagement Coordinator, EUSA; Rachel Pratt, Academic Representation Assistant, EUSA; Charline Foch (@Woody_sol), EUSA; and Sophie McCallum, Academic Representation Assistant, EUSA.

Tanya opened the presentation with an introduction to what EUSA: the Edinburgh University Students Association is and does, emphasizing the independence of EUSA and its role in supporting students, and supporting student representatives… 

Rachel: We support around 2000 (2238) rep roles across campus per year, growing every year (actually 1592 individuals – some are responsible for several courses), so we have a lot of people to support.

Sophie: Online training is a big deal, so we developed an online training portal within Learn. That allows us to support students on any campus, and our online learners. Students weren’t always sure about what was involved in the role, and so this course is about helping them to understand what their role is, how to engage etc. And in order to capture what they’ve learned we’ve been using Open Badges, for which over to Tanya…

Tanya Lubicz-Nawrocka speaking about EUSA’s use of Learn and Open Badges at elearning@ed 2016

Tanya: I actually heard about open badges at this very conference a couple of years ago. These are flexible, free, digital accreditation. They are full of information (metadata) and can be shared and used elsewhere in the online world. These badges represent skills in key areas: Student Development badges (purple), Research and Communication badges (pink) and ? (yellow).
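For readers unfamiliar with the format: an Open Badge carries its accreditation as structured metadata. A rough sketch of a hosted badge assertion following the Open Badges assertion format – the names, URLs and dates below are invented placeholders, not EUSA’s actual badges:

```python
# Sketch of the metadata "baked" into an Open Badge: a badge class
# (what the badge means) plus an assertion (who earned it, when, and
# how to verify it). All values here are hypothetical.
import json

badge_class = {
    "name": "Student Development",
    "description": "Awarded to course reps for completing online training.",
    "image": "https://badges.example.ac.uk/student-development.png",
    "criteria": "https://badges.example.ac.uk/student-development/criteria",
    "issuer": "https://badges.example.ac.uk/issuer.json",
}

assertion = {
    "uid": "rep-2016-0001",
    "recipient": {"type": "email", "hashed": False,
                  "identity": "rep@example.ac.uk"},
    "badge": "https://badges.example.ac.uk/student-development.json",
    "verify": {"type": "hosted",
               "url": "https://badges.example.ac.uk/assertions/rep-2016-0001"},
    "issuedOn": "2016-05-01",
}

print(json.dumps(assertion, indent=2))
```

Because the evidence and criteria travel with the badge as metadata, a badge pasted into a LinkedIn profile or backpack remains independently verifiable.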

Tanya shows the EUSA Open Badges at elearning@ed 2016

There have been huge benefits of the badges. There are benefits for students in understanding all aspects of the role, encouraging them to reflect on and document their work and success – and those helped us share their success, to understand school level roles, and to understand what skills they are developing. And we are always looking for new ways to accredit and recognise the work of our student reps, who are all volunteers. It was a great way to recognise work in a digital way that can be used on LinkedIn profiles.

There were several ways to gain badges – many earned an open badge for online training (over 1000 earned); badges were earned for intermediate training – in person (113 earned); and badges were also earned by blogging about their successes and development (168 earned).

And the badges had a qualitative impact around their role and change management, better understanding their skills and relationships with their colleagues.

Sophie McCallum speaking about EUSA’s work on training and Open Badges at elearning@ed 2016

Rachel: Looking at the learning points from this. In terms of using (Blackboard) Learn for online functionality… For all our modules to work the best they can, 500 users is the maximum we could have. We have two Learn pages – one for CSE (College of Science & Engineering), one for CHSS (College of Humanities and Social Sciences); they are working but we might have to split them further for best functionality. We also had challenges with uploading/bulk uploading UUNs (the University personal identifiers) – one wrong UUN in several hundred causes the whole upload to fail. Information Services helped us with that early on! We also found that surveys in Learn are anonymous – helpful for ungraded reflection really.

In terms of Open Badges the tie to an email address is a challenge. If earned under a student email address, it’s hard to port over to a personal email address. Not sure how to resolve that but aware of it. And we also found loading of badges from “Backpack” to sites like LinkedIn was a bit tedious – we’ll support that more next year to make that easier. And there are still unknown issues to be resolved, part of the Mozilla Open Badges environment more broadly. There isn’t huge support online yet, but hopefully those issues will be addressed by the bigger community.

Using eLearning tools has helped us to upscale, train and support record numbers of Reps in their roles; it has helped us have a strong positive quantitative and qualitative impact in engaging reps; and it has shown the importance of having essential material and training online, with optional, in-person intermediate training and events. And it’s definitely a system we’ll continue to have and develop over the coming years.

Rachel Pratt talks about EUSA’s training approach, working with student representatives across the University, at elearning@ed 2016

Q&A

Q1) Have you had any new feedback from students about this new rep system… I was wondering if you have an idea of whether student data – as discussed earlier – is on the agenda for students?

A1 – Tanya) Students are very well aware of their data being collected and used, we are part of data analytics working groups across the university. It’s about how it is stored, shared, presented – especially the issue of how you present information when they are not doing well… Interested in those conversations about how data is used, but we are also working with reps, and things like the Smart Data Hacks to use data for new things – timetabling things for instance…

Q2) ?

A2) It’s a big deal to volunteer 50 hours of their time per year. They are keen to show that work to future employers etc.

Q3) As usual students and EUSA seem to be way ahead. How do you find out more about the badges?

A3) They can be clicked for more metadata – that’s embedded in it. Feedback has been great, and the blogposts have really helped them reflect on their work and share that.

SLICCs: Student-Led Individually Created Courses – Simon Riley, Senior Lecturer, MRC Centre for Reproductive Health

I’m Simon Riley, from the School of Medicine. I’m on secondment with the IAD and that’s why I’m on this. I’m coming to it from having worked on the student led component in medicine. You would think that medicine would be hugely confined by GMC requirements, but there is space there. But in Edinburgh there is about a year of the five year programme that is student led – spread across that time but very important.

Now, before speaking further I must acknowledge my colleague Gavin McCabe, Employability Consultant who has been so helpful in this process.

SLICCs are essentially a reflective framework, to explore skill acquisition, using an e-portfolio. We give students generic Learning Outcomes (LOs), which allow the students to make choices. Although it’s not clear how much students understand or engage with learning outcomes… We only get four or five per module. But those generic LOs allow students to immediately define their own aims and anticipated learning in their “proposal”. Students can take ownership of their own learning by choosing the LOs to address.

Simon Riley talks about SLICCs at eLearning@ed 2016

The other place that this can raise tensions is the idea of “academic rigor”. We are comfortable at assessing knowledge, and assessments that are knowledge based. And we assume they get those other graduate attributes by osmosis… I think we have to think carefully about how we look at that. Because the SLICCs are reflection on learning, I think there is real rigor there. But there has to be academic content – but it’s how they gain that knowledge. Tanya mentioned the Edinburgh Award – a reflective process that has some similarities but it is different as it is not for credit.

Throughout their learning experience students can make big mistakes, and recover from them. But if you get students to reflect early, and reflect on any issue that is raised, then they have the opportunity to learn from mistakes, to consider resilience, and to understand their own process for making and dealing with mistakes.

The other concern that I get is “oh, that’s a lot of work for our staff”… I was involved in Pilot 1 and I discovered that when giving feedback I was referring students back to the LOs they selected, their brief, the rubric, the key feedback was about solving the problem themselves… It’s relatively light touch and gives ownership.

So, here are three LOs… Around Analysis, Application, Evaluation. This set is Level 8. I think you could give those to any student, and ask them to do some learning, based on that, and reflect on it… And that’s across the University, across colleges… And building links between the colleges and schools, to these LOs.

So, where are we at? We had a pilot with a small number of students. It was for extra credit, totally optional. They could conduct their own learning, capture it in a portfolio, and reflect upon it. And there is a really tight link between the portfolio evidence and the reflective assignment. It was a fascinating set of different experiences… For instance one student went and counted river dolphins in the Amazon, but many were not as exotic… We didn’t want to potentially exclude anyone or limit relevance. Any activity can have an academic element to it if structured and reflected upon appropriately. Students have come back to us who did these at Level 8 in second year (the highest level Senate has approved)… They liked the process – the tutor, the discipline, the framework – more than the credit.

So we have just over 100 students signed up this summer. But I’m excited about doing this in existing programmes and courses… What we’ve done is created SCQF LOs at Levels 7, 8, 10 and 11, with resources to support reflection, a marking rubric, and board of studies documents. I am a course organiser – developing is great but often there isn’t time to do it… So what I’m trying to do is create all that material and then just let others take and reuse it… Add a little context and run with it. But I want to hold onto the common LOs; as long as you do that we can work between each other… And those LOs include the three already shown, plus LO4 on “Talent” and LO5 on “Mindset”, both of which specifically address graduate attributes. We’ve had graduate attributes for years but they aren’t usually in our LOs, just implicit. In this case the LOs are the graduate attributes.

Simon Riley gets very animated talking about Learning Outcomes at eLearning@ed 2016

What might they look like? Embedded in the curriculum, online and on campus. Level 11 on-campus courses are very interested, seems to fit with what they are trying to do. Well suited to projects, to skill acquisition, and using a portfolio is key – evidencing learning is a really useful step in getting engagement. And there is such potential for interdisciplinary work – e.g. Living Lab, Edinburgh CityScope. Summer schools also very interested – a chance for a student to take a holistic view of their learning over that period. We spend a lot of money sending students out to things – study abroad, summer schools, bursaries… When they go we get little back on what they have done. I think we need to use something like this for that sort of experience, that captures what they have learnt and reflected on.

Q&A

Q1) That idea of students needing to be able to fail successfully really chimes for me… Failures can be very damaging… I thought that the idea of embracing failure, and that kind of start up culture too which values amazing failure… Should/could failure be one of your attributes… to be an amazing failure…

A1) I think that’s LO5 – turning it into a talent. But I think you have touched on an important aspect of our experience. Students are risk averse, they don’t want to fail… But as reflective learners we know that failure matters, that’s when we learn, and this framework can help us address this. I look to people like Paul McC… You have students learning in labs… You can set things up so they fail and have to solve problems… Then they have to work out how to get there, that helps…

Q1) In the sporting world you have the idea of being able to crash the kit, to be able to learn – learning how to crash safely is an early stage skills – in skateboarding, surfing etc.

Keynote, supported by the Centre for Research in Digital Education: In search of connected learning: Exploring the pedagogy of the open web – Dr Laura Gogia MD, PhD (@GoogleGuacamole), Research Fellow for the Division of Learning Innovation and Student Success at Virginia Commonwealth University, USA, chaired by Jen Ross

Jen: I am really delighted to welcome Laura Gogia to eLearning@ed – I heard her speak a year or so ago and I just felt that great thing where ideas just gel. Laura has just successfully defended her PhD. She is also @GoogleGuacamole on Twitter and organises a Twitter reading club. And her previous roles have been diverse, most interestingly she worked as an obstetrician.

Laura: Thank you so much for inviting me today. I have been watching Edinburgh all year long, it’s just such an exciting place. To have such big conferences this year, there is so much exciting digital education and digital pedagogy work going on, you guys are at the forefront.

So I’m going to talk about connected learning – a simpler title than originally in your programme – because that’s my PhD title… I tried to get every keyword in my PhD title!

Laura Gogia begins her keynote with great enthusiasm at eLearning@ed 2016

Let me show you an image of my daughter looking at a globe; that look on her face is her being totally absorbed. I look for that look to understand when she is engaged and interested. In the academic context we know that students who are motivated, who see real relevance and benefit in their own work, learn more successfully. Drawing on Montessori and other progressive approaches, Mimi Ito and colleagues have developed a framework for connected learning that shapes those approaches for an online digital world.

Henry Jenkins and colleagues describe Digital Participatory Culture that is interactive, creative, about sharing/contributing and informal mentoring. So a connected teacher might design learning to particularly use those connections out to the wider world. George Siemens and colleagues talk about digital workflow, where we filter/aggregate; critique; remix; amplify – pushing our work out into a noisy world where we need to catch attention. Therefore connected learners and teachers find ways to embed these skills into learning and teaching experiences…

Now this all sounds good, but much of the literature is on K-12, so what does connected learning mean for Higher Education? In 2014 my institution embarked on an openly networked connected learning project, on learning experiences that draw from web structure and culture to (potentially) support connected learning and student agency, engagement and success. This is only 2 years in; it’s not about guaranteed success, but I’ll be talking about some work and opportunities.

So, a quick overview of VCU: we are an interesting, dynamic institution, with the top rated arts college; we have diverse students, a satellite campus in Qatar, and it’s an interesting place to be. And we also have VCU RamPages, an unlimited resource for creating webpages, that can be networked and extended within and beyond the University. About 16k websites have been created in the last year and a half. Many are student websites, blogs, and eportfolios. RamPages enable a range of experiences and expression but I’ll focus on one, Connected Courses.

Connected Courses are openly networked digital spaces, there are networked participatory activities – some in person, all taught by different teaching staff. And they generate authentic learning products, most of which are visible to the public. Students maintain their own blog sites – usually on RamPages but they can use existing sites if they want. When they enroll on a new course they know they will be blogging and doing so publicly. They use a tag, that is then aggregated and combined with other students posts…

So, this is an example of a standard (WordPress) RamPages blog… Students select the blog template, the header images, etc. Then she uses the appropriate tag for her course, which takes it to the course “Bloggregate”… And this is where the magic happens – facilitating the sharing, the commenting, and from a tutors point of view, the assessment.

Laura Gogia shows the VCU RamPages “Bloggregate” at eLearning@ed 2016

The openly networked structure supports student agency and discovery. Students retain control of their learning products during and after the course. And work from LaGuardia found students were more richly engaged in such networked environments. And students can be exposed to work and experience which they would not otherwise encounter – from different sites, from different institutions, from different levels, and from different courses.

Connected learning also facilitates networked participation, including collaboration and crowdsourcing, including social media. These tools support student agency – being interdependent and self regulated. They may encourage digital fluency. And they support authentic learning products – making joint contributions that lead to enriched work.

A few years ago the UCI bike race was in Virginia, and the University, in place of classes, offered a credited course that encouraged students to attend the bike race, collect evidence and share their reflections through the particular lens of their chosen course option. These jointly painted a rich picture; they were combined into authentic work products. Similarly VCU Field Botany collaboratively generated a digital field guide (the only one) to the James River Park System. This contributes back to the community. Similarly arts students are generating the RVArts site, on events, with students attending, reflecting, but also benefiting our community who share an interest in these traditionally decentralised events.

Now almost all connected courses involve blogging, which develops multimodal composition for digital fluency and multiple perspectives. Students include images and video, but some lecturers are embedding digital multimodal composition in their tasks. Inspired by DS106 at the University of Mary Washington, our #CuriousCoLab Creative Makes course asks students to process abstract course concepts and enhance their digital fluency. They make a concrete representation of the abstract concept – they put it in their blog with some explanation of why they have chosen to do it in their way. The students loved this… They spent more time, they thought more on these abstract ideas and concepts… They can struggle with those ideas… This course was fully online, with members of the public engaged too – and we saw both students and these external participants did the creative make, whether or not they did the reflective blogging (optional for outside participants).

In terms of final projects students are often asked to create a website. These assignments allow the students to work on topics that really talk to their heart… So, one module can generate projects on multitasking and the brain, another might talk about the impact of the bombing of Hiroshima.

I’ve talked about connected learning but now I’d like to turn to my research on student blogging and tweeting, and my focus on the idea that if students are engaged in Connected Learning we require the recognition and creation of connections with people, and across concepts, contexts and time. I focused on blogging and tweeting as these are commonly used in connected learning… I asked myself whether there was something about these practices that was special here. So I looked at how we can capture connected learning through student digital annotation… Looking at hyperlinks, mentions, etc. The things that express digital connection… Are they indicative of pedagogical connections too? I also looked at images and videos, and how students use images in their blog posts…

Because the Twitter API and WordPress allow capture of digital annotations… You can capture those connections in order to describe engagement. So, for the class I looked at there were weekly Twitter chats… And others beyond the course were open participants, very lightly auditing the course… I wanted to see how they interacted… What I saw was that open students were very well integrated with the enrolled students, and interacting… And this has instructional value too. Instructors used a similar social network analysis tool to ask students to reflect on their learning and engagement.
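A sketch of how such interaction capture might look in practice: pulling @-mentions out of tweets gathered for a class hashtag and counting directed edges, from which the integration of enrolled and open participants could be examined. The data shape below is an assumption for illustration; a real pipeline would read from the Twitter API.

```python
# Build a directed mention graph from class-hashtag tweets: each
# (author -> mentioned user) pair becomes a weighted edge, the raw
# material for the social network analysis described above.
import re
from collections import Counter

MENTION = re.compile(r"@(\w+)")

def mention_edges(tweets):
    """tweets: iterable of (author, text) -> Counter of (author, mentioned)."""
    edges = Counter()
    for author, text in tweets:
        for mentioned in MENTION.findall(text):
            if mentioned.lower() != author.lower():  # ignore self-mentions
                edges[(author, mentioned)] += 1
    return edges
```

Feeding these edges into any graph tool then makes it straightforward to ask, for example, whether open participants sit on the periphery or are woven into the enrolled students’ conversation.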

Laura Gogia speaking about linking and interaction patterns at VCU as part of her eLearning@ed 2016 keynote

Similarly I looked at psychology students and how they shared hyperlinks… You can see also how sources are found directly, and when they access them exclusively through their Twitter timeline… That was useful for discussing student practice with them – because those are two different processes really – whether reading fully, or finding through others’ sharing. And in a course where there is controversy over legitimate sources, you could have a conversation on what sources you are using and why.

I found students using hyperlinks to point to additional resources, traditional citations, embedded definitions, to connect their own work, but also to contextualise their posts – indicating a presumption of an external audience and of shaping content to them… And we saw different styles of linking. We didn’t see too many “For more info see…” blog posts pointing to e.g. NYT, CNN. What we saw more of was text like “Smith (2010) states that verbal and nonverbal communication have an impact” – a traditional citation… But “Smith 2010” and “nonverbal” were both linked. One goes where you expect (the paper); the other is a kind of “embedded description” – linking to more information but not cluttering their style or main narrative. You couldn’t see that in a paper-based essay. You might also see “As part of this course, I have created a framework and design structure for..”… Here “this course” links to the course – thinking about audience perhaps (more research needed) by signalling context; “framework” points to a personal structure, etc.
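The distinction drawn above between citation-style links and embedded descriptions lives in the anchor text, so an analysis pipeline needs both the href and the text it wraps. A small sketch using only the standard library – the HTML shape is assumed for illustration; real blog posts would be messier:

```python
# Extract (href, anchor_text) pairs from a blog post's HTML, so linking
# styles ("Smith 2010" citations vs "nonverbal" embedded descriptions)
# can be compared across posts.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.links = []  # list of (href, anchor_text)

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:  # only collect text inside an <a>
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links
```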

I also saw varying roles of images in blog posts: some were aesthetic, some were illustration, some as extension. Students making self-generated images and videos incorporated their discussion of that making process in their blog posts… I particularly enjoyed when students made their own images and videos.

Laura Gogia talks about the Twitter patterns and hyperlinking practices of her research participants in her eLearning@ed 2016 keynote

In terms of Twitter, students tweeted differently than they blogged. Now we know different platforms support different types of behaviours. What I noticed here was that students tweeted hyperlinks to contribute to the group, or to highlight their own work. So, hyperlink as contribution could be as simple as a link with the hashtag. Whilst others might say “<hyperlink> just confirms what was said by the speaker last week”… which is different. Or it might be, e.g. “@student might find this on financial aid interesting <hyperlink>”; the inclusion of a person’s name significantly increases the chances of engagement – it was significantly linked to 3+ replies.

And then we’d see hyperlinks as promotion, although we didn’t see many loading tweets with hashtags to target lots of communities.

So, my conclusions on Digital Annotations are that these are nuanced areas for research and discussion. I found that students seldom mentioned peer efforts – and that’s a problem, we need to encourage that. There is a lack of targeted contribution – that can be OK and trigger serendipity, but not always. We have to help students and ourselves to navigate, to ensure we get information to the right people. Also almost no images I looked at had proper attribution, and that’s a problem. We tell them to cite sources in the text; we have to do that with images too. And finally course design and instructor behaviour matter: students perform better when the structure works for them… So we have to find that sweet spot and train and support instructors accordingly.

I want to end with a quote from a VCU Undergraduate student. This was a listening tour, not a formal part of research, and I asked how they learned, how they want to learn… And this student talked about the need for learning to be flexible, connected, portable. Does everyone need an open connected space? No, but some do, and these spaces have great affordances… We need to play more here, to stay relevant and engaged with that wider world, to creatively play with the idea of learning!

Q&A

Q1) It was fantastic to see all that student engagement there, it seems that they really enjoy that. I was wondering about information overload and how students and staff deal with that with all those blogs and tweets!

A1) A fabulous question! I would say that students either love or hate connected courses… They feel strongly. One reason for that is the ability to cope with information overload. The first time we ran these we were all learning; the second time we put in information about how to cope with that early on… Part of the reason for these courses is to actually help students cope with that, understand how to manage it. It’s a big deal but part of the experience. You have to own up front why it’s important to deal with it, and then deal with it. From a Twitter perspective I’m in the process of persuading faculty to grade Twitter… That hasn’t happened yet… Previously it has been uncredited, or has been a credit for participation. I have problems with both models… With the no-credit voluntary version you get some students who are really into it… And they get frustrated with those that don’t contribute. The participation version is more structured… But also frustrating, for the same reasons that can be the case in class… So we are looking at social network analysis that we can do and embed in grading etc.

Comment – Simon Riley) Just to comment on overload… That’s half of what being a professional or an academic is. I’m a medic and if you search PubMed you get that immediately… Another part of that is dealing with uncertainty… And I agree that we have to embrace this, to show students a way through it… Maybe the lack of structure is where we want to be…

A2) Ironically the people with the least comfort with uncertainty and unstructured learning are faculty members – those open participants. They feel that they are missing things… They feel they should know it all, that they should absorb it all. This is where we are at. But I was at a digital experience conference where there were 100s of people, loads of parallel strands… There seems to be a need to see it all, do it all… We have to make a conscious effort at ALT Lab to just help people let it go… This may be the first time in history where we have to be fine that we can’t know it all, and we know that and are comfortable…

Q3) Do you explicitly ask students not to contribute to that overload?

A3) I’m not sure we’re mature enough in practice… I think we need to explain what we are doing and why, to help them develop that meta level of learning. I’m not sure how often that’s happening just now but that’s important.

Q4) You talked a lot about talking in the open web in social media. Given that the largest social networks are engaging in commercial activities, in political activities (e.g. Mark Zuckerberg in China), is that something students need to be aware of?

A4) Absolutely, that needs to be there, alongside understanding privacy, understanding attribution and copyright. We don’t use Facebook. We use WordPress for RamPages – have had no problems with that so far. But we haven’t had problems with Twitter either… It’s a good point that should go on the list…

Q5) Could you imagine connected courses for say Informatics or Mathematics…? What do they look like?

A5) Most of the math courses we have dealt with are applied mathematics. That’s probably as far as I could get without sitting with a subject expert – so give me 15 mins with you and I could tell you.

Q6) So, what is the role of faculty here in carefully selecting things for students which we think are high quality?

A6) The role is as it has ever been, to mark those things out as high quality…

Q6) There is a lot of stuff out there… Linking randomly won’t always find high quality content.

A6) Sure, this is not about linking randomly though, it’s about enabling students to identify content, so they understand high quality content, not just the list given, and that supports them in the future. Typically academic staff do curate content, but (depending on the programme), students also go out there to find quality materials, discussing reasons for choosing, helping them model and understand quality. It’s about intentionality… We are trying to get students to make those decisions intentionally.

Digital Education & Technology Enhanced Learning Panel Session, chaired by Victoria Dishon

Victoria: I am delighted to be able to chair this panel. We have some brilliant academic minds and I am very pleased to be able to introduce some of them to you.

Prof. Sian Bayne (@sbayne), Professor of Digital Education in the School of Education, and Assistant Principal, Digital Education

I have a slight identity crisis today! I am Sian Bayne and I’m Professor of Digital Education but I am also newly Assistant Principal, Digital Education. It’s an incredibly exciting area of work to take forward so I thought I’d talk a bit about digital education at Edinburgh and where we are now… We have reputation and leadership, 2600 PG online students, 67 programmes, 2m MOOC learners, and real strategic support in the University. It’s a good time to be here.

Sian Bayne speaking about her exciting new role, at eLearning@ed 2016

We also have a growing culture of teaching innovation in Schools and a strong understanding of the challenges of academic development for and with DE. Velda McCune, Depute Director of IAD, currently on research leave, talks about complex, multilateral and ever shifting conglomerations of learning.

I want to talk a bit about where things are going… Technology trends seem to be taking us in some particular directions… We have a range of future-gazing reports and updates, but I’m not sure we have a strong body of students, academics and support staff with a vision for what we want digital education to look like here. Two years ago we did have Ed2020 trying to look at this. The Stanford 2025 study is also really interesting, with four big ideas emerging around undergraduate education: the open loop university – why four years at a set age, why not six years across your lifetime; paced education – six years of personalised learning, embedded in the disciplines and putting HE in the world; the axis flip – putting skills before knowledge; and purpose learning – coming to uni with a mission, not a major… So it would be interesting to think about those ideas in this university.

UAL/LSE ran a digital online hack event, “Digital is not the future”, exploring the idea of hacking the institution from the inside and shifting to active work. There is also a great new MIT Future of Digital Education report. And if you have any ideas for processes or approaches to take things forward, please do email or tweet me…

Melissa Highton, Assistant Principal, Online Learning (@honeybhighton)

I am also having quite an identity crisis. Sian and I have inherited quite a broad range of activities from Jeff Haywood, and I have inherited many of the activities that he had as head of IS, particularly thinking about online learning in the institution: number of courses, number of learners, what success would look like, and targets – and where they came from – which get thrown about… Some are assumptions, some KPIs, some reach targets, some pure fantasy! So I’ll be looking at that, with the other Assistant Principals and the teams in ISG.

Melissa Highton talks about her forthcoming new role, at eLearning@ed 2016

What would success look like? Edinburgh should be THE place to work if you want to work on digital education; it should be innovative and fun, and our practice must be research informed, research linked, research connected. Every educator should be able to choose a range of tools to work with, and have support and understanding of risk around that… Edinburgh would be a place that excellent practitioners come to – and stay. Our online students would give us high satisfaction ratings. And our on-campus learners would see themselves continuing studies online – preferably with us, but maybe with others.

To do that there are a set of more procedural things that must be in place around efficiency, structures, processes and platforms, to allow you to do the teaching and learning activity that we need you to do to maintain our position as a leader in this area. We have to move away from dependence on central funding, and towards sustainable activity in departments and schools. I know it’s sexy to spin stuff up locally, and it’s got us far, but when we work at scale we need common solutions, taking ideas from one part of the institution to others. But hopefully creating a better environment for doing the innovative things you need to do.

Prof. David Reay (@keelincurve); Chair in Carbon Management & Education Assistant Principal, Global Environment & Society

Last year at eLearning@ed I talked about the Sustainability and Social Responsibility course, and today I’ll talk about that, another programme and some other exciting work we are doing all around Global Change and Technology Enhanced Learning.

So with the Online MSc in Carbon Management we have that fun criterion! We had an on-campus programme, and it went online with students across the world. We tried lots of things, tried lots of tools, and made all sorts of mistakes that we learned from. And it was great fun! One of my favourite students joined the first Google Hangout from a bunker in Syria, during the war, and when she had connectivity issues we had to find a tactic – posting content via USB to students with those issues.

David Reay speaks about the new Online “Sustainability & Social Responsibility” MSc at eLearning@ed 2016

So that online course in Sustainability and Social Responsibility is something we’ve put through the new CAIRO process that Fiona Hale is leading on. Doing that workshop was hugely useful for trying those ideas, making the mistakes early so we could address them in our design. And this will be live in the autumn – please do all take a look and take it.

And the final thing, which I’m very excited about, is an online “Disaster Risk Reduction” course, which we’ve always wanted to do. This is for post earthquake, post flooding, post fire type situations. We have enormous expertise in this area and we want to look at delivery format – maybe CPD for rescue workers, MOOCs for community, maybe Masters for city planners etc. So this is the next year, this is what I’ll speak about next year.

Prof. Chris Sangwin (@c_sangwin), Chair in Technology Enhanced Science Education, School of Mathematics

I’m new to Edinburgh, joined in July last year, and my interest is in automatic assessment, and specifically online assessment. Assessment is the cornerstone of education: it drives what people do, the action they undertake. I’ve been influenced by Kluger and DeNisi (1996), who found that “one third of feedback interventions decreased performance”. This study found that specific feedback on the task was effective; feedback that could be seen as a personal attack was not. Which makes sense, but we aren’t always honest about our failures.

Chris Sangwin talks about automated approaches to assessing mathematics, at eLearning@ed 2016

So, I’ve developed an automatic assessment system for mathematics – for some but not all things – which uses the computer algebra system (CAS) Maxima. It generates random structured questions, gives feedback, accommodates multiple approaches, and provides feedback on the parts of an answer which do not address the question. This is a pragmatic tool; there are bigger ideas around adaptive learning but those are huge to scope, to build, to plan out. The idea is that we face a cold hard truth – we need time, we need things marked all the time and reliably – and that contrasts with the much bigger vision of what we want for our students and our education.

You can try it yourself here: http://stack.maths.ed.ac.uk/demo/ and I am happy to add you as a question setter if you would like. We hope it will be in Learn soon too.
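STACK itself uses Maxima to compare a student's answer with the model answer symbolically; as a rough illustration of the underlying idea only (not STACK's actual mechanism), the check that two algebraically different-looking expressions denote the same function can be sketched with numeric sampling:

```python
import random

def equivalent(expr_a, expr_b, trials=20, seed=0):
    """Heuristically test whether two expressions in x agree, by
    evaluating both at random sample points. A CAS does this
    symbolically; numeric sampling is a simple stand-in."""
    rng = random.Random(seed)
    for _ in range(trials):
        x = rng.uniform(-10.0, 10.0)  # sample point for the variable x
        if abs(eval(expr_a) - eval(expr_b)) > 1e-9:
            return False  # found a point where the answers disagree
    return True

# Algebraically equal forms pass; a wrong constant is caught.
print(equivalent("2*(x + 1)", "2*x + 2"))
print(equivalent("2*(x + 1)", "2*x + 1"))
```

The point of accepting any equivalent form is exactly what Chris describes as "accommodating multiple approaches": the marker tests mathematical properties of the answer rather than its surface form.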

Prof. Judy Hardy (@judyhardy), Professor of Physics Education, School of Physics and Astronomy.

I want to follow up on my talk last year about the need for “awareness” knowledge, “how to” knowledge, and “principles” knowledge. Fewer than a quarter of people don’t modify approaches in their teaching – sometimes that is fine, sometimes it is not. So I want to talk about a few things we’ve done: one that worked, one that did not.

Judy Hardy talks about modifying teaching approaches, at eLearning@ed 2016

We have used PeerWise testing, and use of it correlates with exam performance, even when controlling for other factors. We understand from our evidence how to make it work. We have to move from formative (recommended) to summative (which drives behaviour). We have to drive students’ ownership of this work.

We have also used ACJ – Adaptive Comparative Judgement – to get students to understand what quality looks like, to understand it in comparison to others. They are not bad at doing that… It looks quite good at face value. But when we dug in we found students making judgements on surface features: neatness, length, presence of a diagram… We are not at all confident about their physics knowledge, and how they evidence that decision… For us the evidence wasn’t enough; it wasn’t aligned with what we were trying to do. There were very high administrative overheads – a detail that is easily overlooked. For a pilot it’s fine; to work every day, that’s an issue.

Implementing change, we have to align the change with the principles – which may also mean challenging underlying beliefs about teaching. It needs to be compatible with the local, often complex, classroom context, and it takes time, and time to embed.

Victoria: A lot of what we do here does involve taking risk so it’s great to hear that comparison of risks that have worked, and those that are less successful.

Dr Michael Seery, Reader, Chemistry Education. (@seerymk)

Like Chris I joined last July… My background is in chemistry education. One of the first projects I worked on took a third of chemistry undergraduate lab reports (about 1,200 reports) online, managing their correction by about 35 postgraduate demonstrators. Why? Because these reports – often inconsistent in format – can be hard to assess online, and I wanted clarity and consistency of feedback. The other reason to move online was to reduce administrative burden.

Michael Seery speaks about moving to online learning (image also shows the previous offline administrative tools), at eLearning@ed 2016

So Turnitin (Grademark) was what I started looking at. But it requires a start date, due date, and end date – and our students don’t have those. Instead we needed to retrofit it a bit. So, students submitted to an experimental Dropbox, demonstrators filtered submissions and corrected their lab reports, and marks and feedback were returned immediately to students… But we had problems… No deadline was possible, so we couldn’t track turnaround time or impose penalties; “live” correction was visible to students; and there was a risk of simultaneous marking. The section rubrics (bands of 20%) were too broad – that generated a great deal of feedback, as you can imagine. BUT demonstrators were being very diligent about feedback – which also confused students, as minor points were mixed with major points.

So going forward we are using groups: students will submit by week so that due dates and turnaround times are clearer, we will use Turnitin assessment by groups with a post date, and grading forms allow direct mark entry. But our challenge has been retrofitting technologies to the assessment and feedback issue, and that bigger issue needs discussion.

The format for this session was that each of our panel gave a 3-5 minute introductory presentation, before turning to discussion, both amongst the panel and with questions and comments from the audience.

Panel discussion/Q&A

Q1) Thank you for a really interesting range of really diverse presentations. My question is for Melissa, and it’s about continuity of connection… UG, online, maybe pre-arrival, returning as a lifelong learning… Can we keep our matriculation number email forever? We use it at the start but then it all gets complex on graduation… Why can’t we keep that as that consistent point of contact.

A1, Melissa) That sounds like a good idea.

Q2) We’ve had that discussion at Informatics, as students lose a lot of materials etc. by loss of that address. We think an @ed.ac.uk alias is probably the way, especially for those who carry on beyond undergraduate. It was always designed as a mapping tool. But also let them have their own space that they can move work into and out of. Think that should be University policy.

A2, Melissa) Sounds like a good idea too!

Q3) I was really pleased to hear assessment and feedback raised in a lot of these presentations. In my role as Vice Principal Assessment and Feedback I’m keen to understand how we can continue those conversations, how do we join these conversations up? What is the space here? We have teaching networks but what could we be missing?

A3, Michael) We all have agreed LOs but if you ask 10 different lab demonstrators they will have 10 different ideas of what that looks like. I think assessment on a grade, feedback, but also feed-forward is crucial here. Those structures seem like a sensible place.

A3, Judy) I think part of the problem is that teaching staff are so busy that it is really difficult to do the work needed. I think we should be moving more towards formative assessment; that is very much an ideal, far from where we are in practice, but it’s what I would like to see.

Q4) A lot of you talked about time, time being an issue… One of the issues that students raise all of the time is about timeliness of feedback… Do you think digital tools offer a way to do this?

A4, Judy) For me, the answer is probably no. Almost all student work is handwritten for us… What we’d like to do is sit with a student to talk to them, to understand what is going on in their heads, how their ideas are formed. But time with 300 students is against us. So digital tools don’t help me… Except maybe Chris’ online assessment for mathematics.

A4, Chris) The idea of implementing the system I showed is to free up staff time for that sort of richer feedback, by tackling the limited range of work we can mark automatically. That is a limited range though and it diminishes as the subject progresses.

A4, David) We implemented online submission as default and it really helped with timings, NSS, etc. that really helped us. For some assessment that is hard, but it has helped for some.

A4, Michael) Students do really value that direct feedback from academic staff… You can automate some chemistry marking, but we need that human interaction in there too, that’s important.

A4, Sian) I want to raise a humanities orientated way of raising the time issue… For me time isn’t just about the timeline for feedback, but also exploring different kinds of temporality that you can do online. For our MSc in Digital Education we have students blog and their tutors engage in a long form engaged rich way throughout the course, feedback and assessment is much richer than just grading.

Q5) In terms of incorporation of international students: they are here for one year only and that’s very short. Sometimes Chinese students meet a real clash of expectations around language proficiency – a communication gap between what assessment and feedback is, and what we practice. In terms of technology, is there a formative model of feedback for students less familiar with this different academic culture, rather than leaving them confused for a semester before they start to understand?

A5, David) It’s such an important point. For all of our students there is a real challenge of understanding what feedback actually is, what it is for. A lot of good feedback isn’t badged properly and doesn’t show up in NSS. I love the idea of less assessment, and of the timing being thought through, so that we don’t focus on summative assessment early on, before they know how to play the game… I agree really.

A5, Judy) One thing we don’t make much use of is exemplars. They can be very valuable. When I think about how we get expertise as markers, it is through trying to do it. Students don’t get that opportunity; you only see your own work. Exemplars can help there…

The panel listening to questions from the floor at eLearning@ed 2016

Q6) Maybe for the panel, maybe for Fiona… One thing to build in dialogue, and the importance of formative assessment… Are you seeing that in the course design workshops, use of CAIReO (blog post on this coming soon btw), whether you see a difference in the ways people assess….

A6, Fiona) We have queues of people wanting the workshop right now; they have challenges and issues to address and for some of them it’s assessment, for others it’s delivery or pace. But assessment is always part of that. It comes naturally out of storyboarding of learner activities. But we are not looking at development of content, we are talking about learning activity – that’s where it is different. Plenty to think about though…

Comment, Ross) Metaphor of a blank piece of paper is good. With learning technologies you can start out with that sense of not knowing what you want to achieve… I think exemplars help here too, sharing of ideas and examples. Days like today can be really helpful for seeing what others are doing, but then we go back to desks and have blank sheets of paper.

Q7) As more policies and initiatives appear in the institution, does it matter if we believe that learning is what the student does – rather than the teacher? My belief is that learning occurs in the mind of the learner… So terms like distance and digital learning can be a bit strange… Distance and digital teaching maybe make more sense…

A7) I think that replacing the terminology of “teaching” with the terminology of “learning” has been taking place. Biesta talks about the problems of the “learnification of education”: when we do that we instrumentalise education, which ignores power structures and issues in many ways. My colleagues and I wrote a Manifesto for Teaching Online and we had some flak about that terminology, but we thought that it was important.

Q8) Aspirationally there would be one-to-one dialogue with students… I agree that that is a good aspiration… And there is that possibility of continuity… But my question was about past, present, and future physical spaces… To what extent do they enable or challenge good learning or good teaching?

A8, Judy) We use technology in classrooms. First year classes are flipped – and the spaces aren’t very conducive to that. There are issues with that physical space. For group working there are great frustrations that can limit what we can do… In any case this is somewhat inevitable. In terms of online education, I probably have to hand over to colleagues…

A8, David) For our institution we have big plans and real estate pressures already. When we are designing teaching spaces, as we are at KB right now, there is a danger of locking ourselves into an estate that is not future proof. And in terms of impinging on innovation, in terms of changing demands of students, that’s a real risk for us… So I suppose my solution to that is that when we do large estate planning, that we as educators and experts in technology do that work, do that horizon scanning, like Sian talked about, and that that feeds into physical space as well as pedagogy.

A8, Sian) For me I want leakier spaces – bringing co-presences into being between on campus and online students. Whole area of digital pedagogical exploration we could be playing with.

A8, Melissa) There is a very good classroom design service within the Learning and Teaching Spaces team in IS. But there is a lag between the spaces we have today, and getting kit in place for current and future needs. It’s an ongoing discussion. Particularly for new-build spaces there is really interesting possibility around being thoughtful. I think we also have to think about shifting time and space… Lecture capture allows changes; maybe we need fewer big lecture rooms… Does the teaching define the space, or does the space define the teaching? Please do engage with the teams that are there to help.

A8, Michael) One thing that is a danger, is that we chase the next best thing… But those needs change. We need to think about the teaching experience, what is good enough, what is future-proof enough… And where the need is for flexibility.

Victoria: Thanks to all our panel!

eMarking Roll Out at Abertay – Carol Maxwell, Technology Enhanced Learning Support team Leader, Abertay University, chaired by Michael Seery

I am Carol Maxwell from Abertay University and I am based in the Technology Enhanced Learning support team. So, a wee bit about Abertay… We are a very small city centre university, with 4025 students on campus and 2091 in partner institutions. We are up 9 places to 86 in the Complete University Guide (2017), and our NSS score for feedback turnaround went up by 12%, which we think has a lot to do with our eMarking roll out.

We have had lots of change – a new Principal and new Vice Chancellor in summer 2012. We have many new appointments, a new director of teaching and learning enhancement, and we’ve moved towards central services rather than local admin. We get involved in the PGCert programme, and all new members of staff have to go through that process. We have monthly seminars where we get around 70 people coming along. We have lots of online resources, support for HEA accreditation and lots of things taking place, to give you a flavour of what our team does.

Carol Maxwell talks about the work of the Abertay Teaching and Learning Enhancement Team, at eLearning@ed 2016

So the ATLEF project was looking at supporting assessment and feedback practice with technology; this was when our team was part of Information Services. The project was intended to improve the University’s understanding and awareness of the potential benefits, challenges and barriers associated with a more systematic and strategic approach to technology-enhanced assessment and feedback, and to accelerate staff awareness of technological tools for assessment.

So we did a baseline report on practice – we didn’t have tools there, and instead had to interrogate Blackboard data course by course… We found only 50% of those courses using online assessment were using Grademark to do this. We saw some using audio files, some used feedback in Grade Centre, some did tracked changes in Word, and we also saw lots of use of feedback in comments on eportfolios.

We only had 2% online exams. Feedback on that was mixed, and some was to do with how the actual user experience worked – difficulties in scrolling through documents in Blackboard for instance. Some students were concerned that taking exams at home would be distracting. There was also a perception that online exams were for benefit of teaching staff, rather than students.

So we had an idea of what was needed, and we wanted to also review sector practices. We found Ferrell 2013, and also the Heads of eLearning Forum Electronic Management of Assessment Survey Report 2013 we saw that the most common practice was e-submission as well as hard copy printed by student… But we wanted to move away from paper. So, we were involved in the Jisc Electronic Marking and Assessment project and cycle… And we were part of a think tank where we discussed issues such as retention and archiving of coursework, and in particular the importance of it being a University wide approach.

So we adopted a new Abertay Assessment Strategy. So for instance we now have week 7 as a feedback week. It isn’t for teaching, it is not a reading week, it is specifically for assessment and feedback. The biggest change for our staff was the need for return of coursework and feedback in 10 working days before week 13, and within 15 weeks thereafter, That was a big change. We had been trialing things for year, so we were ready to just go for it. But we had some challenges, we have a literal grading policy, A+, A, B+ etc. which is harder in these tools.

We had senior management, registry, secretariat, teaching staff, teaching and learning staff discussing and agreeing the policy document. We had EMA champions demonstrating current process, we generated loads of supporting materials to. So one of our champions delivered video feedback – albeit with some student feedback to him that he was a little dry, he took it on the chin. One academic uses feedback on PebblePad, we have a lecturer who uses questions a great deal in mathematics courses, letting students attempt questions and then move on after completion only. We also have students based in France who were sharing reflections and video content, and feedback to it alongside their expected work. And we have Turnitin/Grademark, of which the personalised feedback is most valuable. Another champion has been using discussion forums, where students can develop their ideas, see each others work etc. We also hold lots of roadshow events, and feedback from these have raised the issue of needing two screens to actually manage marking in these spaces.

Carol Maxwell talks about the support for staff in rolling out eMarking at Abertay, at eLearning@ed 2016

The areas we had difficulty with were around: integration, with workarounds required for Turnitin with Blackboard Grade Centre and literal grading; staff resistance – with roadshows helping; moderation – we used 3 columns not 2 for marking; anonymity; and returning feedback to students, which raised some complexities. There has been some challenging work here but overall the response has been positive. Our new templates include all the help and support information too.

So, where to now… We will carry on refining procedures and support; we need ongoing training – especially for new staff; Blackboard SITS integration; more online exams (some online and some off site); digital literacy, etc. In conclusion, you need senior management support and a partnership approach with academic staff, students and support services to make a step change in practice.

Q&A

Q1) I’m looking at your array of initiatives, but seeing that we do these things in pockets. The striking thing is how you got the staff on board… I wonder if we have staff on board, but not sure we have students on board… So what did you do to get the students on board?

A1) There was a separate project on feedback with the students, raising student awareness of what feedback is. The student association were an important part of that. Feedback week is intended to make feedback to students very visible and help them understand its importance… And the students all seem to be able to find their feedback online.

Q2, Michael) You made this look quite seamless across spaces, how do you roll this out effectively?

A2) We’ve been working with staff a long time, so individual staff do lots of good things… The same with assessment and feedback… We already had people there doing great things… So, like the thinking module, there is a model with self-enroll wikis… You end up with examples all around. With the roll out of EMA the Principal was keen that we just do this stuff – we have already tested it. But Abertay is a small place; we have monthly meet ups with good attendance, as that’s pretty much needed for PGCAP. And it’s easier to spread an idea because we are quite small.

Q3) For that 10-15 day turnaround how do you measure it, and how do you handle exemptions?

A3) You can have exemptions but you have to start that process early, teams all know that they have to pitch in. But some academic staff have scaled assessment back to the appropriate required level.

At this point we broke for an extended break and poster session, some images of which are included below. 

Amy Burge and Laine Ruus show their posters during the eLearning@ed 2016 Poster Session


 

Participants explore posters including Simon Fokt's Diversity Reading List poster at eLearning@ed 2016


 

Ross Ward provides an informal LTW drop in session as part of the eLearning@ed 2016 Poster Session


Taking this forward – Nicola Osborne

Again, I was up and chairing so notes are more minimal from these sessions… 

The best of ILW 2016 – Silje Graffer (@SiljeGrr), ILW/IAD

ILW is in its fifth year… We had over 263 events throughout the week, and we reached over 2 million people via social media…

How did we get to this year? It has been amazing in the last few years… We wanted to see how we could reach the students and the staff in a better way that was more empowering for them. We went back to basics, we hired a service design company in Glasgow to engage people who had been involved in ILW before… In an event we called Open ILW… We wanted to put people first. We had 2 full time staff, 3 student staff, 20 school coordinators – to handle local arrangements – and created a kind of cool club of a network!

Silje Graffer talks about the Innovative Learning Week team, at eLearning@ed 2016


So we went back to the start… We wanted to provide clarity on the concept… We wanted to highlight innovation already taking place, that innovation doesn’t just happen once a year. And to retain that space to experiment.

We wanted to create a structure to support ideas. We turned feedback into a handbook for organisers. We had meet ups every month for organisers, around ideas, development, event design, sharing ideas, developing process… We also told more stories through social media and the website. We curated the programme around ideas in play. We wanted to focus on people making the events, who go through a valuable process, and have scope to apply that.

Silje Graffer talks about some of the highlight events from ILW16, at eLearning@ed 2016


So I just wanted to flag some work on openness: there was a Wikipedia editathon on the history of medicine; we had collaboration – looking at meaningful connections between different parts of the university, particularly looking at learners with autism, which was really valuable; and creativity… This wasn’t digital education in itself, but the Board Game Jam was about creating games, all openly licensed, and you can access and use those games in teaching, available as OER. A great example of getting hands dirty and how that translates into the digital. The iGEM Sandpit and Bio Hackathon are taking ideas forward to a worldwide event. The Smart Data Hack continued again, with more real challenges to meet. Prof Ewan Klein has taken work forward in the new Data, Design and Society course… And in celebratory mode, we had an online game called Edinburgh is Everywhere, exploring Edinburgh beyond the physical campus! And this was from a student. You can browse all the digital education events that ran on the website, and I can put you in touch with organisers.

Next year it’s happening again, redeveloped and reimagined.

Q1) Is it running again?

A1) Yes! But we will be using some of the redesigning approaches again.

 

CMALT – what’s coming up – Susan Greig (@SusieGreig),

Are you certified… I am based in LTW and I’m really pleased to announce new support for achieving CMALT within the University. And I can say that I am certified!

CMALT is the Certified Member of ALT scheme. It’s recommended for documenting and reflecting on your work, and a way to keep pace with technology; it is certified by peers, and you update your certification every three years. So, why did I do CMALT? Back when I put my portfolio forward in 2008 I actually wrote down my reasons – I hoped to plan for my future career more effectively; the career path isn’t well defined and I was keen to see where this would take me. And looking back I don’t think that career path has become any clearer… So it was still very useful to do.

Susan Greig talking about support for CMALT, at eLearning@ed 2016


So, to do CMALT you need to submit a portfolio. That is around five areas, operational issues; teaching, learning and/or assessment processes; the wider context; communication; and a specialist area. I did this as an individual submission, but there is also an option to do this together. And that is what we will be doing in Information Services. We will provide ongoing support and general cheer-leading, events which will be open to all, and regular short productive cohort meetings. There will also be regular writing retreats with IAD. So, my challenge to you is can we make the University of Edinburgh the organisation with the most accredited CMALT members in the UK?

If you are interested get in touch. The likely cohort start is August 2016… There will be more presentations from ALT on 3rd June, and a showcase event there in July.

Making Connections all year long: eLearning@ed Monthly meet ups – Ross Ward (@RossWoss), Educational Design

Today has been a lovely chance to get to meet and network with peers… Over the last year in LTW (Learning, Teaching and Web Services) we’ve looked at how we can raise awareness of the ways we can help people in different schools and colleges achieve what they are trying to do, and how we can support that… And as we’ve gone around we’ve tried to work with them to provide what is needed for their work; we’ve been running roadshows and workshops. Rather than focus on the technologies, we wanted to come from more of a learning and teaching perspective… around themes of interactive learning and teaching; assessment and feedback; open educational resources; shakers, makers and co-creators; and exploring spaces… From those conversations we’ve realised there is loads of amazing stuff going on… And we wanted to share it more widely…

Ross Ward talks about recent elearning@ed/LTW Monthly MeetUps, at eLearning@ed 2016


Luckily we have a great community already… And we have been working collaboratively between eLearning@ed and Learning, Teaching and Web Services, holding monthly meetings on one of the themes, sharing experiences and good practice… A way to strengthen networks – a group to share with in physical and digital shared spaces… The aim is that they are open to anyone – academics, learning technologists, support teams… Multiple short presentations, including what is available right now, but not ignoring horizon scanning. It’s a space for discussion – a long coffee break, and the pub afterwards. We have a 100% record of going to the pub… And we try to encourage discussion afterwards…

So far we’ve looked at Using media in teaching (January); Open Education – including our Wikimedian in residence (February); Things we have/do – well received catch up (March); Learning Design – excellent session from Fiona (April). We put as much as we can on the wiki – notes and materials – and you’ll find upcoming events there too. Which includes: Assessment and Feedback – which will be lively if the sessions here are anything to go by (27th June); CMALT (27th July); Maker Space (August) – do share your ideas and thoughts here.

In the future we are trying to listen to community needs, to use online spaces for some, to stream, to move things around, to raise awareness of the event. All ideas and needs welcomed… Interesting to use new channels… These tend to be on themes so case by case possibilities…

The final part of our day was our wrap up by Prof. Charlie Jeffrey, who came to us fresh from Glasgow where he’d been commenting on the Scottish Parliamentary election results for the BBC… 

Wrap Up – Professor Charlie Jeffrey, Senior Vice Principal.

I’m conscious of being a bit of an imposter here as I’m wrapping up a conference that I have not been able to attend most of. And also of being a bit of an obstacle between you and the end of the day… But I want to join together a few things that colleagues and I have been working on… The unambiguous priority of teaching and learning at Edinburgh, and the work that you do. So, what is the unambiguous priority about? It’s about sharpening the focus of teaching and learning in this university. My hope is that we reach a point in the future that we prize our excellent reputation for learning and teaching as highly as we do our excellent reputation in research. And I’ve been working with a platoon of assistant principals looking at how best to structure these things. One thing to come out of this is the Teaching Matters website which Amy (Burge) so wonderfully edits. And I hope that that is part of that collegiate approach. And Ross, I think if we had blogs and shorter contributions for the website coming out of those meetings, that would be great…

Charlie Jeffrey gives the wrap up at eLearning@ed 2016


I’m also conscious of talking of what we do now… And that what we do in the future will be different. And what we have to do is make sure we are fit for the future… Traditional teaching and learning is being transformed by technology… And I wouldn’t want us to be left behind. That’s a competitive advantage thing… But it is also a pedagogical issue, to do the best we can with the available tools and technologies. I’m confident that we can do that… We have such a strong track record of DEIs, MOOCs, and what Lesley Yellowlees calls the “TESEy chairs”, the Centre for Research in Digital Education, an ISG gripped in organisational priorities, and a strong community that helps us to be at the forefront of digital education. Over the last few weeks we’ve had three of the world’s best conferences in digital education, and that’s a brilliant place to be! And an awful lot of that is due to the animation and leadership of Jeff Haywood, who has now retired, and so we’ve asked Sian and Melissa to help ensure that we stay in that absolutely powerful leading position – no pressure whatsoever – but I am very confident that they will be well supported. It’s pretty rare within an organisation to get 90 people to make time to come together and share experience like you have today.

And with that the day was finished! A huge thank you again to all who were part of the event. If you were there – whether presenting or to participate in the poster session or just to listen, I would ask that you complete our feedback survey if you haven’t already. If you weren’t there but are interested in next year’s event or the eLearning@ed community in general, you’ll find lots of useful links below. Video of the event will also be online soon (via MediaHopper – I’ll add the link once it is all live) so anyone reading this should be able to re-watch sessions soon. 

Related Resources

More about eLearning@ed

If you are interested in learning more about the eLearning@ed Forum the best place to start is our wiki: http://elearningforum.ed.ac.uk/.

If you are based at Edinburgh University – whether staff or student – you can also sign up to the Forum’s mailing list where we share updates, news, events, etc.

You can also join us for our monthly meet ups, co-organised with the Learning, Teaching and Web Services team at Edinburgh University. More information on these and other forthcoming events can be found on our Events page. We are also happy to add others’ events to our calendar, and I send out a regular newsletter to the community which we are happy to publicise relevant events, reports, etc. to. If you have something you’d like to share with the eLearning@ed community do just get in touch.

You can also read about some of our previous and more recent eLearning@ed events here on my blog:

 

May 082015
 
Image of surgical student activity data presented by Paula Smith at the Learning Analytics Event

Today I am at the UK Learning Analytics Network organised by the University of Edinburgh in Association with Jisc. Read more about this on the Jisc Analytics blog. Update: you can also now read a lovely concise summary of the day by Niall Sclater, over on the Jisc Analytics blog.

As this is a live blog there may be spelling errors, typos etc. so corrections, comments, additions, etc. are welcome. 

Introduction – Paul Bailey

I’m Paul Bailey, Jisc lead on the Learning Analytics programme at the moment. I just want to say a little bit about the network. We have various bits of project activities, and the network was set up as a means for us to share and disseminate the work we have been doing, but also so that you can network and share your experience working in Learning Analytics.

Housekeeping – Wilma Alexander, University of Edinburgh & Niall Sclater, Jisc

Wilma: I am from the University of Edinburgh and I must say I am delighted to see so many people who have traveled to be here today! And I think for today we shouldn’t mention the election!

Niall: I’m afraid I will mention the election… I’ve heard that Nicola Sturgeon and Alex Salmond have demanded that Tunnock’s Teacakes and Caramel Wafers must be served at Westminster! [this gets a big laugh as we’ve all been enjoying caramel wafers with our coffee this morning!]

I’ll just quickly go through the programme for the day here. We have some really interesting speakers today, and we will also be announcing the suppliers in our learning analytics procurement process later on this afternoon. But we kick off first with Dragan.

Doing learning analytics in higher education: Critical issues for adoption and implementation – Professor Dragan Gašević, Chair in Learning Analytics and Informatics, University of Edinburgh

I wanted to start with a brief introduction on why we use learning analytics. The use of learning analytics has become something of a necessity because of the growing needs of education – the growth in the number of students and the diversity of students, with MOOCs being a big part of that realisation that many people want to learn who do not fit our standard idea of what a student is. The other aspect of MOOCs is their scale: as we grow the number of students it becomes difficult to track progress, and the feedback loops between students and instructors are lost or weakened.

In learning analytics we depend on two types of major information systems… Universities have had student information systems for a long time (originally paper, computerised 50-60 years ago), but they also use learning environments – the majority of universities have some online coverage of this kind for 80-90% of their programmes. But we also don’t want to exclude other platforms, including communications and social media tools. And no matter what we do with these technologies we leave a digital trace, and that is not a reversible process at this point.

So, we have all this data, but what is the point of learning analytics? It is about using machine learning, computer science, etc. approaches in order to inform education. We defined learning analytics as the “measurement, collection, analysis, and reporting” of data about learners, but actually that “how” matters less than the “why”. It should be about understanding and optimising learning and the environments in which learning occurs. And it is important not to forget that learning analytics are there, first and foremost, to understand learning.

Some case studies include Course Signals at Purdue. They use Blackboard for their learning management system. They wanted to predict which students would successfully complete courses, and to identify those at risk. They wanted to segment their students into high risk, at risk, or not at risk at all. Having done that, they used a traffic light system to reflect it, and that traffic light was shown to both staff and students. When they trialled this (Arnold and Pistilli 2012) with a cohort of students, they saw greater retention and success. But if we look back at how I framed this, we need to think about this in terms of whether it changes teaching…
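The traffic light idea above can be sketched very simply. This is an illustrative banding of a predicted success probability, not Purdue's actual model; the thresholds, function names and example data are all assumptions:

```python
# Illustrative sketch of a Course Signals-style traffic light:
# band each student's predicted probability of success into risk
# categories. Thresholds here are invented for the example.

def risk_band(p_success: float) -> str:
    """Map a predicted probability of success onto a traffic light."""
    if p_success < 0.4:
        return "red"      # high risk
    elif p_success < 0.7:
        return "amber"    # at risk
    return "green"        # not at risk

# Hypothetical predicted probabilities for three students
students = {"s1": 0.85, "s2": 0.55, "s3": 0.30}
bands = {sid: risk_band(p) for sid, p in students.items()}
# e.g. {"s1": "green", "s2": "amber", "s3": "red"}
```

The interesting design question, as the talk notes, is not the banding itself but what staff and students then do with the signal.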

So, also at Purdue, they undertook a project analysing the email content sent by instructors to students. They found that, rather than giving more detailed formative feedback, instructors had simply increased their summative feedback. So this really indicates that learning analytics has to feed into changes in teaching practices in our institutions, and we need our learning analytics to provide signalling and guidance that enables teaching staff to improve their practice and give more constructive feedback. (see Tanes, Arnold, King and Remnet 2011)

The University of Michigan looked at “gateway courses” as a way to understand performance in science courses (see Wright, McKay, Hershock, Miller and Triz 2014). They defined a measure for their courses: “better than expected”. There were two inputs to this: previous GPA, and goals set by students for the current course. They then used predictive models for how students could be successful, and ways to help students perform better than expected. They have also been using technology designed for behavioural change, which they put to use here… Based on that work they generated personalised messages to every student, with a rationale for each student, and also providing a predicted performance for each. For instance, an example here showed that a student could perform well beyond their own goals, which might have been influenced by the science course not being their major. The motivator for students here was productive feedback… They interviewed successful students from previous years, used that to identify behaviours etc. that led to success, and presented that as feedback from peers (rather than instructors). And I think this is a great way to show how we can move away from very quantitative measures, towards qualitative feedback.
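A minimal sketch of the "better than expected" idea described above, combining the two inputs mentioned (prior GPA and the student's own goal). The GPA-to-percentage mapping and all names are assumptions for illustration; the real Michigan models are far richer:

```python
# Illustrative "better than expected" flag: a student beats both a
# baseline implied by their prior GPA and their own stated goal.

def better_than_expected(current_grade: float,
                         prior_gpa: float,
                         goal_grade: float) -> bool:
    """current_grade and goal_grade are percentages; prior_gpa is on a
    4-point scale. The GPA-to-percentage conversion is a placeholder."""
    baseline = prior_gpa / 4.0 * 100  # naive mapping of GPA to %
    return current_grade > max(baseline, goal_grade)

# Prior GPA 3.0 implies a ~75% baseline; goal 70%; scoring 80% beats both
better_than_expected(80, 3.0, 70)  # True
```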

So, to what extent are institutions adopting these approaches? Well, there are very few institutions with institution-wide adoption. For instance, the University of Michigan only used this approach on first year science courses. They are quite a distributed university – like Edinburgh – which may be part of this perhaps. Purdue also only used this on some courses.

Siemens, Dawson and Lynch (2014) surveyed the use of learning analytics in the HE sector, asking about the level and type of adoption, ranking these from “Awareness” to “Experimentation” to “Organisation/Students/Faculty”, “Organisational Transformation” and “Sector Transformation”. Siemens et al found that the majority of HE is at the Awareness and Experimentation phases. Similarly, Goldstein and Katz (2005) found 70% of institutions at phase 1; it is higher now, but bear in mind that 70% at phase 1 doesn’t mean the others are much further along. There is still much to do.

So, what is necessary to move forward? What are the next steps? What do we need to embrace in this process? Well, let’s talk a bit about direction… The metaphors from business analytics can be useful, and we can borrow lessons from that process. McKinsey offered a really interesting business model: Data – Model – Transform (see Barton and Court 2012). That can be a really informative process for us in higher education.

Starting with Data: traditionally when we choose to measure something in HE we refer to surveys, particularly student satisfaction surveys. But these do not have a huge return rate in all countries and, more importantly, surveys are not that accurate. We also have progress statistics – these are in our learning systems as data, but are they useful? We can also derive social networks from these systems, from interactions and from course registration systems – and knowing who students hang out with can predict how they perform. We also find that we can get this data, but then how do we process and understand it? I know some institutions find a lack of IT support can be a significant barrier to the use of learning analytics.
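As a toy illustration of deriving a social network from course registration data, the following counts co-enrolments between pairs of students; the course names, student names and data structure are invented for the example:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical registration records: course -> enrolled students
registrations = {
    "maths101": ["ana", "ben", "cal"],
    "phys102":  ["ana", "cal"],
}

# Build a weighted tie list: (student, student) -> number of shared courses
ties = defaultdict(int)
for roster in registrations.values():
    for a, b in combinations(sorted(roster), 2):
        ties[(a, b)] += 1

# ties[("ana", "cal")] == 2: they share two courses, the strongest tie
```

Such tie weights could then feed a predictive model, per the talk's point that who students "hang out with" carries signal.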

Moving on to Model… Everyone talks about predictive modelling, but the question has to be about the value of a predictive model. Often organisations just see this as something to outsource – relying on some outside organisation and data model that provides solutions, but does not do so within the context of understanding what the questions are. And the questions are critical.

And this is, again, where we can find ourselves forgetting that learning analytics is about learning. So there are two things we have to know about, and think about, to ensure we understand what analytics mean:

(1) Instructional conditions – different courses in the same school, or even in the same programme, will have different sets of instructional conditions – different approaches, different technologies, different structures. We did some research at a university through their Moodle presence and we found some data that was common to 20-25% of courses, but we did identify some data you could capture that was totally useless (e.g. time online). And we found some measures that explained 80% of variance – for example extensive use of Turnitin, not just for plagiarism but also by students for gathering additional feedback. One of the courses defied all trends… they had a Moodle presence, but when we followed up on this we found that most of their work was actually in social media, so data from Moodle was quite misleading and certainly a partial picture. (see Gasevic, Dawson, Rogers, Gasevic, 2015)

(2) Learner agency – this changes all of the time. We undertook work on the agency of learners, based on log data from a particular course. We explored 6 clusters using cluster matching algorithms… We found that there was a big myth that more time on task would lead to better performance… One of our clusters spent a great deal of time online, another well below average. When we compared clusters we found the top students were the group spending the least time online, while the cluster spending a lot of time online performed only averagely. This shows that this is a complex question. Learning styles aren’t the issue; learning profiles are what matter here. In this course one profile works well; in another course a different profile might work much better. (see Kovanovic, Gasevic, Jok… 201?)
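The cluster analysis described here can be caricatured with a tiny k-means over (hours online, grade) pairs. This is purely illustrative – the actual studies use much richer trace features and proper cluster-matching methods – and the data points are invented:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """A minimal k-means over tuples of numbers; enough for a sketch."""
    rng = random.Random(seed)
    centres = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        # Assign each point to its nearest centre (squared distance)
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centres[c])))
            clusters[i].append(p)
        # Recompute centres as cluster means (keep old centre if empty)
        centres = [tuple(sum(xs) / len(xs) for xs in zip(*cl))
                   if cl else centres[i]
                   for i, cl in enumerate(clusters)]
    return centres, clusters

# (hours online, grade %): note the high performers spend *less* time,
# echoing the "time on task" myth discussed above
points = [(5, 85), (6, 88), (20, 70), (22, 68), (12, 75)]
centres, clusters = kmeans(points, k=2)
```

The point of the sketch is the interpretation step, not the algorithm: the cluster with the least time online can turn out to be the top-performing profile.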

And a conclusion for this section is that our analytics and analysis cannot be generalised.

Moving finally to Transform: we need to ensure participatory design of analytics tools – we have to engage our staff and students early in the process; we won’t get institutional transformation by relying on the needs of statisticians. Indeed, visualisations can be harmful (Corrin and de Barba 2014). The University of Melbourne looked at the use of dashboards and similar systems and reported that students who were high achieving, with high GPAs and high aspirations, actually under-performed when they saw that they were doing better than average, or better than their goals. And for those doing less well we can just reinforce issues with their self-efficacy. So these tools can be harmful if not designed in a critical way.

So, what are the realities of adoption? Where are the challenges? In Australia I am part of a study commissioned by the Australian Government, based in South Australia, engaging with the entire Australian tertiary sector. We interviewed every VC and manager responsible for learning analytics. Most institutions are in phase 1 or 2… yet their major goal was to enable personalised learning – one of the late phases… They seemed to think they would magically move from experimentation to personalised learning; they don’t seem to understand the process to get there…

We also saw some software-driven approaches: institutions buy an analytics programme and perceive the job as done.

We also see a study showing that there is a lack of a data-informed decision making culture, and/or data not being suitable for informing those types of decisions. (Macfadyen and Dawson 2012).

We also have an issue here that researchers are not focused on scalability… there is lots of experimentation, but… I may design beautiful scaffolding based on learning analytics, but I have to think about how that can be scaled up to people who may not be the original instructors, for instance.

The main thing I want to share here is that we must embrace the complexity of educational systems. Learning analytics can be very valuable for understanding learning but they are not a silver bullet. For institutional or sectoral transformation we need to embrace that complexity.

We have suggested the Rapid Outcome Mapping Approach (ROMA) (Macfadyen, Dawson, Pardo, Gasevic 2014), in which, once we have understood the objectives of learning analytics, we also have to understand the political landscape in which they sit and the financial contexts of our organisations. We have to identify stakeholders, and identify the desired behaviour changes we want from those stakeholders. We also have to develop an engagement strategy – we cannot require a single approach; a solution has to provide incentives for why someone should or should not adopt learning analytics. We have to analyse our internal capacity to effect change – especially in the context of analytics tools and taking any value from them. And finally we have to evaluate and monitor change. This is about capacity development, and capacity development across multiple teams.

We need to learn from successful examples – and we have some to draw upon. The Open University adopted their organisational strategy, and were inspired by the ROMA approach (see Tynan and Buckingham Shum 2013). They developed the model of adoption that is right for them – other institutions will want to develop their own, aligned to their institutional needs. We also need cross-institutional experience sharing and collaboration (e.g. SOLAR, the Society for Learning Analytics Research). This meeting today is part of that. And whilst there may be some competition between institutions, this process of sharing is extremely valuable. There are various projects here, some open source, to enable different types of solution, and sharing of experience.

Finally, we need to talk about ethical and privacy considerations. There is a tension here… Some institutions hold data and think students need to be aware of the data held… But what if students do not benefit from seeing that data? How do we prepare students to engage with that data, to understand it? The Open University is at the leading edge here and has a clear policy on the ethical use of student data. Jisc also has a code of practice for learning analytics, which I welcome and think will be very useful for institutions looking to adopt learning analytics.

I also think we need to develop an analytics culture. I like to use the analogy of, say, Moneyball, where analytics make a big difference… but analytics can be misleading. Predictive models have their flaws, their false positives, etc. So a contrasting example would be Trouble with the Curve, where analytics mask underlying knowledge of an issue. We should never reject our tacit knowledge as we adopt learning analytics.

Q&A

Q – Niall): I was struck by your comments about asking the questions… But doesn’t that jar with the idea that you want to look at the data and exploring questions out of that data?

A – Dragan): A great question… As a computer scientist I would love to just explore the data, but I hang out with too many educational researchers… You can start from data and make sense of that. It is valid. However, whenever you have certain results you have to ask certain questions – does this make sense in the context of what is taking place, does this make sense within the context of our institutional needs, and does this make sense in the context of the instructional approach? That questioning is essential no matter what the approach.

Q – Wilma) How do you accommodate the different teaching styles and varying ways that courses are delivered?

A – Dragan) The most important part here is the development of capabilities – at all levels and in all roles, including students. So in this Australian study we identified trends, found these clusters… But some of these courses are quite traditional and linear, others are more ambitious… They have a brilliant multi-faceted approach. Learning analytics would augment this… But when we aggregate this information… when you have more ambitious goals, there is more to do. Time is required to adopt learning analytics with sophistication. But we also need to develop tools suited to the needs and tasks of stakeholders… so stakeholders are capable of working with them… There aren’t that many data scientists, so perhaps we shouldn’t use visualisations at all – maybe just prompts triggered by the data… And we also want more qualitative insights into our students… their discussion… when they are taking notes… That then really gives an insight… Social interactions are so beneficial and important to student learning.

Q – Wilbert) You mentioned that work in Australia about Turnitin… What was the set up there that led to that… Or was it just the plagiarism prediction use?

A – Dragan) Turned out to be the feedback being received through Turnitin… Not plagiarism side. Primarily it was on the learner side, not so much the instructors. There is an ethical dilemma there if you do expose that to instructors… If they are using the system to get feedback… Those were year one students, and many were international and from Asia and China where cultural expectation of reproducing knowledge is different… So that is also important.

Q) Talking about the Purdue email study, and staff giving formative feedback to students at risk – how did that work?

A) They did analysis of these messages and their content, and found staff mainly giving motivational messages. I think that was mainly because the traffic light system indicated the at-risk nature but not why that was the case… you need that information too…

Q) Was interested in rhetoric of personalised learning by Vice Chancellors, but most institutions being at stage 1 or 2… What are the institutional blockers? How can they be removed?

A) I wish I had an answer there! But the senior leaders are sometimes forced to make decisions based on financial needs, not just about being driven by data or unaware of data. So in Australian institutions many are small organisations, with limited funding… and existence of the institutions is part of what they have to face, quite aside from adoption of learning analytics. But also University of Melbourne is a complex institution, a leading researcher there but cannot roll out same solution across very different schools and courses….

Niall: And with that we shall have to end the Q&A and hand over to Sheila, who will talk about some of those blockers…

Learning Analytics: Implementation Issues – Sheila MacNeill, Glasgow Caledonian University

I was based at CETIS, involved in learning analytics for a lot of that time… But for the last year and a half I have been based at Glasgow Caledonian University. Today I am going to talk about my experience of moving from that overview position to being in an institution and actually trying to do it… I’m looking for a bit of sympathy and support, but hoping also to contextualise some of what Dragan talked about.

Glasgow Caledonian University has about 17,000 students, mostly campus based although we are looking at online learning. We are also committed to blended learning. We provide central support for the university, working with learning technologies across the institution. So I will share my journey… joys and frustrations!

One of the first things I wanted to do was to get my head around what kind of systems we had around the University… We had a VLE (Blackboard) but I wanted to know what else people were using… This proved very difficult. I spoke to our IS department but finding the right people was challenging – a practical issue to work around. So I decided to look a bit more broadly with a mapping of what we do, looking across our whole technology position. I identified the areas and what fitted into them:

  • (e) Assessment and feedback – Turnitin (we see a lot of interest in rubrics and in marking and feedback processes, which seem to be having a big impact on student success – the more you use Turnitin, the less plagiarism detection is its main usefulness), Grade Centre, wikis/blogs/journals, peer/self assessment, (e)feedback.
  • (e) Portfolios – wikis/blogs/journals, video/audio – doing trials with nursing students of a mobile app in this space.
  • Collaboration – discussion boards, online chat, video conferencing etc.
  • Content – lectures, PDFs, etc….

I’ve been quite interested in Mark Stubbs’ idea of a “core VLE”. Our main systems group around the SRS (student records system – newly renamed from its former name, ISIS), GCU Learn, the Library, and third-party services. When I did hear from our IS team I found a huge range of tools that our institution has been using – it seems like every tool under the sun has been used at some point.

In terms of data… we can get data from our VLE, from Turnitin, from wikis etc. But it needs a lot of cleaning up. We started looking at our data, trying it on November data from 2012 and 2013 (November seemed like a typical month). We found some data we would expect – changes and increases in use over time. But we don’t have data at module level, or programme level, etc. – it is hard to view in detail or aggregate up (yet). We haven’t got data from all of our systems yet. I would say we are still at the “housekeeping” stage… We are just seeing what we have, finding a baseline… There is an awful lot of housekeeping that needs to be done, a lot of people to talk to…
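As a sketch of the kind of baseline “housekeeping” aggregation described here – comparing a typical month across years – the following assumes a simple event log of (timestamp, system) rows; the schema and figures are invented for illustration, not GCU’s actual data:

```python
from collections import Counter
from datetime import datetime

# Hypothetical event-log rows: (timestamp, system) — a stand-in for
# exports from the VLE, Turnitin, wikis etc.; not a real export schema.
events = [
    ("2012-11-03T09:15:00", "vle"),
    ("2012-11-10T14:02:00", "turnitin"),
    ("2013-11-05T11:30:00", "vle"),
    ("2013-11-06T16:45:00", "vle"),
]

def monthly_counts(rows):
    """Count events per (year-month, system) for like-for-like month comparison."""
    counts = Counter()
    for ts, system in rows:
        month = datetime.fromisoformat(ts).strftime("%Y-%m")
        counts[(month, system)] += 1
    return counts

counts = monthly_counts(events)
# November-on-November comparison for the VLE
print(counts[("2012-11", "vle")], "->", counts[("2013-11", "vle")])  # 1 -> 2
```

Aggregating to module or programme level would then be a matter of adding those fields to each row, once the source systems can actually export them.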

But as I was beginning this process I realised we had quite a number of business analysts at GCU who were happy to talk. We have been drawing out data. We can make dashboards easily, but USEFUL dashboards are proving more tricky! We have meanwhile been talking to Blackboard about their data analytics platform. It is interesting thinking about that… given the state we are in about learning analytics, and finding a baseline, we are looking at investing some money to see what data we can get from Blackboard that might enable us to start asking some questions. There are some things I’d like to see from, for example, combining on-campus library card data with VLE data. And also thinking about engagement and what that means… Frustratingly, I think it is quite hard to get data out of Blackboard… I’m keen that the next licence we sign actually has a clause about the data we want, in the format we want, when we want it… No idea if that will happen, but I’d like to see it.

Mark Stubbs (MMU) has this idea of a tube map of learning… This made me think of the Glasgow underground map – going in circles a bit, not all joining up. We really aren’t quite there yet; we are having conversations about what we could, and what we should, do. In terms of senior management interest in learning analytics… there is interest. But when we sent out the data we had looked at, we did get some interesting responses. Our data showed a huge increase in mobile use – we didn’t need a bring-your-own-device policy, students were already doing it! We just need everything mobile-ready. Our senior staff are focused on the NSS and student survey data; that’s a major focus. I would like to take that forward to understand what is happening, in a more structured way…

And I want to finish by talking about some of the issues that I have encountered. I came in fairly naively to this process. I have learned that…

Leadership and understanding are crucial – we have a new IS director, which should make a great difference. You need both carrot and stick, and it takes real drive from the top to make things actually start.

Data is obviously important. Our own colleagues have issues accessing data from across the institution. People don’t want to share, or don’t know if they are allowed to share. There is a cultural issue that needs investigating – and that relates back to leadership. There are also challenges that are easy to fix, such as server space. But the bigger issues of access, sharing and ownership all really matter.

Practice can be a challenge. Sharing of experience and engagement with staff, having enough people understanding systems, is all important for enabling learning analytics here. The culture of talking together more, having a better relationship within an institution, matters.

Specialist staff time matters – as Dragan highlighted in his talk. This work has to be prioritised – a project focusing on learning analytics would give the remit for that, give us a clear picture, and establish what needs to be done: not just buying in technology, but truly assessing needs before doing so. There is potential, but learning analytics has to be a priority if it is to be adopted properly.

Institutional amnesia – people can forget what they have done, why, and why they did not do it before… more basic housekeeping again, really. Understanding, and having tangible evidence of, what has been done and why is also important more broadly when looking at how we use technologies in our institutions.

Niall: Thanks for such an honest appraisal of a real experience there. We need that in this community, not just explaining the benefits of learning analytics. The Open University may be ahead now, but it also faced some of those challenges initially for instance. Now, over to Wilma.

Student data and Analytics work at the University of Edinburgh – Wilma Alexander, University of Edinburgh

Some really interesting talks already today – I’ll whiz through some sections, in fact, as I don’t need to retread that ground. I am based in Information Services. We are a very, very large, very old university, and a very general one. We have a four-year degree. All of that background makes what we do with student data hard to generalise about.

So, the drivers for the project I will focus on came out of the understanding we already have about the scale and diversity of this institution. Indeed we are increasingly encouraging students to make imaginative crossovers between schools and programmes, which adds to this. Another part of the background is that we have been seriously working in online education: in addition to a ground-breaking digital education masters delivered online, we have a number of other online masters. And further background: we have a long-term set of processes that encourage students to contribute to discussions within the university, as owners and shapers of their own learning.

So, we have an interest in learning analytics, and understanding what students are doing online. We got all excited by the data and probably made the primary error of thinking about how we could visualise that data in pretty pictures… but we calmed down quite quickly. As we turned this into a proper project we framed it much more in the context of empowering students around their activities, using data we already have about our students. We have two centrally supported VLEs at Edinburgh (and others!). Blackboard Learn is our main and largest system – virtually all on-campus programmes use it in some way – but for online distance programmes we took the opportunity to try out Moodle, largely for programmes created as online distance masters. So, already there is a big distance between how these tools are used in the university, never mind how they are adopted.

There’s a video which shows the idea of building an airplane whilst in the air… this project’s first phase, in 2014, has felt a bit like that at times! We wanted to see what might be possible, but we started by thinking about what might be displayed to students. Both Learn and Moodle give you some data about what students do in your courses, but that is for staff – not visible to students. When we came to look at the existing options, none of what Learn offers quite did what we wanted, as none of the reports were easily made student-facing (currently Learn does BIRT reports, course reports, stats columns in Grade Center, etc.). We also looked at Moodle and there was more there – it is open source and developed by the community – so we looked at the available options…

We were also aware that there were things taking place in Edinburgh elsewhere. We are support not research in our role, but we were aware that colleagues were undertaking research. So, for instance my colleague Paula Smith was using a tool to return data as visualisations to students.

What we did as a starting point was to go out and collect user stories. We were asking both staff and students, in terms of information available in the VLE(s), what sort of things would be of interest. We framed this as “As a… I want to… So that I can…”, asking as a student, as a member of staff, as a tutor. We had 92 stories from 18 staff and 32 students. What was interesting here was that much of what was wanted was already available. For much of the data staff wanted, they really just had to be shown and supported to find what was already available to them. Some of the stuff that came in as “not in scope” was outside the very tight boundaries we had set for the project. But a number of requests for information were of interest, and we passed them on to appropriate colleagues – one area for this was reading lists, and we have a tool that helps with that, so we passed that request on to library colleagues.
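The “As a… I want to… So that I can…” template lends itself to simple machine processing when triaging dozens of stories. A minimal sketch – the stories and field names below are invented for illustration:

```python
import re

# Hypothetical stories in the template the project used.
stories = [
    "As a student, I want to see my grades over time, so that I can track my progress.",
    "As a tutor, I want to see who has not logged in, so that I can contact them early.",
]

STORY_RE = re.compile(
    r"As an? (?P<role>[^,]+), I want to (?P<goal>[^,]+), so that I can (?P<benefit>.+)\.",
    re.IGNORECASE,
)

def parse_story(text):
    """Split one story into role / goal / benefit for sorting and triage."""
    m = STORY_RE.match(text)
    return m.groupdict() if m else None

parsed = [parse_story(s) for s in stories]
print(parsed[0]["role"])  # student
```

Grouping the parsed stories by role would then make it easy to see, for example, which requests are really already met by existing reports.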

We also pooled some staff concerns… and this illustrates what both Dragan and Sheila have said about the need to improve the literacy of staff and students using this kind of information, and the need to contextualise it. E.g.: “As a teacher/personal tutor I want to have measures of activity of the students so that I can be alerted to who are ‘falling by the wayside’” – there is a huge gap between raw activity and that sort of indicator.

Student concerns were very thoughtful. They wanted to understand how they compare, to track progress, they also wanted information on timetables of submissions, assignment criteria/weighting etc. We were very impressed by the responses we had and these are proving valuable beyond the scope of this project…

So, we explored possibilities, and then moved on to see what we could build. And this is where the difference between Learn and Moodle really kicked in. We initially thought we could just install some of the Moodle plugins and allow programmes to activate them if they wanted to… but that fell at the first hurdle, as we couldn’t find enough staff willing to be that experimental with a busy online MSc programme. The only team up for that experimentation was the MSc in Digital Education team, where it was done as part of a teaching module in some strands of the masters. This was small-scale, hand-cranked from some of these tools. One of the issues with pretty much all of these tools is that they are staff-facing and therefore not anonymous. So we had to do that hand-cranking to make the data anonymous.

We had lots of anecdotal and qualitative information through focus groups and this module, but we hope to pin a bit more down on that. Moodle is of interest because these are online distance students… there is some evidence that communication and discussion activity is a reasonable proxy for performance here, as they have to start with the VLE.

Learn is a pretty different beast, as it is on campus – blended learning may not have permeated as strongly there. For Learn we do have a little element that produces a click map of sorts (engagement, discussion, etc.)… For courses that only use the VLE for lecture notes that may not be useful at all, but for others it should give some idea of what is taking place. We also looked at providing gradebook data – mapping use of different weeks’ sections/resources/quizzes to performance.

We punted those ideas out. The activity information didn’t excite folk as much (32% thought it was useful). The grade information was deemed much more useful (97% thought it was useful)… But do we want our students hooked on that sort of data? Could it have negative effects, as Dragan talked about? And how useful is that overview?

When it came to changes in learning behaviour we had some really interesting and thoughtful responses here. Of the three types of information (discussion boards, grade, activity) it was certainly clear though that grade was where the student interest was.

We have been looking at what courses use in terms of tools. Taking a very broad-brush view of 2013/14 courses, we can see what they use and turn on. For social/peer-network ability – where we think there really is huge value – the percentage of courses actively using those tools on campus is way below the percentage using the VLE for the other functions of content + submission/assessment and discussion boards.

So context really is all – reflecting Dragan again here. It has to work for individuals at course level. We have been mapping our territory here – the university as a whole is hugely engaged in online and digital education in general, and very committed to this area, but there is work to do to join it all up. When we gathered information we found people coming out of the woodwork to show their interest. The steering group for this project has a representative from our student systems team, and we are talking about where student data lives, privacy and data protection, ethics, and of course technical issues quite apart from all that… So we also have the Records Management people involved. And because Jisc has these initiatives, and there is an EU initiative, we are engaging closely with the ethical guidance being produced by both.

So, we have veered slightly away from doing something for everyone in the VLEs in the next year. The tool will be available to all, but what we hope to do is work very closely with a small number of courses, course organisers and students, to really unpick at course level how the data in the VLE gets built into the rest of the course activity. That goes back to the idea of having different models, and applying the right model for that course, and for those students. It has been a journey, and it will continue…

Using learning analytics to identify ‘at-risk’ students within eight weeks of starting university: problems and opportunities – Avril Dewar, University of Edinburgh

This work I will be presenting has been undertaken with my colleagues at the Centre for Medical Education, as well as colleagues in the School of Veterinary Medicine and also Maths.

There is good evidence that performance in first year maps quite closely to performance across a programme as a whole. So, with that in mind, we wanted to develop an early warning system to identify student difficulties and disengagement before they reach assessment. Generally, the model we developed worked well: about 80% of at-risk students were identified, and there were large differences between the most and least at-risk students – between the lowest and highest risk scores – which suggests this was a useful measure.

The measures we used included:

  • Engagement with routine tasks
  • Completion of formative assessment – including voluntary formative assessment
  • Tutorial attendance (and punctuality where available) – but this proved least useful.
  • Attendance at voluntary events/activities
  • Virtual Learning Environment (VLE) exports (some)
    • Time until first contact proved to be the most useful of these
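A cumulative risk model of the kind described might combine these measures as weighted flags summed into a score. This is a hypothetical sketch – the weights, flag names and threshold below are invented for illustration, not the Centre’s actual model:

```python
# Invented weights reflecting the relative usefulness reported in the talk:
# tutorial attendance proved least useful, time to first VLE contact most useful.
WEIGHTS = {
    "missed_routine_tasks": 3,
    "skipped_formative_assessment": 3,
    "low_tutorial_attendance": 1,
    "missed_voluntary_events": 2,
    "late_first_vle_contact": 4,
}

def risk_score(flags):
    """Sum the weights of the flags raised for one student."""
    return sum(WEIGHTS[f] for f in flags)

def at_risk(flags, threshold=6):
    """Flag a student whose cumulative score reaches the (invented) threshold."""
    return risk_score(flags) >= threshold

print(at_risk({"late_first_vle_contact", "missed_routine_tasks"}))  # True
```

Because the model is cumulative, no single flag explains the outcome – which is one reason the team later hesitated to tell students exactly what had flagged them.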

We found that the measures sometimes failed because the data exports were not always in a usable format (e.g. VLE tables of 5,000 columns). Patterns of usage were hard to investigate, as raw data on, e.g., time of day of accesses was not properly usable, though we think it would be useful. Similarly, there is no way to know whether a long usage time means a student has logged in, then Googled something or left their machine, then returned – or whether it indicates genuine engagement.
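The “logged in then left the machine” ambiguity is why raw durations need care. One common workaround – assumed here for illustration, not necessarily what the team did – is to sum only the gaps between clicks that fall under an idle cutoff:

```python
from datetime import datetime, timedelta

# Hypothetical click timestamps for one student (ISO 8601 assumed).
clicks = ["2015-03-02T10:00:00", "2015-03-02T10:05:00", "2015-03-02T12:30:00"]

def active_time(timestamps, idle_cutoff=timedelta(minutes=30)):
    """Sum gaps between consecutive clicks, discarding gaps longer than the
    cutoff — a long gap may just mean the student left the machine."""
    times = sorted(datetime.fromisoformat(t) for t in timestamps)
    total = timedelta()
    for prev, cur in zip(times, times[1:]):
        gap = cur - prev
        if gap <= idle_cutoff:
            total += gap
    return total

print(active_time(clicks))  # 0:05:00 — the 2h25m gap is discarded
```

The cutoff is a judgement call, of course: set it too low and genuine reading time is discarded, too high and idle time inflates the estimate.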

To make learning analytics useful we think we need the measures, and the data supporting them, to be:

  • simple;
  • comprehensible;
  • accessible – and comparable to data from other systems (e.g. we could have used library data alongside our VLE data);
  • easy to scale – e.g. common characteristics between schools;
  • not replicating existing measures;
  • able to discriminate between students – some of the most useful things, like time to first contact, did this;
  • centrally stored.

We also found there were things that we could access but didn’t use – some for ethical and some for practical reasons. Using IP addresses for location was an ethical issue for us; we had similar concerns about discussion boards – we didn’t want students to be put off participating in discussions – and about time taken to answer individual questions. Issues that could be raised include: evidence that a student has been searching essay-buying websites; a student absent from class who claims to be ill but whose IP address shows another location; etc.

There were also some concerns about the teacher-student relationship: knowing too much can create a tension there. And the data one could gather about a student could become a very detailed tracking and monitoring system… For that reason we always aim to be conservative, rather than exhaustive, in our data acquisition.

We have developed training materials and are making these open source so that we can partner with other schools, internationally. Each school will have its own systems and data, but we are keen to share practice and approaches. Please do get in touch if you would like access to the data, or would like to work with us.

Q&A

Q – Paula) Do you think there is a risk of institutions sleepwalking into student dissatisfaction? We are taking a staged approach… but I see less effort going into intervention, into the staff side of what could be done… I take it that email was automated… Scalability is good for that, but I am concerned students won’t respond to something that isn’t really personalised at all. And how were students in your project, Avril, notified?

A – Avril) We did introduce peer-led workshops… We are not sure if that worked yet; we are still waiting for results. We emailed to ask our students if they wanted to be part of this and whether they wanted to be notified of a problem. Later years were less concerned and saw the value; first-year students were very concerned, so we phrased our email very carefully. When a student was at risk, emails were sent individually by their personal tutors. We were a bit wary of telling students what had flagged them up – it was a cumulative model, and we were concerned that they might then engage just with those things and then not be picked up by the model.

Niall: Thank you for that fascinating talk. Have you written it up anywhere yet?

Avril: Soon!

Niall: And now to Wilbert…

The feedback hub; where qualitative learning support meets learning analytics – Wilbert Kraan, Cetis

Before I start I have heard about some students gaming some of the simpler dashboards so I was really interested in that.

So, I will be short and snappy here. The Feedback Hub work has just started… these are musings and questions at this stage. This work is part of the larger Jisc Electronic Management of Assessment (EMA) project. And we are looking at how we might present feedback and learning analytics side by side.

The EMA project is a partnership between Jisc, UCISA and HeLF. It builds on earlier Jisc Assessment and Feedback work, and it is a co-design project that identifies priorities and solution areas… we are now working on solutions. One part of this is about EMA requirements and workflows, particularly the integration of data (something Sheila touched upon). There is also work taking place on an EMA toolkit that people can pick up and look at. And then there is the Feedback Hub, which I’m working on.

So, there is a whole assessment and feedback lifecycle (borrowed, with their permission, from a model developed by Manchester Metropolitan). This goes from Specifying to Setting, Supporting, Submitting, Marking and production of feedback, Recording of grades, etc… and those latter stages are where the Feedback Hub sits.

So, what is a feedback hub, really? It is a system that provides a degree-programme-wide, or life-wide, view of assignments and feedback. The idea is that it moves beyond the current module to look across modules and across years. There will be feedback that is common across areas, giving a holistic view of what has already been done. So this is a new kind of thing… Looking at the nearest existing tools, I found VLE features – a database view of all assignments for a particular student, for learner and tutor to see; a simple clickable list that is easy to do and does help. Another type is the tutoring or assignment management system – capturing timetables of assignments, tutorials etc. These are from the tutor perspective; some show feedback as well. And then we have assignment services – including Turnitin – about plagiarism, but also about managing the logistics of assignments, feedback etc.

So, using those kinds of tools you can see feedback as just another thing that gets put into the learning records store pot, in some ways. But feedback can be quite messy – it is hard to disentangle in-line feedback from the document itself, and teachers approach feedback differently… though pedagogically the qualitative formative feedback that appears in these messy ways can be hugely valuable. Online assessment management tools can also be helpful for mapping and developing learning outcomes and rubrics – connecting those to the assignment, you can gain some really interesting data… There is also the potential for computer-aided assessment feedback – sophisticated automated data on tests and assignments, which works well in some subjects. And possibly the most interesting learning analytics data is on engagement with feedback. A concern from academic staff is that you can give rich feedback, but if the students don’t use it, how useful is it really? So capturing that could be useful…

So, having identified those sources, how do we present such a holistic view? One tool presents this as an activity stream – like Twitter and Facebook – with feedback part of a chronological list of assignments… We know that could help. Also an expanding learning-outcomes rubric – click it to see the feedback connected to it; would that be helpful? We could also do text extraction, something like Wordle, but would that help? Another thing we might see is clickable grades – to understand what a grade means… And finally, should we combine the feedback hub with analytics data visualisations?
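A Wordle-style text extraction over pooled feedback could be sketched as a simple term count. The feedback snippets and stopword list below are invented for illustration:

```python
import re
from collections import Counter

# Hypothetical snippets of tutor feedback pulled from several assignments.
feedback = [
    "Good structure, but the argument needs more evidence.",
    "Strong evidence here; the structure could be tighter.",
]

STOPWORDS = {"the", "but", "more", "could", "be", "needs", "here"}

def top_terms(texts, n=3):
    """Count non-stopword terms across all feedback for a word-cloud view."""
    words = Counter()
    for text in texts:
        for w in re.findall(r"[a-z]+", text.lower()):
            if w not in STOPWORDS:
                words[w] += 1
    return words.most_common(n)

print(top_terms(feedback))  # "structure" and "evidence" dominate
```

Whether a recurring term like “structure” across modules is actionable for a student is exactly the open question the talk raises about such visualisations.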

Both learning analytics and feedback track learning progress over time, and try to predict the future. Feedback related data can be a useful learning analytics data source.

Q&A

Q – Me) Adoption and issues of different courses doing different things? Student expectations and added feedback?

A) This is an emerging area… IET in London/University of London have been trialing this stuff… they have opened that box… Academic practice can make people very cautious…

Comment) It might also address the perennial student request for better-quality feedback… might address a deficit in student satisfaction.

A) Having a coordinated approach to feedback… From a pedagogical point of view that would help. But another issue there is that of formative feedback, people use these tools in formative ways as well. There are points of feedback before a submission that could be very valuable, but the workload is quite spectacular as well. So balancing that could be quite an interesting thing.

Jisc work on Analytics – update on progress to date – Paul Bailey, Jisc, and Niall Sclater

Paul: We are going to give you a bit of an update on where we are on the Learning Analytics project, and then after that we’ll have some short talks and then will break out into smaller groups to digest what we’ve talked about today.

The priorities we have for this project are: (1) basic learning analytics solution, an interventions tool and a student tool; (2) code of practice for learning analytics; and (3) learning analytics support and network.

We are a two-year project, with the clock ticking from May 2015. We have started by identifying suppliers to initiate contracts and develop products; then institutions will be invited to participate in the discovery stage or pilots (June–Sept 2015). In Year 1 (Sept 2015–2016) we will run that discovery stage (10-20 institutions) and pilots (10+ institutions), with institutions moving from discovery to pilot. Year 2 will be about learning from and embedding that work. And for those of you who have worked with us in the past, the model is a bit different: rather than funding you and then learning from that, we will be providing you with support and some consultancy, and learning from this as you go (rather than funding).

Michael Webb: So… we have a diagram of the process here… We have procured a learning records warehouse (the preferred supplier there is HT2). The idea is that VLEs, student information systems and library systems feed into that. There was talk today of Blackboard being hard to get data out of; we do have Blackboard on board.

Diagram of the Jisc Basic Learning Analytics Solution presented by Paul Bailey and Michael Webb


Paul: Tribal is one of the solutions – pretty much off-the-shelf stuff, with various components – and we hope to roll it out to about 15 institutions in the first year. The second option will be the open solution, which is partly developed but needs further work. So the option will be to engage with either one of those solutions, or perhaps with both.

The learning analytics processors will feed the staff dashboards, into a student consent service, and both of those will connect to the alert and intervention system. And there will be a Student App as well.

Michael: The idea is that all of the components are independent so you can buy one, or all of them, or the relevant parts of the service for you.

Paul: The student consent service is something we will develop to provide a service that allows students to say what kinds of information can or cannot be shared (from the available data in those systems that hold data on them). The alert and intervention system is an area that should grow quite a bit…

So, the main components are the learning records warehouse, the learning analytics processor – for procurement purposes the staff dashboard is part of that – and the student app. And once the learning records warehouse is there, you could build onto it, use your own system, use Tableau, etc.
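The warehouse in this design holds activity records in a common shape so that processors, dashboards and apps can all read the same data. Assuming an xAPI-style actor/verb/object record – the names, IDs and URLs below are invented for illustration – a single statement might look like:

```python
import json

# A minimal actor-verb-object statement of the kind a VLE might push into
# a learning records warehouse. The xAPI shape is standard; the student,
# course and quiz identifiers here are made up.
statement = {
    "actor": {"mbox": "mailto:student@example.ac.uk", "name": "A Student"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
             "display": {"en-GB": "completed"}},
    "object": {"id": "https://vle.example.ac.uk/course/101/quiz/7",
               "definition": {"name": {"en-GB": "Week 7 quiz"}}},
    "timestamp": "2015-05-12T09:30:00Z",
}

# Any consumer — processor, staff dashboard, student app — reads the same record.
print(json.dumps(statement, indent=2)[:60])
```

That shared shape is what makes the components independent: you can swap the dashboard or the processor without touching the data feeding in.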

Just to talk about the discovery phase: we hope to start that quite soon. The invitation will come out through the Jisc Analytics email list – so if you want to be involved, join that list. We are also setting up a questionnaire to collect readiness information and for institutions to express interest. Then in the discovery process (June/July onward) institutions will select a preferred approach for the discovery phase. This will be open to around 20 institutions. We have three organisations involved here: Blackboard; a company called DTP Solution Path (as used by Nottingham Trent); and UniCom. For the pilot (September-ish onward) institutions will select a solution preference (in Year 1, 15 proprietary – Tribal – and 15 open).

Niall: The code of practice is now a document of just over two pages around complex legal and ethical issues. These can be blockages, so this is an attempt to have an overview document to help institutions overcome them. We have a number of institutions who will be trialling this. It is at draft stage right now, with an advisory group suggesting revisions. It is likely to be launched by Jisc in June. Any additional issues are being reflected in a related set of online guidance documents.

The Effective Learning Analytics project can be found at: http://www.jisc.ac.uk/rd/projects/

There is another network event on 24th June at Nottingham Trent University. At that meeting we are looking to fund some small research-type projects – there is an IdeaScale page for that, with about five ideas in the mix at the moment. Do add ideas (between now and Christmas) and do vote on them. There will be pitches there for the ones to take forward. And if you want funding to go to you as a sole trader rather than to a large institution, that can also happen.

Q&A

Q) Will the open solution be shared on something like GitHub so that people can join in?

A) Yes.

Comment – Michael: Earlier today people talked about data that is already available; that’s in the discovery phase, when people will be on site for a day or up to a week in some cases. Also, earlier on there was talk about data tracking, IP addresses etc. – the student consent system we have included is to get student buy-in for that process, so that you are legally covered for what you do as well. And there is a lot of focus on flagging issues, and intervention; the intervention tool is a really important part of this process, as you’ll have seen from our diagram.

For more information on the project see: http://analytics.jiscinvolve.org/wp/

Open Forum – input from participants, 15 min lightning talks.

Assessment and Learning Analytics – Prof Blazenka Divjak, University of Zagreb (currently visiting University of Edinburgh)

I have a background in working with a student body of 80,000 students, and in the use of learning analytics. The main challenge I have found has been the management and cleansing of data: if you want to make decisions, learning analytics data is not always suitable, or in an appropriate state, for that sort of use.

But I wanted to talk today about assessment. What underpins effective teaching? It relates to the subject, the teaching methods, the ways in which students develop and learn (Calderhead, 1996), and awareness of the relationship between teaching and learning. Assessment is part of understanding that.

So I will talk to two case studies across courses using the same blended approach with open source tools (Moodle and connected tools).

One of these examples is Discrete Math with Graph Theory, a course on the Master of Informatics programme with around 120 students and 3 teachers. It uses authentic problem posing and problem solving. We have assessment criteria and weighted rubrics (AHP method). Here learning analytics are used to identify performance against criteria. We also look at differences between groups (gender, previous study, etc.), and at the correlation of authentic problem solving with other elements of assessment – hugely important for future professional careers, but not always what happens.
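The AHP method mentioned derives rubric weights from pairwise comparisons of criteria. A minimal sketch using the row geometric-mean approximation – the criteria and comparison values below are invented for illustration, not the course’s actual rubric:

```python
import math

# Hypothetical pairwise-comparison matrix over three rubric criteria
# (say: correctness, argumentation, presentation) on Saaty's 1–9 scale:
# M[i][j] says how much more important criterion i is than criterion j.
M = [
    [1,   3,   5],
    [1/3, 1,   3],
    [1/5, 1/3, 1],
]

def ahp_weights(matrix):
    """Approximate AHP priority weights via the row geometric-mean method."""
    gms = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

weights = ahp_weights(M)
print([round(w, 3) for w in weights])  # heaviest weight on the first criterion
```

The resulting weights then multiply the per-criterion rubric scores, which is what lets the analytics compare performance criterion by criterion.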

The other programme, Project Management for the Master of Entrepreneurship programme, has 60 students and 4 teachers. In this case project teams work on authentic tasks. Assessment criteria + weighted rubrics – integrated feedback. The course uses self-assessment, peer-assessment, and teacher assessment. Here the learning analytics are being used to assess consistency, validity, reliability of peer-assessment. Metrics here can include the geometry of learning analytics perhaps.
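One way to make that consistency check concrete: compare average peer marks with teacher marks for the same work. This is a minimal sketch of the sort of analysis one might run – the data, and the choice of a simple Pearson correlation, are my own illustration, not the course's actual method:

```python
# Hypothetical sketch: how consistent are peer marks with teacher marks
# for the same submissions? (Names and data invented for illustration.)

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented example: average peer mark vs teacher mark per project team.
peer_marks    = [62, 70, 55, 81, 68]
teacher_marks = [60, 74, 50, 85, 65]

r = pearson(peer_marks, teacher_marks)
print(round(r, 2))  # an r close to 1 suggests peer marks track teacher judgement
```

A low or negative correlation would be the signal that peer assessment is not reliable enough to carry assessment weight on its own.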

Looking at a graphic analysis of one of these courses shows how students are performing against criteria – for instance they are better at solving problems than posing problems. Students can also benchmark themselves against the group, and compare how they are doing.

The impact of student dashboards – Paula Smith, UoE

I’m going to talk to you about an online surgery course – the theory not the practical side of surgery I might add. The MSc in Surgical Sciences has been running since 2007 and is the largest of the medical distance learning programmes.

The concept of learning analytics may be relatively new but we have been interested in student engagement and participation, and how that can be tracked and acknowledged for a long time as it is part of what motivates students to engage. So I am going to be talking about how we use learning analytics to make an intervention but also to talk about action analytics – to make changes as well as interventions.

The process before the project I will talk about had students being tracked via an MCQ system – students would see a progress bar but staff could see more details. At the end of every year we would gather that data, and present a comparative picture so that students could see how they were performing compared to peers.

Our programmes all use bespoke platforms and that meant we could work with the developers to design measures on student engagement – for example number of posts. A crude way to motivate students. And that team also created activity patterns so we could understand the busier times – and it is a 24/7 programme. All of our students work full time in surgical teams so this course is an add-on to that. We never felt a need to make this view available to students… this is a measure of activity but how does that relate to learning? We need more tangible metrics.

So, in March last year I started a one day a week secondment for a while with Wilma Alexander and Mark Wetton at IS. That secondment has the objectives of creating a student “dashboard” which would allow students to monitor their progress in relation to peers; to use the dashboard to identify at-risk students for early interventions; and then evaluate what (if any) impact that intervention had.

So, we did see a correlation between in-course assessment and examination marks. The exam is 75-80% (was 80, now 75) in the first year. It is a heavily weighted component. You can do well in the exam, and get a distinction, with no in-course work during the year. The in-course work is not compulsory but we want students to see the advantage of in-course assessments. For the predictive modelling, regression analysis revealed that only two components had any bearing on end of year marks: discussion board ratings and exam performance (year 1); or exam performance alone (year 2). So, with that in mind, we moved away from predictive models and decided to build a dashboard for students, presenting a snapshot of their progress against others’. And we wanted this to be simple to understand…
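The regression step described might look something like this minimal sketch – invented marks, one predictor, a least-squares line fitted by hand; the real analysis would have used more components and proper significance testing:

```python
# Hypothetical sketch of the kind of check described: does an in-course
# component predict end-of-year exam marks? (Data invented.)

def linear_fit(xs, ys):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Invented marks: discussion-board rating vs end-of-year exam mark.
board = [40, 55, 60, 70, 75]
exam  = [48, 58, 65, 72, 80]

slope, intercept = linear_fit(board, exam)
print(slope > 0)  # a clearly positive slope is what keeps a component in the model
```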

So, here it is… we are using Tableau to generate this. Here the individual student can see their own performance in yellow/orange and compare to the wider group (blue). The average is used to give a marker… If the average is good (in this example an essay has an average mark of 65%) that’s fine, if the average is poor (discussion board which are low weighted has an average of under 40, which is a fail at MSc level) that may be more problematic. So that data is provided with caveats.

Paula Smith shows visualisations created using Tableau


This interface has been released – although my intervention is just an email which points to the dashboard and comments on performance. We have started evaluating it: the majority think it is helpful (either somewhat, or a lot). But worryingly a few have commented “no, unhelpful”, and we don’t know the reasons for that. But we have had positive comments on the whole. We asked about extra material for one part of the course. And we asked students how the data makes them feel… although the majority answered ‘interested’, ‘encouraged’, and ‘motivated’, one commented that they were apathetic about it – actually we only had a 15% response rate for this survey which suggests that apathy is widely felt.

Most students felt the dashboard provided feedback, which was useful. And the majority of students felt they would use the dashboard – mainly monthly or thereabouts.

I will be looking further at the data on student achievement and evaluating it over this summer, and should be written up at the end of the year. But I wanted to close with a quote from Li Yuan, at CETIS: “data, by itself, does not mean anything and it depends on human interpretation and intervention“.

Learning Analytics – Daley Davis, Altis Consulting (London) 

We are a consulting company and we are well established in Australia so I thought it would be relevant to talk about what we do there on learning analytics. Australia is ahead on learning analytics and that may well be because they brought in changes to funding fees in 2006 so they view students differently. They are particularly focused on retention. And I will talk about work we did with UNE (University of New England), a university with mainly online students and around 20,000 students in total. They wanted to reduce student attrition. So we worked with them to set up a student early alert system for identifying students at risk of disengaging. It used triggers of student interaction as predictors. And this work cut attrition from 18% to 12%, saving time and money for the organisation.

The way this worked was that students had an automated “wellness” engine, with data aggregated at school and head of school levels. And what happened was that staff were ringing students every day – finding out about problems with internet connections, issues at home etc. Some of these easily fixed or understood.

The system picked up data from their student record system, their student portal, and they also have a system called “e-motion” which asks students to indicate how they are feeling every day – four ratings and also a free text box (that we also mined).

Data was mined with weightings: a student having previously failed a course, or being very unhappy, were both weighted much more heavily, as was a student not engaging for 40 days or more (shorter gaps were weighted more lightly).

Daley Davis shows the weightings used in a Student Early Alert System at UNE

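The weighted-trigger approach could be sketched along these lines – a minimal illustration, not UNE's actual engine; the trigger names, weights and threshold are invented:

```python
# Minimal sketch of a weighted "early alert" score of the kind described.
# Trigger names and weights here are invented for illustration.

WEIGHTS = {
    "previously_failed_course": 5.0,   # weighted heavily
    "very_unhappy_emotion":     4.0,   # weighted heavily
    "inactive_40_days":         3.0,
    "inactive_14_days":         1.0,   # shorter gaps weighted more lightly
}

def risk_score(triggers):
    """Sum the weights of the triggers observed for one student."""
    return sum(WEIGHTS.get(t, 0.0) for t in triggers)

def needs_call(triggers, threshold=5.0):
    """Flag a student for a phone call if their score reaches the threshold."""
    return risk_score(triggers) >= threshold

print(needs_call(["previously_failed_course"]))  # heavy trigger alone flags
print(needs_call(["inactive_14_days"]))          # light trigger alone does not
```

The point of the weighting is exactly what the talk described: one heavy signal (a previous fail, a very unhappy e-motion rating) is enough to prompt a call, while light signals only add up gradually.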

Universities are looking at what they already have, coming up with a technical roadmap. But they need to start with the questions you want to answer… What do your students want? What are your KPIs? And how can you measure those KPIs. So, if you are embarking on this process I would start with a plan for 3 years, toward your perfect situation, so you can then make your 1 year or shorter term plans in the direction of making that happen…

Niall: What I want you to do just now is to discuss the burning issues… and come up with a top three…

And [after coffee and KitKats] we are back to share our burning issues from all groups…

Group 1:

  • Making sure we start with questions first – don’t start with framework
  • Data protection and when you should seek consent
  • When to intervene – triage

Group 2:

  • How to decide on what questions to ask, and what questions and data are important anyway?
  • Implementing analytics – institutional versus course level analytics? Both have strengths, both have risks/issues
  • And what metrics do you use, what are reliable…

Group 3:

  • Institutional readiness for making use of data
  • Staff readiness for making use of data
  • Making meaning from analytics… and how do we support and improve learning without always working on the basis of a deficit model.

Group 4:

  • Different issues for different cohorts – humanities versus medics in terms of aspirations and what they consider appropriate, e.g. for peer reviews. And undergrads/younger students versus say online distance postgrads in their careers already
  • Social media – ethics of using Facebook etc. in learning analytics, and issue of other discussions beyond institution
  • Can’t not interpret data just because there’s an issue you don’t want to deal with.

Group 5:

  • Using learning analytics at either end of the lifecycle
  • Ethics a big problem – might use analytics to recruit likely-successful people; or to stream students/incentivise them into certain courses (both already happening in the US)
  • Lack of sponsorship from senior management
  • Essex found through last three student surveys that students do want analytics.

That issue of recruitment is a real ethical issue. This is something that arises in the Open University as they have an open access policy so to deny entrance because of likely drop out or likely performance would be an issue there… How did you resolve that?

Kevin, OU) We haven’t exactly cracked it. We are mainly using learning analytics to channel students into the right path for them – which may be about helping select the first courses to take, or whether to start with one of our open courses on Future Learn, etc.

Niall: Most universities already have entrance qualifications… A-Level or Higher or whatever… ethically how does that work

Kevin, OU) I understand that a lot of learning analytics is being applied in UCAS processes… they can assess the markers of success etc..

Comment, Wilma) I think the thing about learning analytics is that predictive models can’t ethically be applied to an individual…

Comment, Avril) But then there is also quite a lot of evidence that entry grades don’t necessarily predict performance.

Conclusions from the day and timetable for future events – Niall Sclater

Our next meeting will be in June in Nottingham and I hope we’ll see you then. We’ll have a speaker, via Skype, who works on learning analytics for Blackboard.

And with that, we are done with a really interesting day.

Feb 25 2015
 
Duncan Shingleton from Design Informatics presents their projects at the University of Edinburgh GeoLocation in Learning and Teaching event.

This afternoon I am attending, and supporting my colleague Tom, at the GeoLocation in Learning and Teaching event at the University of Edinburgh. This is an internal event arranged by the Social and Cloud based Learning and Teaching Service. The event will be focusing on Geolocation technology used in learning and teaching at the University of Edinburgh.

We are kicking off with a brief introduction from Susie Greig to the day noting that “there does seem to be some interest in using GeoLocation in learning and teaching” – something definitely backed up by a very full room for this afternoon’s session!

Dr Hamish MacLeod, Senior Lecturer, Moray House School of Education – will be discussing the INGRESS game, he will describe the many rich features, and why he thinks they are (potentially) relevant to learning.

I think there are two real approaches to learning in gaming… One you might attribute to Marc Prensky – a kind of “con folk into learning” approach. I have much more sympathy for James Paul Gee‘s take on gaming and learning.

I am talking about INGRESS, a mobile game (iOS and Android) but it is not a casual game, it requires proper engagement. It is a location dependent game – you have to get out there and use it in the world and it demands movement in the world. It is also an “exergame” – it perhaps encourages exercise. It is an augmented reality game, an alternate reality game, and it is open to users – you can contribute, interact, actively contribute to the game.

The game itself uses Google Maps as a basis, and the conceit of the game is that bright sparkly “portals” bring exotic matter to the world… and that exotic matter powers our scanner, our mobile phone… The object is to capture these portals and explore them. There are two factions in the game: green is the enlightened; the blue is the resistance…

The Enlightened is a faction attempting to help aliens called “Shifters” in the world. The Resistance are opposed to the Shifters presence in the world. Immediately shades of post modern theory…

Looking at a player profile you see a name, you see badges for achievements… and Google sits behind all of this… You can link your playing identity to your G+ profile (I haven’t).

The game is planet-wide – at least in terms of locations that are populated. My own neighbourhood is occupied by the enlightened faction… ! You can grab portals from your desk but the object is really to go further out, to explore the world…

The portals are not placed consistently, they tend to be associated with human objects… When you are proximal to a portal you can do various things… You can “hack” the portal to deploy objects useful in the game. You can deploy a resonator or recharge it… Portals decay over time… You can also choose to attack portals… All of these portals have a physical existence… When one captures a portal, one finds out about the places one is moving around in… The information about the object the portal is focused on can be edited and added to… additional views can be included… If I really wanted some exercise, I would go up to Calton Hill… They will be less heavily defended because they are more remote than those in the city centre. Unclaimed portals are white… you use “resonators” to claim them… As a player I am level 6… that dictates what type/number of resonators I can deploy… I need other people to help me defend the portal… So there is a collaborative aspect whether you know who you are playing with or not…

There is a massive amount of media associated with the game: those announcing international events around the game; something that appears to be fan fiction, but managed by Google; and there is some back story about the game and the Shapers… Very rich media background to the game…

So, here, now… here is what one might do… Near here you will find a plaque to Clarinda, the name Burns used for Agnes Maclehose who corresponded with him… There turns out to also be a plaque at the Carpet[I’ve misheard this] Tollbooth… Things you don’t know about the world around you…

From this game you can expose information, shapes to remember… puzzles and sequences to be echoed back to earn points… But these are not just arbitrary shapes, these are meaningful glyphs… Once we understand what they mean, they will read as meaningful or enigmatic sentences… A lovely illustration of George Miller’s magical number 7, in which we chunk information in order to process it better…

Here we see a (tweet) visualising a Christmas tree composed of links between portals… The two factions do compete in the game but this pattern is a massive act of collaboration and organisation to do this. There are halloween variants too… So the game is played at various levels, from casual to this sort of organised community…

We can add portals, and propose portals… It can take a while for portals to be vetted and recognised… I have managed to establish some… Including Hutton’s Rock on Salisbury Crags, and sites where core samples have been taken to find changes in the magnetic field over time… I’ve been systematic… and you could do that process, of creating portals.

You can also propose missions in the game – so there are missions around Scottish Enlightenment sites, The Royal Mile, Sir William Topaz McGonagall… So these user generated activities, projects… could be taken on to engage with resources in our environment that we wouldn’t usually engage with in that way…

Q&A

Q PW: This reminds me of Geo Caching, but this seems to have far more central control. Is that good or bad?

A HM: It is controlled by Google, and of course it is providing them with many points of interest. Suggestions for new portals can be slow to be accepted… Geocaching can be a more controllable activity for a group of students to use though…

Q SG: Are you thinking of using this on your programme?

A HM: We are thinking about the Games Based Learning module… We use World of Warcraft there… But we look at designs of games for learning so it is interesting in that context. But our degree is online and interestingly INGRESS really relates to shared geographical space – WoW is better in a lot of ways… But you could work on the pattern making aspects.

Comment FH: It could be about time rather than location perhaps…

A HM: To play with this in a geographically co-located group would be interesting, might be other uses entirely for a distributed group of learners

Q TF: If I want students to learn about, say, medical education could I map it onto this game – or another – or does the game need to change?

A HM: You could have a walking tour of Edinburgh highlighting medical locations, historical dimensions and people associated with that… It might be forced or less forced depending on what you want to achieve

Comment COS: It might be fun for orientation sessions for colocated students.

Tom Armitage, Geoservices Support, EDINA –  will present on the mobile mapping and data collection app Fieldtrip GB.

I’m talking about FieldTrip GB, but firstly I just wanted to tell you a bit about what we do at EDINA. We are a Jisc Supported National Datacentre providing services, data, support, etc. Our work covers geospatial services, reference, multimedia, access areas and tools including FieldTrip GB. Digimap is our main geospatial service, we run GoGeo that allows you to search for geospatial data and create and share your own metadata records via GeoDoc – ideal for sharing geospatial research data. We have Unlock which lets you create geospatial search tools, or to georeference your own text. We also have OpenStream which allows you to stream open data from Ordnance Survey into websites/GIS. Finally FieldTrip GB which lets you gather data in the field.

We also have projects: AddressingHistory georeferenced historical Post Office Directories; we are involved in Trading Consequences and Palimpsest projects, both about geoparsing documents and visualising that; Spatial Memories was a project to help visually impaired learners to navigate the world through a mobile app; and finally the COBWEB project which is a large FP7-funded project with many aspects that link into data collection and citizen science.   

So, FieldTrip GB was about bringing some key fields to mobile. To be able to capture images, audio, text, location. To be able to use high quality background maps, and to be able to save maps for use “offline”. It allows you to do custom data collection forms, and to then access that form and collect data via your phone or tablet – it is available for Apple iOS devices or Android devices.

The main screen of the app lets you view online or saved maps, to capture data – both forms and GPS tracking. And the Download button lets you download mapping for use online. Login is via Dropbox… We chose Dropbox because it is free, and the terms of use don’t give Dropbox access to users’ data – preferable to other services. And that also means the data is the property and responsibility of the user. And you can also potentially share Dropbox details to enable crowd sourcing…

So, the powerful bit of FieldTrip GB is the authoring tool… You can drag and drop different types of data capture into a form – text fields, multiple choice questions, ranges to select from, drop down menus, image capture, etc… You can drag and drop these items in, you can label and set limits/labels/choices as you wish. As soon as that is saved, it can be accessed from the app on your phone/tablet… And anyone with that Dropbox login can go in and use that form and submit data…

Those custom forms allow for easy data management – consistent terms, single data structure, setting increments to aid estimates, reduced errors (or consistent at least!). Once you fill in a form, you click save… and then you get to locate your data, shown as a point on the map based on where you are standing. You can move that pin as needed; you can manually correct where the form thinks you are…

A wee bit about the mapping… We have combined OS OpenData, added contour mapping from other open sets of maps, brought in Open Street Maps, so we have a custom stack of high quality mapping for the UK, built on all open data sources… We have two different maps at the same scale – one is better in urban areas, one better for rural areas, so you see the appropriate mapping in the area you are in (may combine these in light of new data available openly from the Ordnance Survey).

The advantage of offline mapping is that it saves on cost in urban areas, and allows access in rural areas where there may not be internet access of any type. And everything cached loads faster too!

So, you go out, you collect data… You can then go back to the authoring tool to view data, to filter it, browse the data, edit or delete records if you need to, by uploader (if you include that in your form), and to download/export it as KML, GeoJSON, CSV, or WMS. You can also share maps through Dropbox. GeoJSON is good for embedding maps into websites. KML opens up in Google Earth – looks beautiful!
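A GeoJSON export of collected point records has a very simple shape; this sketch uses invented field names, not Fieldtrip GB's actual schema:

```python
import json

# Rough sketch of exporting collected point records as GeoJSON.
# The record fields ("lon", "lat", "note") are invented for illustration.

records = [
    {"lon": -3.1883, "lat": 55.9533, "note": "Oak, approx 15m"},
    {"lon": -3.1999, "lat": 55.9486, "note": "Birch, approx 8m"},
]

def to_geojson(rows):
    """Wrap point records as a GeoJSON FeatureCollection."""
    return {
        "type": "FeatureCollection",
        "features": [
            {
                "type": "Feature",
                # GeoJSON coordinate order is [longitude, latitude].
                "geometry": {"type": "Point",
                             "coordinates": [r["lon"], r["lat"]]},
                "properties": {"note": r["note"]},
            }
            for r in rows
        ],
    }

as_text = json.dumps(to_geojson(records))
print(as_text.startswith('{"type": "FeatureCollection"'))
```

That FeatureCollection text is what web mapping libraries consume directly, which is why GeoJSON is the handy format for embedding maps into websites.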

We’ve put together a vague practical lesson plan that you could use with a class… You set up a Dropbox account that you are happy to share. You download FTGB, you design your form, you share the form – and encourage downloading of maps. In the field you then collect the data using the form; when you get back you get online and upload your forms/results, you go into the authoring tool and filter as needed (e.g. incomplete forms), then you can export your data and view it in your choice of tools.

In the future release we will be releasing a global edition, based on OpenStreetMap. It will work the same way but with different background mapping. We may also be supporting upload of your own maps to use as a basemap when you are out collecting data. Similarly points of interest/waymarkers. Also extra sensor measurements – phone as a compass for instance, maybe also ambient noise via microphones. Potentially also more complex forms… we have had requests for logic to change later questions based on earlier answers in a form… All to come in future versions!

Q&A

Q: Some of those extra features – your own maps, waymarkers, OSM – would be really useful.

A TA: Would be great to hear that from you as evidence for those developments.

Q: You talked about Dropbox, have you considered OneDrive which the university now has access to.

A TA: Yes, we built it to feed into any cloud storage provider… We started with Dropbox and have stuck with it because it is most flexible.

Comment NO: We are using FTGB in COBWEB, so we are self hosting rather than using Dropbox, also using access management.

Q JS: Can you embed images in the app for users to use to identify what they are seeing? e.g. an image of a tree.

A TA: Yes, also looked at in COBWEB, also dichotomous trees… Will all come, probably as part of the COBWEB development.

Dr. Anouk Lang, Lecturer in Digital Humanities, School of Literatures, Languages and Cultures will discuss how she uses the SIMILE Exhibit platform, which runs off the Google Maps API, to create an interactive map to use with students to explore the literary culture of Paris in the 1920s.

I’ll be showing you a site I have built (see: http://aelang.net/projects/) using SIMILE Exhibit, using Google Maps Engine. This is a map of Paris with information related to literature in Paris. Paris was a particularly important place for anglophone modernism – lots of Americans moved there – Stein, Joyce, Fitzgerald – and that decade was so important to modernism. The histories of this time are concerned with a linear narrative. When we see a map it is very seductive… But that is a representation, not accurate. But I was particularly keen to map those places that matter… It can be hard to understand the role of spatiality of the places in this movement (or indeed in general).

So, in this tool you can explore by person… So you can for instance view Sylvia Beach‘s life, a book seller central to modernism in Paris. Clicking on a place gives you more information about that place, its relevance.

So, how do you build this? You have a script that is free to use. You enter data into a Google Spreadsheet… There are some predefined fields here… I put in bibliographic reference to allow me to use it in teaching. I put in a person as I am interested in the social links within modernism. The reason I like this is that in the humanities we aren’t really trained to use GIS, but a spreadsheet we can just about manage!

So the data is piped in from a Google spreadsheet, but you have to build the front end. I found a guide from Brian Croxall (see also this code on GitHub) that will walk you through the process – you can use his JavaScript and tinker with it… So you get it up and running…

I originally built this for teaching. The 1920s wasn’t recognised as important until much later, in the 1950s/60s/70s. By then it is clear, in the biographies, who the big important players are. And those who never quite published that master work etc. insert themselves into that history. For instance we have Canadian writers (e.g. Morley Callaghan, the only person to knock out Ernest Hemingway) who have interesting interactions with the big players. John Glassco’s Memoirs of Montparnasse documents his bisexual adventures with both male and female writers of this time… He locates himself close to key locations… But he has a rival, Morley Callaghan… So he mentions meeting him but never assigns the location/space there… It sheds a whole new light on their relationships that would have been invisible if I’d looked at those works in any other way… mapping their locations was so useful.

Now I built this for research, but it does double duty for teaching. It is a framework for research, but I got students to think about sociality of modernism in Paris. I asked them to find one piece of information relevant to modernism, arts, culture in Paris, and to find the geolocation associated with that person – the details are often vague in biographies and texts. That task took them a long time… Then the students were given access to the spreadsheet… So you can then see those entries, and visualise them on that map… And we were able to see patterning of which writers stayed where. So you can explore the locations of women versus those of men. Paris in the 20s had a group of unusually strong women, publishing each other’s work… so where did they hang out? That concept is in play… That quotidian sense that our everyday places actually shaped literary history. Place is such an interesting lens through which to consider this work. We may only have sparse information of where these people lived and stayed – and we may have location only for months or a few years… that raises useful questions, lets us ask critical things… Mapping this stuff perhaps helps you see biases, particularly around the prominence of particular places versus others.

So, students begin to understand the research process… you have contingent data that you need to make an argument out of.

Something I love about the Digital Humanities is the sense and culture of openness… And when you teach there is a commitment among the best teachers in this subject to share the very best students’ work online. That makes students very aware of this very public process – they are very serious about it; it is their reputation on the line/building up, and a thing to point employers and peers, etc. to in the future…

So, we build this stuff… We need to embed it, so students have to learn a snippet of HTML. Students also learn the importance of precision. If students use “1920’s” rather than “1920s” it will hide their work in the faceted search. It seems like a tiny thing but in this subject changes in punctuation can be so important – whether in student work or in those writing on Emily Dickinson’s work.
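The “1920’s” vs “1920s” point is easy to demonstrate: faceted browsing groups entries by exact string, so variant spellings silently split into separate buckets. A tiny sketch (straight apostrophes used here for simplicity):

```python
from collections import Counter

# Facets group by exact string, so a stray apostrophe creates a second bucket.
entries = ["1920s", "1920s", "1920's", "1920s"]
facets = Counter(entries)
print(len(facets))        # 2 buckets, not 1

# One light-touch fix: normalise the decade label before faceting.
normalised = Counter(e.replace("'", "") for e in entries)
print(len(normalised))    # 1
```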

The other thing that this was helpful for was bibliographic referencing… They were expected to get a proper reference… As we clicked in things in class I mentioned errors… As I did that students were editing their own references live in response. The publicness of the sharing made them keen to correct things! I also really like the serendipity of this – and other new tools – in teaching.

I should say that you can’t do spatial analysis in this. But the SIMILE Exhibit tools do let you view a timeline (and click for more data). But the map is a point map; I would pull the data out and put it into ArcGIS to do serious spatial analysis on this data… So looking for the shapes, comparing literary to tourist areas for instance.

So, if you want to play, I have a sand box. Find it at: http://aelang.net/projects/canada.htm, just email me for access. If you do edit, do include an identifier to ensure you can identify your own entries – and view just those points on the map.

Q&A

Q: Will you put it in iTunes?

A Anouk: Will I make it an app? No. Firstly Google Maps Engine has been deprecated by Google so it’s going, so I need to move to OSM. But also an app is not what I need for my students.

Duncan Shingleton, Research Assistant/Technician, School of Design will present on various location based research projects Design Informatics has done…

Oxfam: Sixth Sense Transport

A project under the Sixth Sense Transport project, mapping people’s transport habits and seeing if we could help the efficiency of drops to Oxfam clothing banks, minimising car journeys, and maximising the efficiency of volunteer drivers visiting donation bank sites where the fill level is below that which justifies the journey in terms of goods recovered vs time and carbon output.

We partnered with SmartBin to put infrared (IR) sensors on the bin to measure the fill rate of the bin in real time… Looking at patterns of deposit and emptying of bins we can direct the Oxfam drivers to pick up before other (theft) emptying occurs… So, we have a simple app that shows bin sites, shop sites, and where drivers are. Phones with the app went to shop managers and drivers. Tracking fill levels, (volunteer) drivers indicate stock levels.. It allows Oxfam to track high value items, or items that sell better in particular locations can be taken to the right places for sale.

We have also done predictive analysis of whether a driver will be out on a particular day of the week… likely places to collect from… and that helps managers suggest ad hoc pick ups – e.g. house clearances. And if a bin has been broken into that can be recorded… messaging can happen between drivers. Lots of new communication happened, there was less driving/CO2 impact, and better success with bins…

Walking Though time – negotiating the streets of Edinburgh in 1860

On the app store so you can download the Walking Through Time iOS app… We are all fascinated by maps and exploring the city… We found that visitors and tourists are looking to Edinburgh’s past. So how could you give people the experience of walking through Edinburgh in the past? It’s a simple map that uses historical maps of Edinburgh, from EDINA, and overlays them on your current Google Maps… So you can walk Edinburgh in 1807… And alongside that you can create story trails that people can add to – story trails, places of interest… And view points of interest for people to click on and learn about key buildings and sites.

Comob – Networking people movements

Comob is about networking people, or people’s movements, in maps… You can see your own blue dot on the map, but what about other people? So it’s a very simple idea… It clocks you, and everyone else, drawing a boundary around all of you – kettling you in a sense… So you can login to the map, connecting to loads of people in the world… We don’t know why people are using this in the US and in Singapore. We found truckers using it along the Mexican border! If you go to a new place… I went to Manchester, Chris was there too… So we both opened up Comob, it drew a line between us, and I could just follow the line to him! And you can also use Comob to draw portraits of, e.g., audible noise pollution. It could be used by police for kettling crowds – we tried this, it didn’t launch, but we tried it…
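The boundary drawn around everyone is, in its simplest form, just a bounding box over all the logged-in positions. A rough sketch of my own (Comob’s actual approach isn’t described here – a convex hull would be the obvious refinement):

```python
def bounding_box(positions):
    """Return ((min_lat, min_lon), (max_lat, max_lon)) enclosing all positions."""
    lats = [lat for lat, _ in positions]
    lons = [lon for _, lon in positions]
    return (min(lats), min(lons)), (max(lats), max(lons))

# Three hypothetical users around Edinburgh
group = [(55.9533, -3.1883), (55.9444, -3.2061), (55.9617, -3.1750)]
print(bounding_box(group))
```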

And families use it… A family in New Zealand use it: a mum can see her sons – one a pilot, one driving a roadside repair vehicle – and see where they all are in space and time. It’s a connection, that feeling of comfort and knowing in the network… So she can imagine her sons’ days in comparison to her own… She has an emotional connection with them…

CoGet – Objects hitchhiking on the path of humans

On a similar note we wanted to see what it would be like to map things… And the relationship between things in a network… When I go to work each morning it’s the same people at the bus stop, on the train commute… It doesn’t vary a lot… So, as the system gets to know everyone’s movements, can you get an object from one side of the city to another? In CoGet you can work out your position in space, and direction… And where you may be in the future based on that and your speed… And then you can visualise everyone else’s position in the network, and their trajectories… So if you have an object to move, you can use the app to move that thing, with the app alerting you on who to pass your object to! So, no extra journey, no extra carbon… As you walk you don’t normally look for opportunities to interact… everyone is a stranger… So it is strange when your phone buzzes, and the other person’s buzzes, and you have that moment of social interaction… You then move around looking for opportunities. A good fun social experiment to take part in!
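The “where you may be in the future” step – projecting ahead from position, direction and speed – is essentially dead reckoning. A hedged sketch with my own names and constants, not CoGet’s code:

```python
import math

METRES_PER_DEG_LAT = 111_320  # rough metres per degree of latitude

def predict_position(lat, lon, speed_mps, bearing_deg, seconds_ahead):
    """Dead-reckon a future (lat, lon) from current speed and bearing.

    Uses a flat-earth approximation, which is fine over the short horizons
    a 15-second sampling interval implies.
    """
    dist = speed_mps * seconds_ahead
    theta = math.radians(bearing_deg)
    dlat = dist * math.cos(theta) / METRES_PER_DEG_LAT
    dlon = dist * math.sin(theta) / (METRES_PER_DEG_LAT * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon

# Walking north at 1.4 m/s: one minute from now you'll be slightly further north.
print(predict_position(55.95, -3.19, 1.4, 0.0, 60))
```

Matching two users then reduces to checking whether their predicted positions fall close enough together at the same time.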

Mr Seels Garden – Food narratives in the city (mrseelsgarden.org)

Memories of Mr Seels Garden was a project on the food history of Liverpool, inspired by a former vegetable garden. It started very simply as a memory pool of significant places in the food histories. That’s fine spatially… but how might you carry those food histories with objects? You have barcodes, QR codes and RFID tags in retail… So we designed an app that you can use (only in Liverpool) with city and site clouds – you could explore the whole city, or just the area/geolocation of a specific site… As you move nearer a point of interest a story can change about the thing you are carrying… And you could also see stories across multiple products – a tin of tomatoes, a pineapple, etc… Some sites have stories for multiple products, some products have multiple sites associated…
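That “story changes as you move nearer” behaviour is a proximity trigger. A minimal sketch of how such a check might work (my own function names, and an assumed 50 m trigger radius – the project’s real parameters aren’t given here):

```python
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def entered_geofence(user, site, radius_m=50):
    """True when the user's position falls inside the site's trigger radius."""
    return haversine_m(*user, *site) <= radius_m
```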

That’s enabled by geo-fencing… which brings us to…

Ghost Cinema – cinematic narratives in Battersea

As you walk around Battersea, this app will nudge you to alert you about film sites/filming locations, former cinemas etc. Another small site-based geofencing experience… Find out more on the Cinematic Battersea website.

Treasure Trapper – Mobile game in conjunction with Edinburgh Museums and Galleries.

Edinburgh Museums and Galleries have nine sites… But when anyone visits the city they want to go to the Castle, and maybe the NMS… But how do they attract visitors into their group of museums and galleries? So we looked at the bus network, particularly noted that many of their sites are on the tourist bus network. So we wondered if we could use that network to promote their museums and galleries…

Again we thought about unique identifiers – barcodes, QR codes, RFID tags, and number plates… Treasure Trapper works with the city’s bus network: as buses move around the city they automatically gather items on their route… And, a bit like Pokémon, you can chase buses down to grab an object… But you can only drop items off at the museum!

So, again, a very simple iPhone app… museums can add items into the system… The app tells you when a bus is arriving… once there you can grab the item, and return it to the museum, where you are rewarded with a free badge or discount code, etc.

Q&A

Q PW: I was particularly taken with the CoGet idea… You showed straight lines and vectors… does it understand streets/paths etc?

A: Only in the sense that it measures every 15 seconds, and learns paths… And will update accordingly

Q PW: Given that Google Maps etc. estimate travel times… have you looked at that?

A: Not so much as those require your destination. But it does work out time based on speed of transit for estimation of subsequent location.

Jonathan Silvertown, Chair in Technology Enhanced Science Education in the School of Biological Sciences, will present on “Virtual Edinburgh: turning the whole city into a mobile learning environment”

Virtual Edinburgh is actually an idea, in fact one that feels like it already has so much substance based on the work here. The idea is to turn Edinburgh, the city, into a pervasive learning environment. So I’m basically talking about little more than linking all those apps and ideas together.

As a newcomer to Edinburgh I am struck by just how much there is to learn about the city – the history, the science – and if we can open that up to students to be a resource for them to explore, to be available to them… it becomes a learning city…

So we have apps that feed data to people – WTT, Palimpsest, MESH (Mapping Edinburgh’s Social History), iGeology 3D (developed by BGS)… Already there… feeding data out… But there are also other apps like FieldTrip GB that let you feed in data too – essential for students to be able to engage and contribute if we want students to take part. And there’s something of my own here… iSpot, which is about identifying organisms in a community of about 50,000 people. Location is recorded for everything there… And another one that allows you to contribute is the “nearby” function in Wikipedia… Editing Wikipedia exposes this…

And there’s another FieldTrip, this one from Google, with less exciting information about pubs and cafes…

We are overwhelmed with geolocated data, we are particularly blessed here… With a city with so many hooks on which to build cool things for our students to do… So the idea is to take all the stuff already happening, and make the most of the synergies already going on…

So, a theoretical example…

Calton Hill to King’s Buildings journey… On iSpot, 6 spottings of organisms there… Here others have confirmed that sighting (more than a Like on Facebook), with geolocation etc… So that use of the name of the organism unlocks knowledge – they can look it up, they can see Wikipedia, Encyclopaedia of Life, etc. Click to explore…

At this point we could link it into Edinburgh research… So we look up that organism and find the Halliday Lab that researches that plant… (Arabidopsis thaliana)… So there we are, we can find out about what research is going on at Edinburgh related to what you spotted on Calton Hill… It’s nearly all linked up already… We could do it quickly if this was a hack…

So we have an idea of how this might all work… you have data sets, you have students and researchers creating the things… perhaps this material is also consumed by the local community… and it builds on what’s already going on…

As soon as you overlay different data sets there is an issue of how you give people intuitive access… So we are thinking about using tools like 3D visualisations, as already are in use in the Old College app… We need something intuitive here, as all drop down lists etc. won’t do this stuff…

So, we have a lot of use cases here, some are playful, some are more practical, some artistic… There are lots of different ways this idea can be used in teaching, in research, in art…  Edinburgh can be a city of learning, just as much as it is a city of literature and of heritage… Watch this space for more on the idea.

Q&A

Q MW: I love the vision of the idea, and I’d like the idea of allowing the community to contribute, and of mapping and tracking that, to see where contributions are coming from, understand what is coming in the near future, etc.

A JS: Yup, great idea.

Final Summing Up – Susie Greig

There has been some interest in Wikipedia Nearby. This is an option within Wikipedia to look at things from a geolocation point of view. It is part of the Wikipedia app. You can explore from phones/tablets or from your PC. There have been some interesting references made to Wikipedia Nearby; Wikipedia’s blog talks about this function also being used to trigger users to add images for those pages. We love the idea, but my colleagues’ and my own testing suggests that that isn’t quite working yet… Has anyone explored this? We just thought: Wikipedia is editable, so you could set up scavenger hunts and trails… So we just wanted to mention that it’s there and, if you can get that extra functionality to work, do share!

And finally, thank you to everyone for coming along today!

Apr 252012
 
Screenshot from the ViTAL Webinar on "Flipping"

This lunchtime I have been attending a ViTAL webinar (held via Adobe Connect here) on “flipping” which they describe as “the video-based approach that emerged in the US and has raised huge interest in the UK and Europe”. There is more background in an article on flipping in the UK edition of Wired this month: http://www.wired.com/geekdad/2012/04/flipping-the-classroom/

Our presenter for this session is Carl Gombrich, Programme Director for UCL’s undergraduate interdisciplinary degree: Arts and Sciences BASc. Carl has Maths, Physics and Philosophy degrees and is a professional opera singer!

 


So here are my notes from Carl’s talk:

This is my first webinar – in fact I’m really pretty new to technology in general. I’m currently setting up an interdisciplinary degree of Arts & Sciences. It’s a major launch of a degree for UCL; it starts with 80 students this year. And we’re really thinking in this climate – and the recent changes to student fees, funding etc. – about how we can best engage our students. I am entirely focused on teaching – I’m not involved with the REF at all – and I am desperate to do something better than huge lectures to foster engagement with students.

So about 18 months ago I started to hear about “Flipping” with the launch of the Khan Academy. I’m a fan of those videos and would have loved to have had access to them at school. So I wanted to think about how lecturers could share content ahead of the lecture so that contact time is really saved for stuff that really counts.

The idea of Flipping comes from about 2007 – Bergmann and Sams, although some say they have been doing this for much longer – where there was real questioning of why we gather students together in person in a room. I wanted to think about their model and how to make contact time more useful, more valuable, so I wanted to add polling to the face to face sessions so that lecturers can really get a handle on what students want, and to foster engagement through questions.

You can see a 12 minute presentation on my blog about the kit I used but let’s just run through quickly. I used the Echo 360 lecturecast system – the tool used at UCL. You just download it and it’s a few clicks to get started. I used a bog standard camera and mic – the built in options on laptops are fine. The lecturecast system can pair an image of the speaker with any materials. You can switch between the materials as you want. You can use MS Office docs along with any bespoke images you want. The exciting thing about video is that you can make it pretty interactive. You can stop the material, you can replay it to engage more with something you don’t understand, etc. The other kit I used was a tablet – a little graphics tablet – I use a Wacom Bamboo – it just lets you underline, circle, highlight content as you want.

Actually, after the presentation I did for the HEA I have learnt far more about how you do this stuff… some of the technologies are far more fluid, allow realtime noting etc. I think PowerPoint is a real killer for Mathematics. You have to see the process, as you do in music; it’s visual, you learn best from seeing people thinking aloud. I think Khan does that so well – not everyone agrees, but I think he’s a really excellent teacher.

So, that’s what I did. I think that sort of model is transferable to any old-style model. Any old knowledge transfer system should be transposable to the idea of making videos in advance. But if you want to do that what do you do?

Well, you need to record lectures in advance – at home, in the office, even outside. Use lecturecast – this bit is easy. Then you ask your students to view the lecture before the timetabled lecture slot. Now that, of course, may not work… So… ask your students to upload 3 questions each, with timings based on the video lecture (to indicate when questions arise), and send these questions to Moodle – everyone can see the questions that way and you also have evidence that the student has viewed the lecture and raised a question. Cognitively I think that’s very interesting, but inevitably there’s also a command and control aspect here about ensuring students are taking part. And my colleague Matt Jenner has helped me set up some basic tracking in Moodle to know that students are participating. The other thing we do is take a poll of the most popular, say, 10 questions.
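That “poll of the most popular questions” step is simple to implement; a sketch with entirely hypothetical submissions (not Carl’s actual Moodle setup):

```python
from collections import Counter

# Each submission: (student, timestamp into the video, question text).
submissions = [
    ("amy", "12:40", "Why does the proof need induction?"),
    ("ben", "13:05", "Why does the proof need induction?"),
    ("cat", "03:15", "What is a threshold concept?"),
]

# Tally identical questions and take the ten most popular for class time.
votes = Counter(question for _, _, question in submissions)
top_ten = [question for question, _ in votes.most_common(10)]
print(top_ten[0])  # the most-asked question leads the discussion
```

In practice near-duplicate questions would need grouping by hand (or a show-of-hands poll), since students rarely word a question identically.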

I was recently at a conference with Thrun, the man behind the Udacity web programming course at Stanford – which you should look at, as that is truly revolutionary – and he also uses polls and questions to gauge student need, to shape the teaching.

So back to what to do… the final stage is to go to the timetabled lecture slot with questions – interact, debate, solve problems with the students. That’s where it’s really pedagogically interesting. You get to know the students really well, you can get a sense of learning type (if you believe in those) and you can really get a sense of how they are doing. It’s a way to get back to more personal relationships in learning.

So the good things about this approach are that students can interact with lecturers on questions that interest them, problems they want to work through. Students can be split into groups and perhaps support each other (see Mazur) but the key bit is they get their questions answered. Better relationships are built up, especially around mentoring, contact, etc. And submitting questions could be part of formative assessment so that everyone is involved in learning, and that can really cement that engagement. And that old lecture time can be used for summative assessments – short tests, blog pieces, group work, longer assessments etc.

And the bad things here?

Well some are concerned about the kit working, technology issues. But I am really a middle aged late adopter and I can manage, we owe it to our students to engage in this stuff and it’s easy to do.

“It will take me double the time – 1 hr to record the lecture, 1 hr for the interactive class” – well, perhaps in the current fee climate we owe it to our students to spend that extra time. But, to be kinder on the lecturer, you also do not have to re-record the lectures every single year – you can re-record as needed to update or correct anything. And, like writing a lecture series, you can do this far ahead of term. And colleagues have pointed out to me that we don’t have to record a full hour of video – a series of shorter, more intense videos might be better and allow you to really focus on the threshold concepts. I don’t know how much more work this would be – maybe 25% more in the first year, but reducing over time. But the gains are so much more than any additional time one puts in.

“I hate working to camera” – I loathe working to camera, particularly I hate still images. It’s a real issue for me. But it’s where we are with the technology… I remember my grandparents’ generation refusing to use the telephone! We all use email now and I think video is really becoming that ubiquitous. We just have to go through that process of getting used to it.

“Students and colleagues will make fun of me or say inappropriate things about my style or the lecture” – this is falling away because of the ubiquity of video. There is an issue with trolling but it’s not a big issue with this sort of video. BUT there is a good reference in my slides here – students have other things to do, we need to rise above those concerns.

References:

And references from the community in the chatroom here:
Q via John Conway (Moderator): We’ve had a comment about the Panopto product – it lets students annotate notes and save them to their own profile, and they can then make them available online for discussion.
A – Carl) Lecturecast isn’t well used yet in UCL. The idea of polling questions in advance is the reflective thing – students can go away, come back, think about the questions. We learn when we aren’t thinking directly on the topic so those gaps can add some real advantage.
Q) What is the difference of Camtasia and Echocast 360?
A – Carl) I think they are versions of lecturecast systems but fairly similar
A – John) Lecturecast is the concept really. Camtasia is one vendor’s product among several similar software packages. It’s something where we’ve had to be careful how we phrase things – see the previous presentation on Lecturecasts on Ning.
Q) What about doubling student study time?
A – Carl) Well, we know the thing students most value about studying at university is the contact time, and so I think making that more useful will be appreciated. But perhaps it does require reshaping of expectations. Perhaps you shave reading time to allow this video engagement. I don’t think you add too much time and hopefully it will be something they value.
Q) Our experience at Aberystwyth is that lecturers are not keen to be videoed and students are not that bothered to see them. The audio and the content are the key thing.
A – Carl) Speaking to colleagues there I have a sense that a face is really important for younger students – perhaps children/young people not adults. The audio is the key bit for older learners. But I’m not hugely sold on video particularly. The ability to draw on the screen, to show the process etc. is really important here.
A – John) We have some material on the usefulness of capturing body language – adding additional feedback and information here.
A – Carl) Matt here at UCL has made another point – there’s something on my blog about “do you need to see your lecturer?”. I think a few minutes to see them on video may be enough. If you never see/meet someone in the flesh you lose something BUT once you have that, once you have a sense of them as a human, then you can go back to the virtual and use that sense of them to really better understand what you are engaging with online. I think there’s a spectrum: not meeting, meeting via video, and meeting in the flesh. Both of the latter are important but perhaps we don’t have to do as much in person as we once did.
Comment) In teaching negotiation video is hugely important
A – Carl) That is a hugely important point I hadn’t considered – any teaching that requires understanding human interaction – psychology, say – will really make the…
Q) Do you make any of your material available under an Open Educational Resource model?
A – Carl) I’m not sure if we’ve worked out the economics of this… if a lecturer makes their materials available for free what does that mean for the lecturer and for the institution, doesn’t it undermine that? I certainly don’t want to release them all before students get here. Maybe I’m just not brave enough here!
Q) Many lecturers are used to presenting materials but some are not used to being facilitators. Should we offer training on how to be a good facilitator? For instance, would they need training on how to handle debates in the classroom?
A – Carl) Gosh, maybe. I’ve always done my teaching the way I do. I suppose I just expect teachers to have those skills and I’m lucky that setting up a new degree I can choose my colleagues here. But if you don’t naturally engage with clickers, with new technologies that have proven pedagogical value then yes, you would want/need access to training.
Q) What if you say something untoward on camera?
A – Carl) That’s a really interesting issue and is far beyond just education. I would hope that we would really learn to handle this as a society in a sensible way. As educators we should lead, though. I think if you would make a comment to a group of 200 people that isn’t being recorded, you should be fine with making it when you are being recorded, and be backed up by your institution.
Q) Could you use some of the captured content in the classroom?
A – Carl) I think you would not want to show long clips but, with a bit of planning, you could use a clip related to the key questions as you are addressing those.
Q) What feedback have you had from students?
A – Carl) As I mentioned earlier, I am setting things up for September 2012 so I don’t have a research base for this teaching method yet, but we do have research that what students value most is contact time. We are also trialling some split screen head to head debates for students to engage with.
Q) How will you evaluate this approach?
A – Carl) Some open ended questions at the end of term will probably be the way to do this. I am cautious about over scrutinising students – I just think that’s the wrong atmosphere for what we’re trying to achieve.
Really, most of the first and second year undergraduate courses you might be teaching are already on the web in some way – via existing educational materials online. But you really add value by meeting the teachers face to face and discussing and engaging with them.
Comment) Isn’t this the same as reading before a lecture?
A – Carl) Yes, some of my colleagues have said that! But the medium is really changing. In a way we’ve always asked students to do pre-reading – and they have rarely done that. But I think video, I think polling students is a qualitative shift that makes this difference.
John) Thank you all for coming along today and if you have any further questions and comments do take a look at the ViTAL (Video in Teaching And Learning) Ning community:  http://vital-sig.ning.com. We will address any questions raised there on Ning and perhaps in a webinar in the future.  The next webinar will be on video and pedagogical design.

 

Feb 222012
 

Today I am at the Digital Scholarship: A Day of Ideas, which is a day of “talks and discussions for staff and PhD students in HSS, to inspire and share ideas for digital research, teaching and scholarship. An exciting programme of invited speakers working in the field of digital scholarship will present their ideas and their work” taking place at The Business School at the University of Edinburgh. The event has been arranged as part of the excellent Digital HSS programme of activities. The full programme is available here (and I’ve also linked to the related abstracts in the titles for today’s talks below).

::: Update: The videos are live on YouTube here :::

The event is also being webcast and can be viewed here: http://www.digital.hss.ed.ac.uk/?page_id=504

As this is a liveblog the usual caveats apply re: typos, errors, etc. And please do leave me comments and corrections!

Continue reading »