Jan 28, 2016
 

Today I am at the eLearning@ed event “Using Pebblepad/Atlas for managing the student dissertation life-cycle”.

As usual these notes are being taken live so all comments, corrections, etc. are welcome. Update: In addition to these notes, slides from this session are now available here.

Dissertation Marking and Feedback using ATLAS – Graeme Ferris and Paul Caban

Paul: I’m going to start with a brief history of dissertation management in the Business School. Up until 2002 we were using a paper-based system (version 0). That was great but not scalable. We were expanding so we needed something new.

For version 1 the school had its first dedicated developer, who built us a system using ColdFusion and SQL. That managed the process but it wasn’t great, and ColdFusion is not a tool particularly well loved in the University. So for version 2 we started working on PHP and Postgres… But that project became problematic in lots of ways and didn’t result in a working version that achieved what we needed it to do.

Version 3 was built within the school and was also based on PHP and Postgres. It was built swiftly, but needs were changing, the requirements were diverse, and it was becoming unmanageable.

So the dissertation process is loosely:

  1. Sign up – supervisor choice and allocation. People state their preference, allocations are made.
  2. Supervision and submission. Ethics etc. processes are gone through.
  3. Marking. Currently on paper. Agreement and an audit trail between markers. Then marks and feedback go to the students.
  4. Reporting and Quality Assurance processes.

However… The developer who had built this system decided to move to a new role. At the same time we saw this system as in need of reconsideration, it was high on our risk register, and we had new needs to accommodate. We wanted to provide electronic feedback, other systems looked possible, and there were various things already in the VLE which we’d previously had to hand code. We then had an external review and recommendations… Part of that was a steer towards internal tools. SharePoint was one possibility… Atlas/Pebblepad was the other option, and it is really simple to use. By this point we were in late autumn 2014, and needed a new system in place by February for dissertations. So, we decided to use the old part of our tool for student sign-up, but do something new with Pebblepad for the rest of the process.

So, for the project we created a design brief. We did a massive stakeholder involvement process, spoke to committees, spoke to lots of staff. We designed a project that would have no (negative) impact on students. And ensured that training and documentation would be in place.

Graeme: With a custom system we have endless freedom… But that was the problem with the previous systems – there were too many options, which led to unnecessary complexity. So, for this solution, we wanted to ensure the essentials were in there but jettison some of the more obscure process requests – exceptions for those not wanting to follow the core process.

So, one of the key things was double marking – that is a University requirement for any single piece of work over 40 credits. Blind marking is not mandatory across the University for double marking – but it is Business School policy. The difference, to note, is that for anonymous marking you don’t know who the student is – tricky with supervised dissertations of course. Blind marking is where one marker does not see the other’s marks until initial marking has been completed. That requirement had design implications. And once that double blind marking was done the markers have to see each other’s marks to reconcile and discuss.

For the reconciliation we needed some depth of reasoning to be captured. So, to manage this, we wanted a process that involved the following (sketched after this list):

  • Initial marking blind
  • Reconciliation
  • Recording reconciliation notes and comments – crucially, always ensuring that the student gets appropriate feedback to accompany the final mark. The supervisor has to manage this to ensure it is correct.
  • Marks fed back.
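
A minimal sketch of that blind-then-reconcile lifecycle, purely as an illustration (this is generic Python with hypothetical names, not PebblePad/Atlas code), might look like this:

    # Illustrative sketch only: a generic model of blind double marking and
    # reconciliation. All names here are hypothetical, not PebblePad/Atlas code.
    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class Submission:
        student_id: str                             # hidden from markers while anonymous
        marks: dict = field(default_factory=dict)   # marker -> provisional mark
        reconciliation_notes: list = field(default_factory=list)
        agreed_mark: Optional[int] = None
        feedback_released: bool = False

        def add_mark(self, marker: str, mark: int) -> None:
            # Blind phase: a marker can record their mark at any time.
            self.marks[marker] = mark

        def visible_marks(self, marker: str) -> dict:
            # Until both initial marks are in, each marker sees only their own mark.
            if len(self.marks) < 2:
                return {m: v for m, v in self.marks.items() if m == marker}
            return dict(self.marks)                 # reconciliation: both visible

        def reconcile(self, agreed_mark: int, note: str) -> None:
            # The agreed mark is recorded with the reasoning behind it,
            # giving the audit trail between markers.
            assert len(self.marks) == 2, "both initial marks are required first"
            self.agreed_mark = agreed_mark
            self.reconciliation_notes.append(note)

        def release_feedback(self) -> None:
            # Feedback and the final mark only go to the student after reconciliation.
            assert self.agreed_mark is not None
            self.feedback_released = True

In this sketch the “blind” rule depends on the state of each submission (whether both initial marks exist) rather than on a fixed date – which is exactly the problem the time-based model described next ran into.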

Early iterations were a sort of time-based model, making use of the permissions for different roles within the system. So permissions would be set for blind initial marking, and then at an agreed date permissions would change to allow reconciliation. That was suggested but… actually the feedback was that this was unacceptable: markers mark at different paces and on different timings, so it wouldn’t be possible.

This was a particular concern as the available functionality was via role-based permissions – which means if you were marking 4 students, and second marking 5 students, all of those marks would be out of blind mode at the same time. To overcome this we created a model using separate workspaces:

  • Initial marking (blind) – marker view and completion of the marking template; admin view and moving of marked submissions.
  • Reconciliation – marker view and completion of the reconciliation inputs; admin view.
  • Archive – locked down for the exam board / feedback to students.
  • Reporting function.

So, I’m going to demo all these workspaces now… All academic staff have a tutor role. Admin have a lead tutor role – they can see everything. For academic staff it’s anonymous and blind – they can’t see the names of students, each other’s names, or each other’s marks. And we also made use of the inbuilt Atlas concept of “sets”… Very useful indeed. You set up a set… Academic A marks 4 students, so you make a set for that. When markers go in they only see the set they are allocated to… They don’t have to find their students… We have sets for first markers, sets for second markers… You can pick/filter between them. And the other reporting aspect uses sets too – for cohorts, for groups, etc. So you can report just on one MSc, or on a cluster of MScs in the same area, etc. So that’s really lovely. And admin can see all of the sets…
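
Again purely as an illustration (hypothetical names, not the Atlas data model), the workspace-plus-sets visibility rules amount to something like this:

    # Illustrative sketch only: generic workspace/set visibility, not the Atlas API.
    # The classes, role strings and the move() helper are all hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class MarkingSet:
        name: str                                   # e.g. "First marker: Academic A"
        submission_ids: set = field(default_factory=set)
        allowed_tutors: set = field(default_factory=set)

    @dataclass
    class Workspace:
        name: str                                   # e.g. "Initial marking (blind)"
        sets: list = field(default_factory=list)

        def visible_submissions(self, user: str, role: str) -> set:
            # Lead tutors (admin) see every set; tutors see only sets allocated to them.
            visible = set()
            for s in self.sets:
                if role == "lead_tutor" or user in s.allowed_tutors:
                    visible |= s.submission_ids
            return visible

    def move(submission_id: str, source: MarkingSet, target: MarkingSet) -> None:
        # Admin staff "move" a submission by changing which set holds it, e.g. from
        # an initial-marking set into a reconciliation set.
        source.submission_ids.discard(submission_id)
        target.submission_ids.add(submission_id)

Filtering by set is also what supports the per-MSc or per-cluster reporting mentioned above, and the move step anticipates the admin workflow described below, where submissions are shifted between sets several times a day.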

So, as a student you go in via Learn (the VLE), and go through to the submission area via a simple web form. Students were already submitting work via Turnitin in Learn. There is an integration for Turnitin and Atlas… But we aren’t keen on that as it currently stands. So, instead, they just submit once via the Pebblepad web form… And that is submitted to Turnitin and is available there for staff to check as appropriate. And the submission goes into the Pebblepad pipeline. We wanted this to be as simple as possible. I am looking at using Turnitin for the next round of marking, but the delay for now is about students seeing their own originality reports – something we usually do as standard in our use of Turnitin in the Business School, but not something that is currently part of the integration with Pebblepad.
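
Conceptually that single point of submission is just a fan-out – a rough sketch, where both helper functions are hypothetical placeholders rather than real Turnitin or Pebblepad calls:

    # Illustrative sketch only: the single-submission fan-out described above.
    # Both helpers are hypothetical placeholders, not real Turnitin or Pebblepad calls.
    def send_to_turnitin(dissertation: bytes) -> None:
        ...  # placeholder: originality checking, available for staff to review

    def add_to_marking_workspace(student_id: str, dissertation: bytes) -> None:
        ...  # placeholder: drops the file into the (blind) marking workflow

    def handle_submission(student_id: str, dissertation: bytes) -> None:
        # Students submit once via the web form; behind the scenes the same file
        # goes to originality checking and into the marking pipeline.
        send_to_turnitin(dissertation)
        add_to_marking_workspace(student_id, dissertation)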

As a marker you go in and see the students available to mark, you can view by whether you are first or second marker – based on the set those submissions are in. Our process isn’t sequential so the indicator in the workspace – a green tick – shows that I have marked a submission (there is a second set of indicators but as first marker I won’t see the second marker’s progress – and vice versa – but it is logged).

Then, as you open up a dissertation you can add comments as you read but… it is hard to get these out at the end. We want to be able to get that content out for the quality team to look at against the rubric etc. BUT you can’t turn the comment panel off, nor can you edit the text in that panel. So, we have a feedback template – clicking on that brings up the appropriate template to complete, which guides the markers through the various aspects of the rubric. There is also a space for comments in the box. Originally I expected that to be comments for the other marker, but in reality this box is used for comments for the students – which makes sense. Once the template is closed it is temporarily cached, but you have to click “save and close” in the comment panel to capture that feedback; it is then added back to the comment panel.

So, that’s initial feedback… When it comes to reconciling the marks… As a marker you’ll see submissions. You will want to know if you are first marker – with responsibility for feeding back comments etc. – or second marker – in which case you just need to approve marks and feedback. So that difference has to be visually clear in the reconciliation panel/dashboard.

When you open up an assignment for reconciliation you can read the assignment, add reconciliation comments etc. There is also capacity for a third marker if the marks differ hugely. But it is crucial to use the correct templates at this stage – one is for student feedback, the other is for reconciliation comments. And there is no easy way to check which content has been added to which template other than manual checking.

As an admin of this system you are able to move submissions around, and to notify markers about that. Submissions are moved 5 or 6 times a day so markers rarely have a long wait for submissions to mark. This is straightforward to do – you just move between the various sets.

The only issues in the system have been around reconciliation of marking, because we need to check every single submission to ensure the right type of comments and feedback are captured in the right place. If something needs fixing… well, initially that sat with me; the PG office came on board later last year, but we’d like to devolve the administration of this.

The feedback area is the locked-down area. Once everything is verified it becomes non-anonymous and the grade is shown, ready for admin staff to use or report on the submission. One of the limitations of Atlas is the reporting and the ability to summarise reports. I have to create a separate report for each type of report – I would like to do that as a single report for our QA team though.

So, finally… This system does fulfil the remit of anonymous double blind marking. Markers only see their own submissions. Initial feedback isn’t released to students. Information can be locked down. But there are issues with the templates being a wee bit clunky and problematic. The functionality is limited, and reports are too separate and cannot be combined at present. However…

We met with Pebblepad just before Christmas. We have asked for reporting from the comment panel – with that we wouldn’t need another template. We’ve asked about integrated reports. And also asked about the ability to turn off functionality if not being used.

Paul: Pebblepad are receptive to feedback, and they have made changes in the past – for instance they captured but didn’t display the matric number, which they have now added.

Ellen: Pebblepad is used in lots of different ways, which is great, but it was initially designed for personal reflective portfolios. They have thought these things through, but assessment wasn’t the initial purpose of the tool and that is reflected in some of these challenges.

Graeme: We are also looking at this system for UG dissertations.

Paul: And what we learned here… Know your process… We would have saved loads of development time by knowing who to speak to and what they needed. Some people confused process changes with the new tool, so we really needed a very active academic champion. Engagement – you can never have enough. Graeme did loads of training and documentation – many didn’t engage in that and wanted one-to-one support, so we had to do that too. There has to be someone doing quality control – that is also about the quality and level of feedback, not necessarily to do with technical challenges. And in terms of limitations… reporting was a real limitation, as was the data management – we wanted to report by set and couldn’t at the time (Ellen notes that there have been changes recently) – and we needed to devolve that system. We also realised it was hard to develop a system without real data and an understanding of how it could go wrong. But having done this for real we now have that much greater understanding.

Q: Can you integrate with groups in Learn?

Graeme: You can pull across sets, I haven’t tried it with groups.

Connecting up feedback – and possibly everything else – through an eportfolio – Paul McLaughlin, School of Biological Sciences

I wanted to talk about use of Pebblepad with undergraduates, particularly for getting them to connect ideas between courses. I’ve also been trying to induct undergraduates into Senior honours to get them to understand the importance of this… Understanding the importance is like being an actor… to get an Equity card you have to be an actor but to work as an actor you need to have an Equity card…

So, in our first year all biology students do a large biology course. They get extensive video feedback on their first undergraduate essay. We also now ask them to complete a form in PebblePad predicting the feedback they are likely to get, before they actually receive it. And then later on we ask them to reflect, at the end of the course, on what they have done and how they will take that forward. They are asked to make an action plan – a bit formulaic, but it helps students take control of their learning. In the second course we are leading them towards an assessment problem that they need to complete. They get exam feedback around week 4 or 5 and then we encourage them to meet their tutors. Students post their action plan as a note into Euclid; tutors don’t need to know anything of Pebblepad to use that, but it gives them a good place to start from with students.

Then, at the end of the year we ask them to look back at how it went, to reflect on what went well and less well. To compare semester 1 and semester 2. And students sometimes capture other aspects of life with impact on what they do – e.g. that they need to plan around flat hunting.

In terms of completion of tasks, we see good completion rates for the first few tasks, which are tied to assessment. When the process is only for the students’ own long-term benefit we see lower completion rates.

So, I also wanted to talk about something else we do, where we induct students into Senior Honours as part of a tutorial and encourage both personal and group reflection. The idea is to help students prepare to make the jump to honours-level work. We use Padlet as part of this. And we also have two summative exercises as part of that where we use Pebblepad for capturing reflections.

Finally I wanted to talk about work on our distance MSc. I was thinking about what it feels like to be a distance learner, and the importance of feeling like you are making progress. I wanted the portfolio to be available for students to support themselves. Now, Pebblepad has the idea of a workbook that you can add to and build up… Overarching this are the graduate attributes the university has put together. So, a student can look at the graduate attributes – we give them three attributes that we think a given course can help provide evidence of. Students can then self-assess and add evidence to back up that attribute and their rating of their own achievement of it.

As the course progresses students choose their own attributes. By the end they have those attributes and the stories that tell where they are with those attributes. This is very connected to careers, to job applications… They have the information to look over and draw upon in their applications and interviews. In fact we also did some mock interviews with colleagues from Careers, using Collaborate. They then had to make an action plan based on that careers interview.

In that online MSc the students use their blogs for reflections, and exercises like the interviews must connect back to these, to emphasise the value of regular, timely content and of engaging throughout.

But there are questions here… How do you assess this efficiently? How do you do quality assurance – especially if all very distributed? Should it speak for itself?

One of the things we’ve been talking about… We do see that that engagement can fall off if we don’t assess or push the issue. In the first two years of undergraduate courses there can be this issue of feeling that this doesn’t count. So, in the future perhaps the best measure of success is engagement – let’s just assess engagement and that can count towards a synoptic course (capstone) that is about reflection based on solid evidence collected through all four years. That would make reflection in the first two years really count. The missing thing for me is how to assess that efficiently.

Comment: I think we have a metric for engagement. We’ve just gone through the SLICCs pilot. That is basically this… 10 credits of additional credit. We had maybe 12 students go through this… We independently double marked, and all that was marked was how the student had responded to the learning outcomes. The students set the learning outcomes. We came up with the rubric and we gave halfway comments on their blogs if they wanted it. And all that we were marking was the reflection on that learning, and specifically the report on that experience at the end. It was remarkable how consistent the richness of engagement etc. was from people across schools, in areas that were not their speciality, and how consistent the marking was.

Paul M: The SLICCs… If I wanted to see the SLICCs would I be able to?

Commentator: Yes, you can see that by arrangement.

Paul M: In Pebblepad students have control… They can choose how things are shared… But that is also a challenge to see how these things have been used before…

Ellen: There are some case studies… And workflows… But we are also talking about setting up a local user group.

Q: Portfolios are things you might want to actually show an employer… Have you had much experience of employers etc. coming in?

Ellen: You’d actually share a web folio – like a website – which draws on it. But you wouldn’t give them Pebblepad access.

Paul M: Which is why it matters so much to tag things. But those web folios can be shared with named people, or wholly public… And I believe that Pebblepad is for life…

Ellen: Students don’t automatically keep their student logins… But they can sign up for a free lifetime account between completing the course and graduating…

Me: I think it would be useful to look across how blogging is being used in other programmes for reflection, and how assessment works there, and can work there…

Paul M: There is lots of work but in terms of the pedagogy here… I would also wonder how easy this stuff is to game…

Comment: For the SLICCs (Student-Led Individually Created Courses) the quality of students was good, but the quality of the material was actually brilliant… So you’d immediately see if someone was trying to game it…

Paul M: And it probably would take more effort to game it than to do it… I’m more concerned about students at the middle or bottom of the distribution than those at the top… I’ve been considering a 10 credit course… 20 credits would perhaps be better, but also a bit scary…

Comment: Senate have approved SLICCs. There is a very very strong recognition that students need to take ownership of their learning, and this is a strong way of moving forward on it…

Q: My question is a bit off topic… What happens if the student actually does see the first marker’s comments, the second marker’s comments, and the reconciliation comments? I think that’s a recognition of differences of opinion, academic discussion, and compromise of views through a different lens.

Paul C: I don’t think it’s a problem as long as everything is properly evidenced.

Graeme F: It isn’t a problem… But the concern is about the potential for student appeals and questions… The process is good… But whether students should or shouldn’t see that hasn’t been part of our role here.

 

Question: Has anyone tried the next version of Pebblepad?

Graeme F: I don’t think we’ll have access until summer 2016 or 2017.

Paul M: Our version is much more agile than it was… But still some challenges there.

Questioner: But MediaHopper (the new University media service) may also address some of those.

Question: How can this be used for peer assessed group work?

Paul M: You can use Pebblepad for group work, using various permissions etc, but haven’t tried that for peer assessment.

And we finished with some discussion of Pebblepad’s responsiveness to feedback – they seem very responsive – particularly for new or unexpected uses.

 
