Dec 05 2016
 
Preview image of the Crowd Power: the COBWEB Guide to Citizen Science comic

Last weekend (Sunday 27th November) I gave a talk at the Edinburgh Comic Art Festival 2016 on what it’s like to turn research into a comic book. We made a (low tech) recording of the talk (watch it here via MediaHopper, or embedded below/on YouTube, see also the prezi here) but I also wanted to write about this project, to share and reflect on the process of creating a comic book to communicate research.

So, how did our comic, “Crowd Power: the COBWEB Guide to Citizen Science” come to happen in the first place?

Meet COBWEB

As some of you will be aware, over the last four years I have been working on an ambitious EU-FP7-funded citizen science project called COBWEB: Citizen Observatory Web. We have, of course, been communicating our work throughout the project (in fact you can read our communications plan here) but as all the final deliverables and achievements have been falling into place over the last few months, we wanted to find some new ways to share what we have done, what we have accomplished, and what the next steps will look like.

Whilst we are bringing COBWEB to a close, we are also now taking our resultant citizen science software, FieldTrip Open, through an open source process and building new projects and sustainability plans (which also means considering suitable business models) around that. Open sourcing software isn’t just about making the software available, or giving it the right license; it is also about ensuring it has a real prospect of adoption and maintenance by a community, which means we are particularly keen to support and develop the community around FieldTrip Open, and to bring new people in as users and contributors to the software. So, for both our dissemination and open sourcing work we really need to inspire people to find out more about the approaches we’ve taken and the software we’ve built, and to explain where it all came from. But how could we best do that?

During the project we had developed a lot of good resources and assets, with a lot of formal reporting and public deliverables already available, and accompanying engagement with wider audiences (particularly co-design process participants) through social media and regular project newsletters. Those materials are great but we wanted something concise, focused, and tangible, and we also wanted something more immediately engaging than formal reports and technical papers. So, this summer we did some thinking and plotting…  My colleague Tom Armitage joined COBWEB partners in the Netherlands to revisit our geospatial software open sourcing options with the OSGeo community; Tom and I met with the fantastic folk from the Software Sustainability Institute for some advice on going (properly and sustainably) Open Source and building the software community; and my colleague Pete O’Hare looked at the videos, demos, and footage archive we’d accumulated and suggested we make a documentary on the project. After all of that we not only had some solid ideas, but we’d also really started to think about storytelling and doing something more creative for our current target audiences.

Across all of our conversations what became clear was the real need to inspire and engage people. The project is complicated, but when we have shared our own enthusiasm about the work and its potential, people really take an interest, and that opens up longer and (sometimes) more technical or practical conversations. But we can’t get everywhere in person, so we needed some cost effective ways to do that excitement-building: to explain the project quickly, clearly and entertainingly, as a starting point to trigger follow up enquiries and those crucial next step conversations. So, in August we did follow up on Pete’s suggestion, commissioning a documentary short (that’s a whole other story but click on that link to view the finished film, and huge thanks to our wonderful filmmaker Erin Maguire) to give an overview of the COBWEB project, but we also decided we’d try something we had never done before. We were going to try making a comic…

Why a comic? 

Well, first let’s talk terminology… And I should note that if this blog post were a graphic novel, this would be a little side note or separate frame, or me explaining a pro tip to the reader – so imagine that as our format:

Is it a comic, or is it a graphic novel? I think a lot of people will think about “comics” as being The Dandy, The Beano, Manga titles, or one of the long running mass market series like The Avengers or Archie. Or maybe you’d think of a comic strip like Peanuts or Calvin & Hobbes. Similarly, “graphic novel” seems to be tied to the idea of long form books which look more like literary fiction/non-fiction, with well regarded titles like Fun Home or Kiki de Montparnasse or Persepolis. The difference is hard to explain, partly because when you make that sort of distinction there are clearly a lot of boundary cases… Is it about audience (e.g. teens vs adults), or aesthetic, or page format, or critical response, or some other criteria? Calvin & Hobbes deals wittily with matters of philosophy, but is widely read by children who engage with its charming aesthetic, warm tone, and deceptively simple storytelling. By contrast, the new The Story of Sex: From Apes to Robots has a lively and pretty coarse – even for the subject matter – aesthetic (its authors are French and I would put the drawing mid-way along the Asterix-to-Charlie Hebdo tastefulness continuum) and it is aimed at a wide audience, but it is co-written by an academic, has been well received by critics, and you’ll find it shelved in the graphic novel section. Comparing these works on any kind of comic vs. graphic novel grounds won’t tell you anything very useful about style or quality, although it might reveal the personal preference of the reader or reviewer you are talking to…

So, before I began this project I was pretty sure that what I read are graphic novels – yes, snobbery – but when you actually talk to people who make these wonderful things, the term – especially for shorter works – is “comics”, and that’s accepted as covering the whole continuum, with all the styles, genres, print formats, etc. that you might expect (yes, even graphic novels) included. So, taking my lead from those who write and draw them, I will be using “comic” here – and next time you are discussing, say, female self-realisation in Wonder Woman and The Nao of Brown, you can go ahead and call both of them comics too!

Definition of a comic from the OED online.

Right, back to the topic at hand…

One of the reasons that a COBWEB comic seemed like it might be a good idea is that I really enjoy reading comics, and I particularly love non-fiction comics as a form: they can be so powerful and immediate, bringing complex ideas to life in unexpected ways, while also leaving you the space to think and reflect. Comics are primarily a visual form, and that enables you to explain specialist technologies or sophisticated concepts, or take people on flights of fancy, offering creative metaphors that allow you to explain – but also re-explain and re-interpret – an idea lightly and engagingly. Your audience still needs to think and imagine, but in a great comic the combination of text and visuals brings something special to the experience. Comics can be more playful, colourful and bright than a formal report, and also much less constrained by physical reality, budget and location than a video or an event. And whether in digital or print form, comics feel really pleasingly tangible and polished; they are designed and story-boarded, and they feel like a special and finished product. From the non-fiction comics I’d read I could see that comics would work well for talking about technology and research, so they could be a good fit for our project if we could be confident that our target audience and our type of research would be a good fit for the possibilities and restrictions of the form.

For the COBWEB project we wanted to reach out to researchers, developers, and future project partners – likely to include software and digital companies, NGOs, SMEs, as well as non-professional researchers (community groups etc.) and others interested in working with – and hopefully, in some cases, contributing to – the codebase of our open source software. This is a defined set of audiences, but each audience (and individual) holds highly varied interests and expertise: COBWEB is a complex project, with lots of different components, which means our audiences might be new to all of the concepts we are presenting, or they may, say, know a great deal about coding but not environmental projects, or all about the environment but not about using mobile technologies… But we do know these audiences – we already work with developers and researchers, and we’ve been working with potential users and contributors throughout the project, so we have some idea of interests, aesthetics, etc. We felt pretty confident that many of those we want to reach do read and engage with comics of various types, from web comics like xkcd to beautifully published books like The Thrilling Adventures of Lovelace and Babbage. That overlap and interest helped us feel confident that a comic would be a good fit for our audience, and a really great fit for telling our story.

Finding a comic artist to work with

We now had the bare bones of the idea, and we had a solid idea of our target audience. But we weren’t totally sure which aspects of our story to draw out, what parts of the COBWEB story we wanted to tell, although we knew it had to inspire, entertain, and be accessible. We also really didn’t know what we wanted our comic to look like. As I started to think about possible collaborators (we knew we needed others to work with/commission) I remembered that very many years ago I’d seen a flyer – in the form of a comic book – for Glasgow Comic Con in a hotel. I did some searching around and found BHP (Black Hearted Press) Comics, an independent comics publisher based in Glasgow that creates its own comic books but had also recently completed a project with the University of Glasgow’s Hunterian Art Gallery and Museum. Looking around their site I also found The Mighty Women of Science, another book where the subject and aesthetic suggested a good fit with COBWEB (and it was – spoiler: Mighty Women author Clare Forrest illustrated Chapter 2 of our book). I had no idea what to expect in response, but there wasn’t a way to find out if this idea was viable without getting some advice, so I fired off a quick email to BHP Comics…

Screenshot of the BHP website featuring Mighty Women of Science


I had a really swift reply from Sha Nazir from BHP. Sha was interested to talk more about the idea so we set up a meeting and, ahead of that, I trawled through my favourite comics to find some examples of the kind of idea I had in mind. On the day I brought in a few books that I thought did this sort of storytelling well, including: The Influencing Machine, Brooke Gladstone and Josh Neufeld’s overview of the (US) media ecology and culture; The Thrilling Adventures of Lovelace and Babbage – a fictionalised steam punk re-telling of the lives of Charles Babbage and Ada Lovelace, with great technology descriptions and lots of (factual and referenced) footnotes; Filmish – the book of Edward Ross’ critiques and explorations of cinema and film making (in the mould of Scott McCloud’s classic Understanding Comics). I also brought in a copy of Taylor & Francis’ Cartoon Abstracts – scientific papers which have been turned into 1 page cartoons – which is one of the very few examples I have seen of scientific and technological research being adapted into comics.

That initial conversation with Sha was a long and honest chat about the kind of idea we had in mind. Sha had brought his own selection of books – copies of The Mighty Women of Science, Comic Invention, an issue of Rok of the Reds, and Plagued: The Miranda Chronicles – to give me a sense of what BHP work on, the kind of writers and illustrators they work with, and the sorts of formats, sizes and print styles we might want to consider. We talked about timelines: ours were really tight. Sha and I met in August and we needed to have a digital copy available and all work invoiced by the end of October (the print copy could follow). The comics could then be used to extend our dissemination and sustainability work, helping us share what we’d accomplished and support keeping that work and code a going concern. The timeline we requested was ridiculous and I am eternally grateful that Sha even considered taking it on (he was optimistic in our meeting but very wisely went away to think about it before we finalised anything). However, to make that timeline work he was clear from the outset that someone (me) would need to be available to check in regularly, to feed into and look over the script, the storyboards, and the draft versions – Sha and his colleagues at BHP would take on the work, but we also really had to commit to it too. I was up for that, although I had a three and a half week holiday to the US scheduled for September so, with the caveat that we’d have to work around time zones, it all looked doable and we started scheduling some check ins.

So, what else did we need to discuss to get this started? Well, I needed to actually describe and give some background to COBWEB. I told Sha about the project in our meeting – and followed up by sending him some of the key project technical documents and reports that summarised our work. Sha was entirely new to the project – like many of those we want the comic to reach – so he asked lots of really useful questions that really did highlight the complexity of describing COBWEB. To give you a sense of that: COBWEB has been a 4-year, €8.5 million project with 13 partners in 5 countries; we’ve had 9 workpackages and many more deliverables, and we’ve worked with over 1000 volunteers and 7 co-design projects as we developed our software – for which there are 6 separate GitHub libraries. There is a lot there. And there are important unique aspects to the work: the compliance with EU and international standards, including INSPIRE compliant metadata; our focus on UNESCO Biosphere areas; the access management controls in our software; the involvement of policy makers as project partners; the contribution to empowering citizens in Europe. At the same time our comic didn’t need to be an encyclopedia, it just needed enough focus on what was important to give a broad picture and to excite people!

On which note… We talked about the audience, who they were, and what messages they should take from the comic. We were very clear from the outset that we were using comics as an engaging medium, but that we expected our audience to have some fairly serious interest in the project, which meant that although nothing should be inappropriate for children, our target audiences were adults, and mainly quite technically literate adults. We wanted to explain the work of the project and assumed no prior knowledge of COBWEB, but some (useful) complexity and detail was going to be ok where it felt appropriate. And we felt we could assume that readers of the comic might follow up by reading one of the more traditional publications if they had a specific technical or policy interest to pursue.

At that initial meeting we also talked a bit about artists and art work. With our (crazy) timeline Sha recommended we break the comic into a small number of chapters and that, once a script was written, these would be illustrated by different artists, meaning that we’d get a really lovely variety of styles across the comic (something you’ll see in a growing number of comics, including Kiki de Montparnasse, where drawing styles change when “Kiki” works with different artists). Using several different artists was also practical, as it meant that those chapters could be illustrated in parallel by different people – shortening those restrictive publication timescales.

Initial art work for the COBWEB comic by Kirsty Hunter


We also talked about formats. The weekly comic book style of Rok of the Reds was going to be cheap to print and it would be easy to hand out – it could almost fit in a pocket – but it didn’t look quite as polished as we wanted. The Mighty Women of Science, however, had a great format – a substantial and beautifully finished thick/card cover and binding, with matt finish pages, in A4 format (useful since all of our display stands, envelopes, etc. are designed for A4 reports/promotional items). It looked like a book – a thin but high quality, nicely finished book – and, better yet, it was a budget-friendly format for a small print run.

And, as the ideas took shape, Sha and I discussed cost, and an initial estimate of the work to do the digital comic, plus a price for a print run of 1000 A4 copies. A quick sketch of costs came out of that meeting, which allowed me to talk to my COBWEB colleagues and to check that our budget could accommodate the project. I don’t think it is appropriate to share that price here, but it was very reasonable for this much work and, particularly given the timelines we were working with, was enormously good value. Why tell you this? Well, if you are thinking of doing your own comic then I highly recommend talking to some comic artists or publishers before you (potentially) rule it out over costs, since (for us at least) those costs were very fair but were also dependent on things like number of pages and chapters, print formats, etc., so were also (somewhat) within our control.

So, we now had some solid ideas and a plan. We exchanged emails to work out the details, check costs (and check budgets), and get both informal and formal agreements to proceed (which we did quickly because, again, timelines were really tight). A standard contract was prepared and work began immediately at BHP, with me sending over information, background documents, diagrams etc. so that Sha and his colleague Kirsty Hunter could begin to get a script worked out – and could ask any questions as they arose. And, at this point I am going to embed my Prezi from my ECAF16 talk, which covers the production process stage by stage:

Throughout September Sha and Kirsty worked on the script, sending me drafts to comment on, tweak, correct, etc. We arranged several calls from a range of unusually exotic locations – a check in from Seattle, from Davis (California), and then – as I headed off to AoIR – from Berlin. We agreed focal areas early on, with the script starting as a skeleton in four sections:

  1. An introduction to COBWEB and the core concept of citizen science – ensuring all readers share some background knowledge but also making the comic a useful resource to those curious about crowd sourcing and citizen science in general.
  2. Highlights from the co-design work including several real world examples of people and projects who have shaped and been part of the COBWEB community. Much of this came from our co-design project reports, highlighting real challenges and feedback (good and bad) from our volunteer community.
  3. Our “under the bonnet” chapter, on the more chewy technical aspects of the project and including a very cleverly conceived double page spread on quality assurance processes.
  4. What happens next with COBWEB and our software now that the project is over and the open sourcing takes shape, but also where technology is going and how citizen science may fit in to e.g. smart cities.

Those sections were broken into pages and the script rapidly took shape. As the sections and pages were agreed, text for each page was drafted and tweaked. And storyboarding began in earnest…


Draft layout sketch for the COBWEB comic (by permission of Sha Nazir/Kirsty Hunter/BHP Comics).

By mid September I had started to receive initial visual ideas and sketches (a delightful treat in a Monday morning inbox!), and, in parallel, the wording and detail of the script was getting finalised. By the end of the month the script and initial drawings were ready enough to share with COBWEB colleagues for their checking and feedback – they did a brilliant job helping me ensure we were using the right types of terminology, not missing anything important, and also catching the less exciting but very important spelling issues, corrections etc. (having many eyes to check a script at several stages was very useful indeed and definitely recommended).

Once the wording was (pretty much) finalised and the storyboards ready, the comic went into the illustration process – seeing those storyboards turn from sketches to fully fledged characters (including a few fun references/“Easter eggs”), then those characters started to gain colour and backgrounds. Drafts were shared and commented on, and finally the finished pages started to take shape. This stage worked rather differently: it required less input from me at first – a few checks of the pages and visuals – as the work went out to different illustrators for completion. However, once lettering was done there were a few crucial tasks to do: checking all of the text for content, spelling, etc. (which is surprisingly tricky when you’ve been seeing drafts for weeks – you have to adopt a whole different proof-reading level of engagement); building a glossary page for some of the technical terminology (in retrospect this is something I should have done right after that first meeting, when the unknown words and acronyms were most obvious); and, because somehow we just hadn’t gotten to it yet, we actually had to think of a title…

A page from Crowd Power: the COBWEB Guide to Citizen Science, featuring real feedback from real co-design project volunteers.


What the heck do we call this thing?

In late October, several weeks after beginning work on the comic, we still didn’t have a title. Sha asked me to think about some ideas, and I sketched a few out but also started asking colleagues… We played with variants on the key aspects of citizen science, crowd sourcing, empowerment, etc… We wanted to get COBWEB mentioned, to give a sense of the content, but also to have a title that had a more catchy ring to it. After lots of chats and several lists of possibilities pitching back and forth, “Crowd Power: the COBWEB Guide to Citizen Science” emerged as a winner.

We then had to think about covers. Sha sent through several ideas but one of the most appealing – bringing together an image of a protest march with an image inspired by the Shepard Fairey “Hope” poster for Barack Obama – started to look less than ideal in a post-Brexit context, and with Trump newly elected president. Protests as a shorthand for people power are great, but at a time of genuine political complexity, polarisation, and a high likelihood of real protest movements, we decided that this was an image for the book and promotion, but not for the cover. Some other ideas looked good, but didn’t seem to bring forward the idea of real people, and environmental research as successfully. In the end we settled on an image that is, essentially, a cut scene from the comic, featuring a group of friends using COBWEB out in the wilds, as seen by our (nameless but brilliant) narrator:

Crowd Power: the COBWEB Guide to Citizen Science cover image

Another opportunity to look at our cover art. Eagle eyed cartoon fans may note a certain similarity between our curious walkers and the Scooby Doo gang…

One of the things I was asked early in the process had been “do you want the narrator or main characters to be human? Or can they be animals? Or giant floaty heads?”… I said that anything was fine, as long as it made sense – so a duck or a seal or some sort of animal that would appear in our actual co-design projects was fine, but not a penguin or dragon (or anything that wouldn’t make sense in that context). One of the things I loved about Sha, Kirsty and Clare’s illustrations was that they responded to that flexibility by building in diversity, quirkiness, and little in-jokes (indeed there are several “Easter eggs” in Crowd Power). You’ll notice from the cover that our narrator (throughout) is female. Sha and I had talked about women being well represented in the comic, but I was also delighted, when the more finished versions of the illustrations came through, to see a range of racial and ethnic diversity quietly represented in our book. The project was diverse in many ways, and we also want to be entirely welcoming to anyone who would like to be part of the COBWEB and FieldTrip Open community. The range of people in the comic subtly reflects that desire to include and engage and is, I think, one of the reasons that comics can be so powerful for messaging values, beliefs, and intentions as part of and alongside the core narrative.

With the title and cover art completed, and a further final proof read – make that two; make that three… – and a few very last minute corrections, the COBWEB comic went off to the printers and the digital copy immediately went live on the COBWEB project website. Now, to get the comic out to our audience…

Finding our audience

As soon as the digital copy of the comic went live we tweeted and shared it with project partners and those interested in the project.

The feedback within the project team was excellent, with some of the team keen to use pages from the comic in their own presentations as an introduction or overview of their work. For the team I think the comic – and the documentary that went live shortly afterwards – provided some sense of stepping back and reflecting on what had been done. At the end of a four year project it can be much easier to know what wasn’t completed, or didn’t go to plan, or didn’t develop as you’d expect. Looking over the story of the project, what had been achieved, how much work had taken place is very rewarding and reminds you of all the excitement and accomplishments of that project.

Feedback from our wider contacts and social media communities was excited and interested. We have shared the comic openly on the website and explicitly state that it can be downloaded, circulated, kept, used elsewhere… We are keen that it is seen and read and used by whoever wants to do that. If I have one regret it is that in all of our conversations we didn’t agree to make the book available under a Creative Commons license – more by omission than because of any particular issue with doing that. Sha has been great about us using images of work in progress – you’ll see a series of sketches, etc. in that Prezi – and shares our keenness that the book is seen and accessible. We commissioned it to be free to access – whether download or print – but it would have been wise to agree licensing terms more directly to avoid any possible doubts.

Then, the week of the Edinburgh Comic Art Festival 13 boxes of comics appeared at the EDINA offices in Argyle House and they looked absolutely glorious! The print copies triggered a ripple of excitement through the office and also generated lots of interest at ECAF – which seemed like a great place to see how our comic fared with a mixed but interested audience.

As the year comes to a close we will be circulating copies to our COBWEB project partners but also that core target audience as we go out and about developing the FieldTrip Open community, sharing copies with developers, researchers, etc.

So, what do you think? 

If you would like a (print or digital) copy, and/or would like to talk to us about how we can support your citizen science project, please do get in touch. I would also love to hear your feedback on the comic and any suggestions you may have about communities that may like to work with us in turning FieldTrip Open into a really vibrant open source project in the future. Do leave a comment here or email me.

Some important acknowledgements

Enormous thanks to Sha Nazir and Kirsty Hunter, who created the fantastic Crowd Power comic with Clare Forrest, Jack Lothian and Kirk Kristofferson. Sha and Kirsty explicitly gave me their permission to share images of works in progress for this post and my ECAF talk this weekend, which I hugely appreciated. It has been an absolute delight to work with all at BHP Comics and I would recommend contacting them if you are considering embarking upon/commissioning a similar piece of work.

Further resources

Some useful links are provided here so that you can quickly access our materials, the comic, or any of the COBWEB project website or code:

Update: Video now available via YouTube

Aug 09 2016
 
Notes from the Unleashing Data session at Repository Fringe 2016

After 6 years of being Repository Fringe‘s resident live blogger this was the first year that I haven’t been part of the organisation or amplification in any official capacity. From what I’ve seen though my colleagues from EDINA, University of Edinburgh Library, and the DCC did an awesome job of putting together a really interesting programme for the 2016 edition of RepoFringe, attracting a big and diverse audience.

Whilst I was mainly participating through reading the tweets to #rfringe16, I couldn’t quite keep away!

Pauline Ward at Repository Fringe 2016


This year’s chair, Pauline Ward, asked me to be part of the Unleashing Data session on Tuesday 2nd August. The session was a “World Cafe” format and I was asked to help facilitate discussion around the question: “How can the repository community use crowd-sourcing (e.g. Citizen Science) to engage the public in reuse of data?” – so I was along wearing my COBWEB: Citizen Observatory Web and social media hats. My session also benefited from what I gather was an excellent talk on “The Social Life of Data” earlier in the event from Erinma Ochu (who, although I missed her this time, is always involved in really interesting projects, including several fab citizen science initiatives).

I won’t attempt to reflect on all of the discussions during the Unleashing Data Session here – I know that Pauline will be reporting back from the session to Repository Fringe 2016 participants shortly – but I thought I would share a few pictures of our notes, capturing some of the ideas and discussions that came out of the various groups visiting this question throughout the session. Click the image to view a larger version. Questions or clarifications are welcome – just leave me a comment here on the blog.

Notes from the Unleashing Data session at Repository Fringe 2016

Notes from the Unleashing Data session at Repository Fringe 2016

Notes from the Unleashing Data session at Repository Fringe 2016

If you are interested in finding out more about crowd sourcing and citizen science in general then there are a couple of resources that may be helpful (plus many more resources and articles if you leave a comment/drop me an email with your particular interests).

This June I chaired the “Crowd-Sourcing Data and Citizen Science” breakout session for the Flooding and Coastal Erosion Risk Management Network (FCERM.NET) Annual Assembly in Newcastle. The short slide set created for that workshop gives a brief overview of some of the challenges and considerations in setting up and running citizen science projects:

Last October the CSCS Network interviewed me on developing and running Citizen Science projects for their website – the interview brings together some general thoughts as well as specific comment on the COBWEB experience:

After the Unleashing Data session I was also able to stick around for Stuart Lewis’ closing keynote. Stuart has been working at Edinburgh University since 2012 but is moving on soon to the National Library of Scotland so this was a lovely chance to get some of his reflections and predictions as he prepares to make that move. And to include quite a lot of fun references to The Secret Diary of Adrian Mole aged 13 ¾. (Before his talk Stuart had also snuck some boxes of sweets under some of the tables around the room – a popularity tactic I’m noting for future talks!)

So, my liveblog notes from Stuart’s talk (slightly tidied up but corrections are, of course, welcomed) follow. Because old Repofringe live blogging habits are hard to kick!

The Secret Diary of a Repository aged 13 ¾ – Stuart Lewis

I’m going to talk about our bread and butter – the institutional repository… Now my inspiration is Adrian Mole… Why? Well we have a bunch of teenage repositories… EPrints is 15½; Fedora is 13½; DSpace is 13¾.

Now Adrian Mole is a teenager – you can read about him on Wikipedia [note to fellow Wikipedia contributors: this, and most of the other Adrian Mole-related pages could use some major work!]. You see him quoted in two conferences to my amazement! And there are also some Scotland and Edinburgh entries in there too… Brought a haggis… Goes to Glasgow at 11am… and says he encounters 27 drunks in one hour…

Stuart Lewis at Repository Fringe 2016

Stuart Lewis illustrates the teenage birth dates of three of the major repository softwares as captured in (perhaps less well-aged) pop hits of the day.

So, I have four points to make about how repositories are like/unlike teenagers…

The thing about teenagers… People complain about them… They can be expensive, they can be awkward, they aren’t always self-aware… Eventually though they usually become useful members of society. So, is that true of repositories? Well, ERA, one of our repositories, has grown bigger and bigger – over 18k items… and over 10k paper theses are currently being digitized…

Now teenagers also start to look around… Pandora!

I’m going to call Pandora the CRIS… And we’ve all kind of overlooked their commercial background because we are in love with them…!

Stuart Lewis at Repository Fringe 2016

Stuart Lewis captures the eternal optimism – both around Mole’s love of Pandora, and our love of the (commercial) CRIS.

Now, we have PURE at Edinburgh, which also powers Edinburgh Research Explorer. When you looked at repositories a few years ago, it was a bit like Freshers Week… The three questions were: where are you from; what repository platform do you use; how many items do you have? But that’s moved on. We now have around 80% of our outputs in the repository within the REF compliance window (3 months of acceptance)… And that’s a huge change – volumes of materials are open access very promptly.

So,

1. We need to celebrate our success

But are our successes as positive as they could be?

Repositories continue to develop. We’ve heard good things about new developments. But how do repositories demonstrate value – and how do we compare to other areas of librarianship?

Other library domains use different numbers. We can use these to give comparative figures. How do we compare to publishers for cost? What’s our CPU (Cost Per Use)? And what is a good CPU? £10, £5, £0.46… But how easy is it to calculate – are repositories expensive? That’s a “to do” – to take the cost of running the repository against its IRUS usage figures. I would expect it to be lower than publishers, but I’d like to do that calculation.
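The Cost Per Use figure Stuart mentions is, at heart, a simple division of annual running cost by annual usage (e.g. IRUS-UK download counts). A minimal sketch, using entirely made-up figures rather than any real repository’s costs:

```python
# Cost Per Use (CPU): annual running cost divided by annual usage.
# All figures below are invented for illustration only.

def cost_per_use(annual_running_cost_gbp, annual_downloads):
    """Return cost per use in pounds, rounded to the nearest penny."""
    return round(annual_running_cost_gbp / annual_downloads, 2)

# e.g. a hypothetical repository costing £50,000/year with 100,000
# downloads recorded by a usage service comes out at £0.50 per use.
print(cost_per_use(50_000, 100_000))
```

The hard part, as Stuart notes, is not the division but agreeing what goes into the cost figure (staff time, hosting, development) and which usage statistic to trust.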

The other side of this is to become more self-aware… Can we gather new numbers? We only tend to look at deposit and use from our own repositories… What about our own local consumption of OA (the reverse)?

Working within new e-resource infrastructure – http://doai.io/ – lets us see where open versions are available. And we can integrate with OpenURL resolvers to see how much of our usage can be fulfilled.
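DOAI works by redirecting a DOI-style URL to an open copy when one is known. As a rough, hedged sketch (the DOI below is a made-up example, and the service’s exact response behaviour may differ), one could build the lookup URL and inspect the redirect target rather than following it:

```python
import urllib.error
import urllib.request

DOAI_BASE = "http://doai.io/"  # DOAI redirects a DOI to an open copy when it knows one

def doai_url(doi):
    """Build the DOAI lookup URL for a DOI string like '10.1000/xyz123'."""
    return DOAI_BASE + doi

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Stop urllib following the redirect so we can read the Location header.
    def redirect_request(self, *args, **kwargs):
        return None

def open_access_location(doi):
    """Return the URL DOAI redirects to, or None (network call; sketch only)."""
    opener = urllib.request.build_opener(NoRedirect())
    try:
        opener.open(doai_url(doi))
    except urllib.error.HTTPError as err:
        return err.headers.get("Location")
    return None
```

Plugged into an OpenURL resolver, a check like this would let a library estimate how much of its usage could be fulfilled from open versions.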

2. Our repositories must continue to grow up

Do we have double standards?

Hopefully you are all aware of the UK Text and Data Mining copyright exception that came into force on 1st June 2014. As universities we have massive access to electronic resources, and we can text and data mine those.

Some do a good job here – Gale Cengage Historic British Newspapers: an additional payment buys all the data (images + XML text) on hard drives for local use. We are working with local Informatics LTG staff to (geo)parse the data.

Some are not so good – basic APIs allow only simple searches, not complex queries (e.g. you can use a search term, but not query on, say, sentiment).

And many publishers do nothing at all….

So we are working with publishers to encourage and highlight the potential.

But what about our content? Our repositories are open, with extracted full-text, and the data can be harvested… Sufficient, but is it ideal? Why not allow bulk download in one click? You can – for example – download all of Wikipedia (if you want to). We should be able to do that with our repositories.
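Most institutional repositories already expose their metadata over OAI-PMH, the standard harvesting protocol, so Wikipedia-style bulk access is at least approximable by paging through ListRecords responses. A minimal sketch (the base URL passed in would be repository-specific; this is not any particular repository’s endpoint):

```python
import urllib.request
import xml.etree.ElementTree as ET

OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"

def list_records_url(base_url, token=None):
    """Build a ListRecords request; later pages carry the resumptionToken."""
    if token:
        return f"{base_url}?verb=ListRecords&resumptionToken={token}"
    return f"{base_url}?verb=ListRecords&metadataPrefix=oai_dc"

def resumption_token(xml_text):
    """Return the resumptionToken from a ListRecords response, or None."""
    root = ET.fromstring(xml_text)
    el = root.find(f".//{OAI_NS}resumptionToken")
    if el is not None and el.text:
        return el.text
    return None

def harvest(base_url):
    """Yield every page of records in the repository (network call; sketch only)."""
    token = None
    while True:
        with urllib.request.urlopen(list_records_url(base_url, token)) as resp:
            page = resp.read().decode("utf-8")
        yield page
        token = resumption_token(page)
        if token is None:
            break
```

It works, but it is a long way from the “one click” bulk dump Stuart is asking for – which is rather his point.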

3. We need to get our house in order for Text and Data Mining

When will we be finished though? That depends on what we do with open access. What should we be doing with OA? Where do we want to get to? Right now we have mandates so it’s easy – green and gold. With gold there is pure gold or hybrid… Mixed views on hybrid. You can also publish locally for free. Then for green there are local or disciplinary repositories… For gold – pure, hybrid, local – we pay APCs (some local options are free)… In hybrid we can do offsetting, discounted subscriptions, voucher schemes too. And for green we have the UK Scholarly Communications Licence (modelled on Harvard’s)…

But which of these forms of OA are best?! Is choice always a great thing?

We still have outstanding OA issues. Is a mixed-modal approach OK, or should we choose a single route? Which one? What role will repositories play? What is the ultimate aim of Open Access? Is it “just” access?

How and where do we have these conversations? We need academics, repository managers, librarians, publishers to all come together to do this.

4. Do we know what a grown-up repository looks like? What part does it play?

Please remember to celebrate your repositories – we are in a fantastic place, making a real difference. But they need to continue to grow up. There is work to do with text and data mining… And we have more to do… To be a grown up, to be in the right sort of environment, etc.

Q&A

Q1) I can remember giving my first talk on repositories in 2010… When it comes to OA I think we need to think about what is cost effective, what is sustainable, why are we doing it and what’s the cost?

A1) I think in some ways that’s about what repositories are versus publishers… Right now we are essentially replicating them… And maybe that isn’t the way to approach this.

And with that Repository Fringe 2016 drew to a close. I am sure others will have already blogged their experiences and comments on the event. Do have a look at the Repository Fringe website and at #rfringe16 for more comments, shared blog posts, and resources from the sessions. 

Jun 29 2016

Today I am at the Flood and Coastal Erosion Risk Management Network (FCERM.net) 2016 Annual Assembly in Newcastle. The event brings together a really wide range of stakeholders engaged in flood risk management. I’m here to talk about crowd sourcing and citizen science, wearing both my COBWEB and University of Edinburgh CSCS Network member hats, as the event is focusing on future approaches to managing flood risk and of course citizen science offers some really interesting potential here.

I’m going to be liveblogging today but as the core flooding focus of the day is not my usual subject area I particularly welcome any corrections, additions, etc. 

The first section of the day is set up as: Future-Thinking in Flood Risk Management:

Welcome by Prof Garry Pender

Welcome to our third and final meeting of this programme of network meetings. Back at our first Assembly meeting we talked about projects we could do together, and we are pleased to say that two proposals are in the process of submission. For today’s Assembly we will be looking to the future, and future thinking about flood risk management. There is a lot in the day, but we also encourage you to discuss ideas and form your own breakout groups if you want.

And now onto our first speaker. Unfortunately Prof Hayley Fowler, Professor of Climate Change Impacts, Newcastle University cannot be with us today. But Chris Kilby has stepped in for Hayley.

Chris Kilby, Newcastle University – What can we expect with climate change? 

Today is 29th June, which means that four years ago today we had the “Toon Monsoon” –  around 50mm in 2 hours and the full city was in lockdown. We’ve seen some incidents like this in the last year, in London, and people are asking about whether that is climate change. And that incident has certainly changed thinking and practice in the flood risk management community. It’s certainly changed my practice – I’m now working with sewer systems which is not something I ever imagined.

Despite our spending millions of pounds on computer models, the so-called GCMs, these models seem increasingly hard to trust as the academic community realises how difficult flood risk actually is to predict. It is near impossible to predict future rainfall – this whole area is riven with great uncertainty. There is a lot of good data and thinking behind the models, but I now have far more concern about their usefulness than I did 20 years ago – and that’s despite the fact that the models are a lot better than they were.

So, the climate is changing. We see some clear trends both locally and globally. A lot of these we can be confident of. Temperature rises and sea level rise we have great confidence in those trends. Rainfall seasonality change (more in winter, less in summer), and “heavy rainfall” in the UK at least, has been fairly well predicted. What has been less clear is the extremes of rainfall (convective), and extremes of rainfall like the Toon Monsoon. Those extremes are the hardest to predict, model, reproduce.

The so-called UKCP09 projections, from 2009, are still there and are still the predictions being used, but a lot has changed with the models we use and the predictions we are making. We haven’t put out any new projections – although that was originally the idea when the UKCP09 projections came out. So, officially, we are still using UKCP09. Those produced coherent indications of more frequent and heavier rainfall in the UK, and UKCP09 suggests 15-20% increases in Rmed in winter. But these projections are based on daily rainfall; what was not indicated was the increase in hourly rates. Some of the models pointed to decreased summer rainfall, which would mean lower mean rainfall per hour, but that isn’t looking so clear anymore. So there are clear gaps here: firstly with hourly-level convective storms, and secondly with “conveyor belt” sequences of storms, which it is not clear any climate model reliably reproduces.

So, this is all bad news so far… But there is some good news. More recent models (CMIP5) suggest some more summer storms and accommodate some convective summer storms. And those newer models – CMIP5 and those that follow – will feed into the new projections. And some more good news… The models used in UKCP09, even the high-resolution regional models, ran at a resolution of 25km, downscaled using a weather generator to 5km, but with no climate change information below the 25km scale. Within a 25km grid box the rainfall is averaged, which doesn’t adequately resolve the movement of air and clouds, adding a layer of uncertainty – computers aren’t big or fast enough to do a proper job of resolving individual cloud systems. But Hayley, and colleagues at the Met Office, have been running higher resolution climate models, similar to weather forecasting models, at something like a 1.5km grid size. Doing that with climate data and projecting over the long term does seem to resolve the convective storms. That’s good in terms of new information. The changes look quite substantial: summer precipitation intensities are expected to increase by 30-40% for short-duration heavy events. That’s significantly higher than UKCP09, but there are limitations and caveats here too… So far the simulations cover the South East of England only and have been around 10 years in duration, where we’d really want more like a 100-year run. And there is still poor understanding of the process, and of what makes a thunderstorm – thermodynamic versus circulation changes may conflict. Local thermodynamics are important, but so is circulation, the impact of large outbreaks of warm air from across the continent, and which of those processes makes the difference is far from clear.

So, Hayley has been working on this with the Met Office, and she now has an EU project with colleagues in the Netherlands which is producing interesting initial results. There is a lot still to do but it does look like a larger increase in convection than we’d previously thought. Looking at winter storms we’ve seen an increase over the last few years. Even the UKCP09 models predicted some of this but so far we don’t see a complete change attributable to climate change.

Now, is any of this new? Our working experience and instrumental records tend to only go back 30-40 years, and that’s not long enough to understand climate change. So this is a quick note of historical work which has been taking place looking at Newcastle flooding history. Trawling through the records we see that the Toon Monsoon isn’t unheard of – we’ve had them three or four times in the last century:

  • 16th Sept 1913 – 2.85 inches (72mm) in 90 minutes
  • 22nd June 1941 – 1.97 inches (50mm) in 35 minutes and 3.74 inches (95mm) in 85 minutes
  • 28th June 2012 – 50mm in 90 minutes
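For comparison, it may help to put those historic downpours on a common intensity scale in mm per hour – a quick back-of-envelope calculation from the figures above:

```python
# Convert the historic Newcastle downpours listed above to a common
# intensity (mm/hr) so they can be compared directly with one another.

def intensity_mm_per_hour(rainfall_mm, duration_minutes):
    return round(rainfall_mm * 60 / duration_minutes, 1)

events = [
    ("16 Sept 1913", 72, 90),
    ("22 June 1941", 50, 35),
    ("22 June 1941", 95, 85),
    ("28 June 2012", 50, 90),
]
for date, mm, mins in events:
    print(date, intensity_mm_per_hour(mm, mins), "mm/hr")
```

On this crude measure the 1941 event was actually the most intense of the four, which underlines the speaker’s point that the 2012 storm was not unprecedented.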

So, these look like incidents occurring every 40 years or so in the historic records. That’s very different from the FEH-type models and how they account for fluvial flooding, storms, etc.

In summary then, climate models produce inherently uncertain predictions, and major issues remain with extremes in general and hourly rainfall extremes in particular. The overall picture emerging is of increasing winter rainfall (intensity and frequency), potential for increased (summer) convective rainfall, and in any case there is evidence that climate variability over the last century has included extremes similar to those observed in the last decade.

And the work that Hayley and colleagues are doing is generating some really interesting results, so do watch this space for forthcoming papers etc.

Q&A

Q1) Is that historical data work just on Newcastle?

A1) David has looked at Newcastle and some parts of Scotland. Others are looking at other areas though.

Q2) Last week in London on EU Referendum day saw extreme rainfall – not as major as here in 2012 – but that caused major impacts in terms of travel, moving of polling station etc. So what else is taking place in terms of work to understand these events and impacts?

A2) OK, so impacts-wise that’s a bit different. And a point of clarification – the “Toon Monsoon” wasn’t really a monsoon (it just rhymes with Toon). Now the rainfall in London and Brighton being reported looked to be 40mm in an hour, which would be similar to or greater than Newcastle’s, so I wouldn’t downplay those events. The impact of these events on cities particularly is significant. In the same way that we’ve seen an increase in fluvial flooding in the last ten years, maybe we are also seeing an increase in these more intense, shorter-duration events. London is certainly very vulnerable – especially with underground systems. Newcastle Central here was closed because of water ingress at the front – that probably wouldn’t happen now as modifications have been made – and metro lines were shut. Even the flooding event in Paris a few weeks back most severely impacted the underground rail/metro, roads and even the Louvre. I do worry that city planners have built in vulnerability to just this sort of event.

Q3) I work in flood risk management for Dumfries and Galloway – we were one of the areas experiencing very high rainfall. We rely heavily on models, rainfall predictions etc. But we had an event on 26th/27th January that wasn’t predicted at all – traffic was washed off the road, instrument peak records were broken, evacuations were planned. SEPA and the Met Office are looking at this, but there is a gap here in handling this type of extreme rainfall on saturated ground.

A3) I’m not aware of that event – more so of the flooding on 26th December, which caused flooding here in Newcastle and more widely. But that event does sound like the issue for the whole of that month across the whole country. It wasn’t just extreme amounts of daily rainfall, it was the fact that the previous month had also been very wet. That combination of several months of heavy rainfall, followed by extreme (if not record breaking on their own) events really is an issue – it’s the soul of hydrology. And that really hasn’t been recognised to date. The storm event statistics tend to be the focus, rather than storms plus the antecedent conditions. But this comes down to flood managers having their own rules to deal with this. In the North East this issue has arisen with the River Tyne, where the potential for repurposing rivers for flood water retention has been looked at – but you need 30-day predictions to be able to do that. And an extreme event following a long period of rain really changes that and poses challenges.

Comment (Andy, Environment Agency): Just to note that the EA, DEFRA and Wales have a programme to look at how we extend FEH, but also looking at palaeo-geomorphology to extend that work. And there are some interesting results already.

Phil Younge, Environment Agency – The Future of Flood Risk Management

My role is as Yorkshire Major Incident Recovery Manager, and that involves three things: repairing damage; investing in at-risk communities; and engaging with those communities. I was brought in to do this because of another extreme weather event, and I’ll be talking about the sorts of things we need to do to address these types of challenges.

So, quickly, a bit of background on the Environment Agency. We are the national flood risk agency for England. And we have a broad remit including the risk implications of, e.g., nuclear power stations, management of catchment areas, work with other flood risk agencies, etc. We directly look after 7,100 km of river, coastal and tidal raised defences – 22,600 defences in all, with assets worth over £20 billion. There are lots of interventions we can make to reduce the risk to communities. But how do we engage with communities to make them more resilient to whatever the weather may throw at them? Pause on that thought and I’ll return to it shortly.

So I want to briefly talk about the winter storms of 2015-16. The Foss Barrier in York is what is shown in this image – and what happened there made national news in terms of the impact on central York. The water levels were unprecedentedly high. And this event was felt across the North of England, with record river levels across the region – we are talking probably 1 metre higher than we had experienced before, since records began. So the “what if” scenarios are really being triggered here. Some of the defences built as a result of events in 2009 were significantly overtopped, so we have to rethink what we plan for in the future. So we had record rainfall, and 14 catchments experienced their highest ever river flow. But the investment we had put in made a difference: we protected over 20,000 properties during storms Desmond and Eva – even though some of those defences had been overtopped. We saw 15k households and 2,600 businesses flooded, with 150 communities visited by flood support officers. We issued 92 flood warnings – and we only do that when there is a genuine risk of loss of life. We had military support, temporary barriers in place, etc. for this event, but the levels were truly unprecedented.

Significant damage was done to our flood defences across the North of England. In parts of Cumbria the speed and impact of the water – the force and energy of it – did huge damage to roads and buildings. We have undertaken substantial work to repair those assets to the condition they were in before the rain. We are spending around £24 million to do that, and to do it at speed, for October 2016.

But what do we do about this? Within UK PLC how do we forecast and manage the impact and consequence of flooding across the country? Following the flooding in Cumbria, Oliver Letwin set up the Flood Risk Resilience Review, to build upon the plans the Environment Agency and the Government already have, and to look at what must be done differently to support communities across the whole of England. The Review has been working hard over the last four months, and there are four strands I want to share:

  • Modelling extreme weather and stress testing resilience to flood risk – What do we plan for? What is a realistic and scalable scenario to plan for? Looking back at that Yorkshire flooding, how does that compare to existing understanding of risk. Reflecting on likely extreme consequences as a yardstick for extreme scenarios.
  • Assessing the resilience of critical local infrastructure – How do we ensure that businesses still run, that we can function as a community? For instance in Leeds on Boxing Day our telecommunications were impacted by flooding. So how can we address that? How do we ensure water supply and treatment is fit for these events? How can we ensure our hospitals and health provision are appropriate? How can we ensure our transport infrastructure is up and running? As an aside, the Leeds Boxing Day floods happened on a non-working day – Leeds rail station is the second busiest outside London, so if that had happened on a working day the impact could have been quite different, much more severe.
  • Temporary defences – how can we move things around the country to reduce risk as needed, things like barriers and pumps. How do we move those? How do we assess when they are needed? How do we ensure we had the experience and skills to use those temporary defences? A review by the military has been wrapped into this Resilience Review.
  • Flood risk in core cities – London is being used as a benchmark, but we are also looking at cities like Leeds and how we invest to keep these core key cities operating at times of heightened flood risk.

So, we are looking at these areas, but also how we can help our communities to be more resilient. The Environment Agency are looking at community engagement and that’s particularly part of what we are here to do, to develop and work with the wider FCERM community.

We do have an investment programme for 2015-2021 which includes substantial capital investment. We are investing significantly in the North of England (e.g. £54 per person for everyone in Yorkshire, Lancashire, and Cumbria, and also the East Midlands and Northumbria). And that long planning window is letting us be strategic, to invest based on evidence of need. And in the Budget 2016 there was an additional £700 million for flood risk management to better protect 4,745 homes and 1,700 businesses. There will also be specific injections of investment in places like Leeds, York, Carlisle etc. to ensure we can cope with incidents like we had last year.

One thing that really came out of last year was the issue of pace. As a community we are used to thinking slowly before acting, but there is a lot of pressure from communities and from Government to act fast, to get programmes of work underway within 12 months of flooding incidents. Is that fast? Not if you live in an affected area, but it’s faster than we may be used to. That’s where the wealth of knowledge and experience needs to be available to make the right decisions quickly. We have to work together to do this.

And we need to look at innovation… So we have created “Mr Nosy”, a camera to put down culverts(?) and inspect them. We used to (and still do) have teams of people with breathing apparatus etc. to do this, but now we can send Mr Nosy down so that a team of two can inspect quickly. That saves time and money, and we need more innovations that allow us to do this.

The Pitt Review (2008) looked at climate change and future flood and coastal risk management, and discussed the challenges. There are many techniques to better defend a community, and we need the right blend of approaches: “flood risk cannot be managed by building ever bigger ‘hard’ defences”; natural measures are sustainable; there are multiple benefits for people, properties and wildlife; a multi-agency approach is the way forward. Community engagement is also crucial – informing the community so they understand the scale of the risk, and how to live with risk in a positive way. So, this community enables us to work with research, we need that community engagement, and we need efficiency – that big government investment needs to be well spent; we need to work quickly, and to shortcut to answers quickly, but those have to be the right answers. And this community is well placed to help ensure that we are doing the right things, so that we can assure communities, and assure the government, that we are doing the right things.

Q&A

Q1) When is that Review due to report?

A1) Currently scheduled for middle of July, but thereabouts.

Q2) You mentioned the dredging of watercourses… On the back of major floods we seem to have dredging, then more six weeks later. For the public there is a perception that dredging will reduce flood risk, which is really the wrong message. And there are places that will continue to flood – maybe we have to move coastal towns back? You can’t just keep building walls that are bigger and bigger.

A2) Dredging depends so much on the circumstances. In Calderdale we are making a model so that people can understand what impact different measures have. Dredging helps, but it isn’t the only thing. We have complex hydro-dynamic models, but how do we simply communicate how water levels are influenced, and the ways we influence the river channel? Getting that message across will help us make changes with community understanding. In terms of adaptation I think you are spot on. Some communities will probably adapt because of that, but we can’t just build higher and higher walls. I am keen that flood risk is part of the vision for a community, and how that can be managed. Historically in the North East cities turned their backs on the river; as water quality has improved that has changed, which is great but brings its own challenges.

Q3) You mentioned a model, is that a physical model?

A3) Yes, a physical model to communicate that. We do go out and dredge where it is useful, but in many cases it is not, which means we have to explain that when communities think it is the answer to flooding. Physical models are useful, apps are good… But how do we get across some of the challenges we face in river engineering?

Q4) You talked about community engagement but can you say more about what type of engagement that is?

A4) We go out into the communities, listen to the experiences and concerns, gathering evidence, understanding what that flooding means for them. Working with the local authorities, those areas are now producing plans. So we had an event in Calderdale marking six months since the flood, discussing plans etc. But we won’t please all of the people all of the time, so we need engagement across the community. And we need that pace – which means bringing the community along, listening to them, bringing them into our plans… That is challenging but it is the right thing to do. At the end of the day they are the people living there, who need to be reassured about how we manage risk and deliver appropriate solutions.

The next section of the day looks at: Research into Practice – Lessons from Industry:

David Wilkes – Global Flood Resilience, Arup – Engineering Future Cities, Blue-Green Infrastructure

This is a bit of an amalgam of some of the work from the Blue-Green Cities EPSRC programme, which I was on the advisory board of, and some of our own work at Arup.

Right now 50% of the global population live in cities – over 3.2 billion people. As we look forward to the middle of this century (2050), we expect growth such that around 70% of the world population will live in cities – some 6.3 billion people.

We were asked a while ago to give evidence to the Third Inquiry of the All Party Parliamentary Group for Excellence in the Built Environment into flood mitigation and resilience, and we wanted to give some clear recommendations: (1) spatial planning is the key to long-term resilience; (2) implement a programme of improved surface water flood hazard mapping; (3) nurture capacity within the professional community to ensure quality work in understanding flood risk takes place, including providing career paths as part of that nurturing.

We were called into New York to give some support after Hurricane Sandy. They didn’t want a major reaction, a big change; instead they wanted a bottom-up, resilient approach, cross-cutting areas including transportation, energy, land use, insurance and infrastructure finance. We proposed an iterative cycle around: redundancy; flexibility; safe failure; rapid rebound; constant learning. This is a quantum shift from our approach over the last 100 years, in which learning becomes a crucial part of the process.

So, what is a Blue-Green city? Well if we look at the January 2014 rainfall anomaly map we see the shift from average annual rainfall. We saw huge flooding scarily close to London at that time, across the South East of England. Looking at the December 2015 we see that rainfall anomaly map again showing huge shift from the average, again heavily in the South East, but also South West and North of England. So, what do we do about that? Dredging may be part of this… But we need to be building with flood risk in mind, thinking laterally about what we do. And this is where the Blue-Green city idea comes in. There are many levels to this: Understand water cycle at catchment scale; Align with other drivers and development needs; identify partners, people who might help you achieve things, and what their priorities are; build a shared case for investment and action; check how it is working and learn from experience.

Looking, for instance, at Hull we see a city long challenged by flooding. It is a low-lying city, so to understand what could be done to reduce risk we needed to take a multi-faceted view across the long term: looking at frequency/likelihood of risk, understanding what is possible, and looking at how changes and developments can also feed into local development. We have a few approaches available… There is the urban model, of drainage from concrete into underground drainage – the Blue – and the green model of absorbing surface water and managing it through green interventions.

In the Blue-Green Cities research approach you need to work with city authority and community communications; you need to model existing flood risk management; you need to understand citizens’ behaviour; and you need to make a really clear business case for interventions. And as part of that process you need to overcome barriers to innovation – things like community expectations and changes, hazards, etc. In Newcastle, which volunteered to be a Blue-Green city research area, we formed the Newcastle Learning and Action Alliance to build a common shared understanding of what would be effective, acceptable, and practical. We really needed to understand citizens’ behaviours – local people are the local experts and you need to tap into that and respect that. People really value Blue-Green assets, but only if they understand how they work and the difference that they make. And indeed people offered to maintain Blue-Green assets – to remove litter etc. – but again, only if they value and understand their purpose. And the community really needs to feel a sense of ownership to make Blue-Green solutions work.

It is also really important to have modelling to support your business case. Options include hard and soft defences. The Brunton Park flood alleviation scheme included landscape proposals, which provided a really clear business case. Ofwat wanted investment from the energy sector; they knew the costs of conventional sewerage, and actually this alternative approach is good value, and significantly cheaper – as both sewer and flood solution – than the previous siloed approach. There are also Grey-Green options – landscaping to store water for quite straightforward purposes, or more imaginative ones, and the water can be used for irrigation, wash-down, etc. Again, building the business case is absolutely essential.

In the Blue-Green Cities research we were able to quantify direct and indirect costs to different stakeholders – primary industry, manufacturing, petroleum and chemical, utilities, construction, wholesale and retail, transport, hotels and restaurants, information and communication, financial and professional, and other services. When you can quantify those costs you really have a strong case for the importance of interventions that reduce risk, that manage water appropriately. That matters whether you are spending taxpayers’ money or convincing commercial partners to contribute to costs.
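To make the shape of that quantification concrete – and this is my own illustrative sketch, not the project’s actual method or data – summing direct and indirect costs per sector might look something like:

```python
# Hypothetical sketch (not Blue-Green Cities project data) of totting up
# direct and indirect flood costs across stakeholder sectors to support
# a business case comparison.

flood_costs = {  # sector: (direct cost in £m, indirect cost in £m)
    "manufacturing": (4.2, 1.1),
    "wholesale and retail": (2.8, 0.9),
    "transport": (1.5, 2.4),
}

def total_cost(costs: dict[str, tuple[float, float]]) -> float:
    """Sum direct and indirect costs across all sectors."""
    return sum(direct + indirect for direct, indirect in costs.values())

print(f"Total quantified cost: £{total_cost(flood_costs):.1f}m")
```

The point is simply that once each sector’s figure exists, the aggregate gives you a single number to set against the cost of an intervention.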

The Commission of Inquiry into flood resilience of the future, “Living with Water” (2015), from the All Party Group for Excellence in the Built Environment, House of Commons, talks about how “what is required is a fundamental change in how we view flood management…”

Q&A

Q1) I wanted to ask about how much green we would have to restore to make a difference? And I wanted to ask about the idea of local communities as the experts in their area but that can be problematic…

A1) I wouldn’t want to put a figure on the green space – you need to push the boundaries to make a real difference. But even small interventions can be significant. If the Blue-Green asset interrupts the flood path, that can be hugely significant. In terms of the costs of maintaining Blue-Green assets, well… I have a number of pet projects and ideas, and I think that things like parks and city bike ways, and having a flood overflow that also encourages the community to use it, will clearly be costlier than flat tarmac. But you can get Sustrans, local businesses, etc. to support that infrastructure and, if you get it right, that supports a better community. Softer, greener interventions require more maintenance, but they can give back to the community all year round, and there are ways to do that. You made another point about local people being the experts. Local people do know about their own locality. Arguably, as seasoned professionals, we also know quite a bit. The key thing is not to be patronising, not to pretend you haven’t listened, but to build consensus, to avoid head-to-head dispute, to work with them.

Stephen Garvin, Director Global Resilience Centre, BRE – Adapting to change – multiple events and FRM

I will be talking about the built environment, and adaptations of buildings for flood resilience. I think this afternoon’s workshops can develop some of these areas a bit. I thought it would be good to reflect on recent flooding, and the difficulty of addressing these situations. The nature of flooding can vary so greatly in terms of the type and nature of floods. For instance the 2007 floods were very different from the 2012 flooding and from the 2014 floods in terms of areas affected, the nature of the flood, etc. And then we saw the 2015/16 storms – the first time that every area at risk of flooding in Scotland and the North of the UK flooded; usually not all areas are hit at once.

In terms of the impact, water damage is a major factor. So, for instance, in Cumbria in 2015 we had record rainfall, over-topped defences, on the Rivers Eden and Petteril, and water depths of 1.5m in some properties in Carlisle. That depth of flooding was very striking. A lot of terraced properties, with underfloor voids, were affected in Carlisle. And water was coming in from every direction. We can’t always keep water from coming in, so in some ways the challenge is getting water out of the properties. How do we deal with it? Some of these properties had had flood resilience measures before – such as raising the height of electrical sockets – but they were not necessarily high enough or useful enough in light of the high water. And properties change hands, are rented to new tenants, extensions are added – the awareness isn’t consistently there and some changes increase vulnerability to flooding.

For instance, one property had had flood prevention measures put in place after the less severe 2005 floods – door surrounds, airbrick covers – and despite those measures water inundated the property. Why? Well, a conservatory had been added which, despite best efforts to seal it, let in a great deal of water. They had also added an outdoor socket for a tumble dryer a few feet off the ground. So we have to think about these measures – are they appropriate? Do they manage the risk sufficiently? How do we handle the flood memory? You can have a flood resilient kitchen installed, but what happens when it is replaced?

There are two approaches really: flood resilience essentially allows the water to come in, but the building and its materials are able to recover from flooding; by comparison, flood resistance is about keeping water out – dry-proof materials etc. And there are two dimensions here, as we have to have a technical approach in design, construction, flood defences, and sustainable approaches to drainage; and non-technical approaches – policy, regulation, decision making and engagement, etc. There are challenges here – construction firms are actually very small companies on the whole; more than 5 people is a big company. And we see insurers who are good at swinging into action after floods, but they do not always consider resilience or resistance that will have a long term impact, so we are working to encourage that approach, that idea of not replacing like for like but replacing with better, more flood resilient or resistant options. For instance there are solutions for apertures that are designed to keep water out to high depths – strong PVC doors, reinforced, and multi-point lockable for instance. In Germany, in Hamburg, they have doors like this (though with perforated brickwork several feet higher!). You can also change materials, change designs of e.g. power sockets, service entries, etc.

Professor Eric Nehlsen came up with the idea of cascading flood compartments with adaptive response, starting from adaptation to flooding through dry- and wet-proofing (where we tend to work) through to more ambitious ideas like adaptation by floating and amphibious housing… Some US coastal communities take the approach of raising properties off the ground, or creating floating construction, particularly where hurricanes occur, but that doesn’t feel like the right solution in many cases here… But we have to understand and consider alternative approaches.

There are standards for flood repair – supported by BRE and industry – and there are six standards that fit into this area, which outline approaches to flood risk assessment, planning for FRR, property surveys, design and specification of flood resilient repair, construction work, and maintenance and operation (some require maintenance over time). I’m going to use those standards for an FRR demonstration. We have offices in Watford in a Victorian terrace, a 30 square metre space where we can test cases – we have done this for energy efficiency before, and have now done it for flooding. This gives us a space to show what can be achieved, what interventions can be made, to help insurers, construction, and policy makers see the possibilities. The age of the building means it is a simple construction – concrete floor and brick walls – so nothing fancy here. You can imagine some tests of materials, but there are no standards for construction products for repair and new builds for flood resistance and resilience. It is still challenging to drive adoption though – essentially we have to disrupt normal business and practice to see that change to resistant or resilient building materials.

Q&A

Q1) One of the challenges for construction is that insurance issue of replacing “like for like”…

A1) It is a major challenge. Insurance is renewed every year, and often online rather than through brokers. We are seeing some insurers introducing resilience and resistance, but not wide-scale yet. Flood resilience grants through DCLG for Local Authorities and individuals are helpful, but there is no guarantee of that continuing. And otherwise we need to make the case to the property owner, but that raises issues of affordability, cost, accessibility. So, a good question really.

Jaap Flikweert – Flood and Coastal Management Advisor, Royal HaskoningDHV – Resilience and adaptation: coastal management for the future

I’m going to give a practitioner’s perspective on ways of responding to climate change. I will particularly talk about adaptation, which tends to be across three different areas/meanings: Protection (reduce likelihood); Resilience (reduce consequence); and Adaptation, which I’m going to bluntly call “Relocation” (move receptors away). And I’ll talk about inland flooding, coastal flooding and coastal erosion… Though I tend not to talk as much about coastal erosion, because if we focus only on risk we can miss the opportunities. But I will be talking about risk – and I’ll be highlighting some areas for research as I talk.

So, how do we do our planning – our city planning – to manage the risk? I think the UK, England and Wales especially, is in the lead here in terms of Shoreline Management Plans: they take a long term and broad scale view; there is a policy for coastal defence (HtL (Hold the Line)/MR (Managed Realignment)/NAI (No Active Intervention)); and there is strong interaction with other sectors. Scotland is making progress here too. But there is a challenge in being flexible, in thinking about the process of change.

Setting future plans can be challenging – there is a great deal of uncertainty in terms of climate change, in terms of finances. We used to talk about a precautionary approach but I think we need to talk about “Managed-adaptive” approaches with decision pathways. For instance The Thames Barrier is an example of this sort of approach. This isn’t necessarily new work, there is a lot of good research to date about how to do this but it’s much more about mainstreaming that understanding and approach.

When we think about protection we need to think about how we sustain defences in a future with climate change. We will see loading increase (but the extent is uncertain); value at risk will increase (but the extent is uncertain); and we will see coastal squeeze and longshore impacts. We will see our beaches disappear – with both environmental and flood risk implications. An example from the Netherlands shows HtL is feasible and affordable up to about 6m of sea level rise, with sandy solutions (which also deal with coastal squeeze), and radical innovation is of vital importance.

We can’t translate that to the UK, it is a different context, but we need to see this as inspirational. In the UK we won’t hold the line for ever… So how do we deal with that? We can think about the structures, and I think there is a research opportunity here about how we justify buying time for adaptation, how we design for short life (~20 years), and how we develop adaptable solutions. We can’t Hold the Line forever, but some communities are not ready for that change, so we have to work on what we can achieve and how.

In terms of Resilience we need to think about coastal flooding – in principle not different from inland flooding, design to minimise impact, but in practice that is more difficult, with lower chance/higher consequence raising challenges of less awareness, and more catastrophic outcomes if it does happen. New Orleans would be a pertinent example here. And when we see raised buildings – as Stephen mentioned – those aren’t always suitable for the UK; they change how a community looks, which may not be acceptable… Coastal erosion raises its own challenges too.

When we think of Adaptation/Relocation we have to acknowledge that protection is always technically possible, but what if it is unaffordable or unsustainable? For example, a disaster in Grantham, Queensland saw a major event in January 2011 lead to protective measures, but also the whole community moving inland by December 2011. There wasn’t a delay on funding etc. as this was an emergency; it forced the issue. But how can we do that in a planned way? We have Coastal Change Pathfinders. This approach is very valuable, including actual relocation, awareness, engagement lessons, and policy innovation. But the approach is very difficult to mainstream because of funding, awareness, planning policy, and local authority capacity. And here too I see research opportunities around making the business case for adaptation/relocation.

To take an example here that a colleague is working on: Fairbourne, Gwynedd, on the west coast of Wales, is a small community, with a few buildings from the 1890s, which has grown to 400 properties and over 800 people. Coastal defences were improved in 1981, and again in 2012. But this is a community which really shouldn’t be in that location in the long term – they are in the middle of flood plains. The Parish Council have adopted an SMP policy which has goals across different timescales: in the short term, Hold the Line; medium term, Managed Realignment; and long term, No Active Intervention. There is a need to plan now for how we move from one position to another… So it isn’t dissemination that is needed here, it is true communication and engagement with the community, and identifying who that community is to ensure that is effective.

So, in closing, I think there is research needed around: design for short life; consultation and engagement – what useful work has been done, lessons learned, moving from informing to involving to ownership, defining what a community is; and making the business case for supporting adaptation/relocation – investment in temporary protection to buy time, investment in increasing communities’ adaptive capacity, and the value of being prepared vs unprepared in terms of damage (to the nation) such as lack of mobility, employability, and burden on health and social services. And I’d like to close with the question: should we consider relocation for some inland areas at risk of flooding?

Q&A

Q1) That closing question… I was driving to a settlement in our area which has major flood risk, and is frequently cut off by snow in the winter. There are few jobs there, it is not strategically key, although it has a heritage value perhaps. We could be throwing good money after bad to protect a small settlement like that which makes minimal contribution. So I would agree that we should look at relocation of some inland properties. Also, kudos to the parish council of Fairbourne for adopting that plan. We face real challenges, as politicians are elected on 5 year terms, and persuading them that they need to get communities to understand the long term risks and impacts is really challenging.

A1) I think no-one would claim that Fairbourne was an easy process. The Council adopted the SMP, but who goes to parish meetings? But BBC Wales picked it up – they rather misreported the timelines, but that raised interest hugely. It’s worth noting that a big part of Gwynedd and mid Wales faces these challenges. Understanding what we preserve, where investment goes… How do we live with the idea of people living below sea level? The Dutch manage that, but in a very different way, and it’s the full nation who are on board – very different in the UK.

Q2) What about adopting Dutch models for managing risk here?

A2) We’ve been looking at various ways that we can learn from Dutch approaches, and how that compares and translates to a UK context.

And now, in a change to plans, we are rejuggling the event to do some reflection on the network – led by Prof. Garry Pender – before lunch. We’ll return with 2 minute presentations after that. Garry is keen that all attending complete the event feedback forms on the network, the role of the network, resources and channels such as the website, webinars, events, etc. I am sure FCERM.net would also welcome comments and feedback by email from those from this community who are not able to attend today. 

Sharing Best Practice – Just 2-minutes – Mini presentations from delegates sharing output, experience and best practice

 

I wasn’t able to take many notes from this session, as I was presenting a 2 minute session from my COBWEB colleague Barry Evans (Aberystwyth University), on our co-design work and research associated with our collaboration with the Tal-y-bont Floodees in Mid-Wales. In fact various requirements to re-schedule the day meant that the afternoon was more interactive but also not really appropriate for real time notation so, from hereon, I’m summarising the day. 

At this point in the day we moved to the Parallel Breakout sessions on Tools for the Future. I am leading Workshop 1 on crowd sourcing so won’t be blogging them, but include their titles here for reference:

  • Workshop 1 – Crowd-Sourcing Data and Citizen Science – An exploration of tools used to source environmental data from the public led by Nicola Osborne CSCS Network with case studies from SEPA. Slides and resources from this session will be available online shortly.
  • Workshop 2 – Multi-event modelling for resilience in urban planning An introduction to tools for simulating multiple storm events with consideration of the impacts on planning in urban environments with case studies from BRE and Scottish Government
  • Workshop 3 – Building Resilient Communities Best-practice guidance on engaging with communities to build resilience, led by Dr Esther Carmen with case studies from the SESAME project

We finished the day with a session on Filling the Gaps– Future Projects:

Breakout time for discussion around future needs and projects

I joined a really interesting Community Engagement breakout session, considering research gaps and challenges. Unsurprisingly much of the discussion centred on what we mean by community and how we might go about identifying and building relationships with communities. In particular there was a focus on engaging with transient communities – thinking particularly about urban and commuter areas where there are frequent changes in the community. 

Final Thoughts from FCERM.net – Prof. Garry Pender 

As the afternoon was running behind Garry closed with thank yous to the speakers and contributors to the day. 

Oct 20 2015
 
Digital Footprint campaign logo

I am involved in organising, and very much looking forward to, two events this week which I think will be of interest to Edinburgh-based readers of this blog. Both are taking place on Thursday and I’ll try to either liveblog or summarise them here.

If you are based at Edinburgh University do consider booking these events or sharing the details with your colleagues or contacts at the University. If you are based further afield you might still be interested in taking a look at these and following up some of the links etc.

Firstly we have the fourth seminar of the new(ish) University of Edinburgh Crowd Sourcing and Citizen Science network:

Citizen Science and the Mass Media

Thursday, 22nd October 2015, 12 – 1.30 pm, Paterson’s Land 1.21, Old Moray House, Holyrood Road, Edinburgh.

“This session will be an opportunity to look at how media and communications can be used to promote a CSCS project and to engage and develop the community around a project.

The kinds of issues that we hope will be covered will include aspects such as understanding the purpose and audience for your project; gaining exposure from a project; communicating these types of projects effectively; engaging the press; expectation management;  practical issues such as timing, use of interviewees and quotes, etc.

We will have two guest presenters, Dave Kilbey from Natural Apptitude Ltd, and Ally Tibbitt from STV, followed by plenty of time for questions and discussion. The session will be chaired by Nicola Osborne (EDINA), drawing on her experience working on the COBWEB project.”

I am really excited about this session as both Dave and Ally have really interesting backgrounds: Dave runs his own app company and has worked on a range of high profile projects, so he has some great insights into what makes a project appealing to the media, what makes the difference to a project’s success, etc.; Ally works at STV and has a background in journalism but also in community engagement, particularly around social and environmental projects. I think the combination will make for an excellent lunchtime session. UoE staff and students can register for the event via Eventbrite, here.

On the same day we have our Principal’s Teaching Award Scheme seminar for the Managing Your Digital Footprints project:

Social media, students and digital footprints (PTAS research findings)

Thursday, 22nd October 2015, 2 – 3.30pm, IAD Resources Room, 7 Bristo Square, George Square, Edinburgh.

“This short information and interactive session will present findings from the PTAS Digital Footprint research http://edin.ac/1d1qY4K

In order to understand how students are curating their digital presence, key findings from two student surveys (1457 responses) as well as data from 16 in-depth interviews with six students will be presented. This unique dataset provides an opportunity for us to critically reflect on the changing internet landscape and take stock of how students are currently using social media; how they are presenting themselves online; and what challenges they face, such as cyberbullying, viewing inappropriate content or whether they have the digital skills to successfully navigate in online spaces.

The session will also introduce the next phase of the Digital Footprint research: social media in a learning & teaching context.  There will be an opportunity to discuss e-professionalism and social media guidelines for inclusion in handbooks/VLEs, as well as other areas.”

I am also really excited about this event, at which Louise Connelly, Sian Bayne, and I will be talking about the early findings from our Managing Your Digital Footprints project, and some of the outputs from the research and campaign (find these at: www.ed.ac.uk/iad/digitalfootprint).

Although this event is open to University staff and students only (register via the Online Bookings system, here), we are disseminating this work through a variety of events, publications, etc. Our recent ECSM 2015 paper is the best overview of the work to date, but expect to see more here in the near future about how we are taking this work forward. Do also get in touch with Louise or me if you have any questions about the project or would be interested in hearing more about it, some of the associated training, or the research findings as they emerge.