Enhanced programme maps

Part of the programme support role for CETIS is to record and monitor the use and development of technologies across the programmes. This information is primarily stored in our PROD database, and I then contextualise the data through blog posts like this one.

We now have an openly available linked data store of our PROD data, which means we can mash up our data with data from other sources (see this post for more information). For example, we have now integrated geo-locations for every institution in the UK into the triple store, allowing us to create enhanced Google Maps. As an illustration, I’ve created maps for both the Design and Delivery programmes. Clicking on the screenshots below will take you to the interactive version of each map, which includes a link to each project’s PROD entry and Design Studio page.

Design Projects

Delivery Projects

More information on how the maps were made is available here.
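For those curious about the mechanics, here is a minimal sketch of the general pattern: query a SPARQL endpoint for each project’s name and its institution’s coordinates, then emit one marker row per project for the map layer. The endpoint URL and the mu: property names are illustrative assumptions, not the actual PROD vocabulary.

```python
# Minimal sketch: pull project names and institutional geo-coordinates
# from a SPARQL endpoint, ready to plot as Google Maps markers.
# The endpoint and the mu: properties are hypothetical placeholders.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "http://example.org/prod/sparql"  # hypothetical endpoint URL

QUERY = """
PREFIX doap: <http://usefulinc.com/ns/doap#>
PREFIX geo:  <http://www.w3.org/2003/01/geo/wgs84_pos#>
PREFIX mu:   <http://example.org/prod/ns#>

SELECT ?name ?lat ?long WHERE {
  ?project a doap:Project ;
           doap:name ?name ;
           mu:institution ?inst .
  ?inst geo:lat ?lat ;
        geo:long ?long .
}
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(QUERY)
sparql.setReturnFormat(JSON)
results = sparql.query().convert()

# One marker per project: name plus latitude/longitude.
for row in results["results"]["bindings"]:
    print(row["name"]["value"], row["lat"]["value"], row["long"]["value"])
```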

In conversation with Dawn Wood – an organic approach to evaluation

Just before Christmas I spoke with Dawn Wood from the PC3 (Personalised Curriculum through Coaching) project. Dawn gave me an update on what is happening with the project, but we also talked about how their evaluation has moved away from the traditional research approach.

PC3 has been exploring ways in which the principles of development coaching can be used to give students a more personalised experience throughout their university course. The project’s initial focus was at a course level, looking at how to enable students to design a personalised course by developing a deeper understanding of their learning needs via coaching. This did not prove as fruitful as hoped, so the project was refocused on embedding coaching directly within existing modules and courses and enabling students to develop self and peer coaching. This is used to help students focus on their immediate and long-term development needs.

As the PC3 Research Officer, Dawn undertakes the day-to-day administration of the project as well as working with the team to develop, implement and disseminate the project’s research and evaluation activities. At the moment, Dawn’s main focus is on providing coaching to students and running workshops on coaching practice for staff and students. She also spends a lot of time networking and having conversations about what coaching can provide. As the project moves into the final stages, Dawn tells me she’ll be concentrating more on data analysis and writing the final reports!

How would you describe the approach to evaluation that the PC3 project adopted initially?

The project started by focusing on the PLC (Personalised Learning through Coaching) module as a means for students to identify the direction they wanted their studies to take. As part of planning the evaluation of the PLC module, detailed ethics documentation was prepared. This included an outline of the demographics of the target population, how data would be captured and revisited over the remaining three years of the project, details of how consent would be sought, and so forth. Overall, it was a very traditional research approach with a clear structure for specific data capture points.

At that time, the students would only know me as a relatively distant ‘researcher’, even though I was involved in writing some of the module materials.

What happened to change that approach?

Mainly, it came down to structural changes within the institution, such that some of the pilot cohorts we had identified would no longer exist.

Another strong influence was the lack of success with the original module. Twelve individuals had agreed to take part, but the data we were able to collect was very scattered, with only two complete sets. There was a real lack of engagement with the evaluation process. Essentially the 12 students were committed to enhancing their own learning via the PC3 coaching process, but were not so committed to the underlying research evaluation process. As we had originally planned to encourage uptake to the PC3 process using the evidence gathered from this first pilot, this compounded the problems created by the changes in the institution.

So this was the impetus to move to a more organic, flexible research approach?

Absolutely. We changed from being primarily researcher-led to asking students ‘What can you commit to for the purposes of evaluating the impact of PC3?’ We negotiate this with students and agree the data capture method they will use, such as written or video reflections. The details are then added in an appendix to our now considerably thinner ethics documentation!

We have also been more flexible and creative in terms of the evidence that we are using. For example, we repurpose assessed reflections as a source of evidence. This has meant we aren’t always able to design our own instruments, and we also have to engage our stakeholders in the process of identifying possible sources of evidence and negotiating with them what they are prepared to share. However, this kind of approach has meant we are much more aware of the resources available to us, we are better placed to pick up on the unexpected, and we identify issues sooner.

How would you describe this revised approach and what does it mean for your own role?

We now have an approach that is based in action research. As part of that, I am more involved in the coaching process, rather than just looking in on it. Having direct experience of coaching students has meant that I have been able to identify issues that had not emerged from other sources. For example, it became clear there was a conflict between what tutors wanted to achieve in tutorials and what was possible within the coaching process. This had not come out in our interviews with staff, but we are now looking at it more deeply and working on embedding the coaching process more effectively.

I feel this approach gives me a greater understanding of the student perspective. We have also worked with tutors to capture their views on how successful they have found the coaching process. Findings from interviews with one tutor team have been collated, anonymised and fed back to the team to help further improve how coaching works within their course.

Can you tell us some more about how students are more directly involved in designing the evaluation process, and the implications of this?

Yes, we now also have a group of student coaching ambassadors. Drawing on their experiences of coaching, the student ambassadors are looking at how best to promote the benefits of coaching to other students and staff. They have recently completed the same training the project provides for staff, and are in the process of developing a plan for promotional activities.

As for the implications, with the student ambassadors we had to balance our need to document their activities with our desire to hand over genuine responsibility for the project to them. After all, if this was to be their project, how could we impose a particular evidence-gathering approach on them? We resolved this by engaging them in the process of designing the evaluation plan for the student ambassador project and asking them to propose ways in which they could provide the evidence we need. We hope that this will lead to greater buy-in to the process.

Finally, are there any words of wisdom you’d like to pass on to other projects undertaking their own evaluations?

We’ve learned along the way, and part of that has been a realisation that a broader vision is sometimes better. Be prepared to change your approach and willing to look at alternatives. The traditional approach to research may not provide the evidence you had hoped for, so keep alert to opportunities.

The team are currently preparing additional coaching materials that will be made available on the Design Studio. In the meantime, there are various coaching resources on the PC3 blog, including videos demonstrating coaching conversations and coaching workshop resources.


Feature article on Dynamic Learning Maps project

The latest instalment of Lou McGill’s series of posts highlighting the work of the Design and Delivery programmes takes an in-depth look at the Dynamic Learning Maps project from the University of Newcastle. DLM was funded through the Curriculum Delivery Programme; its focus was on creating dynamic visualisations of the medical curriculum.

“The DLM project aimed to make visible the complex medical ‘spiral curriculum’ where topics are revisited with increasing depth over a 5 year programme, and to reveal connections across curricula to support modular learning. High level goals included a desire to promote lifelong learning, enhance employability and enable personalisation for students. The diverse nature of stakeholders with unique views of the curricula increased the complexity of the project and led to some interesting paths and choices as the team developed the maps. Simply getting agreement on what constitutes a curriculum map proved challenging and required a highly flexible approach and agile development. Like many technical development projects DLM had to balance the need to develop a common understanding with the need to reflect different stakeholder requirements. Agreeing on what elements to include and the level of detail was important as well as the fundamental issues around how they might be used by both staff and students.”

You can read the full article on the JISC CETIS Other Voices blog.

Tracks in the snow: finding and making sense of the evidence for institutional transformation

The title ‘Tracks in the snow’ was chosen to fit with the ALT-C 2011 conference theme (‘Thriving in a colder, more challenging climate’).

The kinds of evaluation we often do to show the impact of our work have a lot of parallels with following tracks in the snow: we are trying to identify a specific footprint relating to our activity and looking back at the impression it has made. A very focused evaluation plan might serve us very well for small-scale interventions, however …

… in a long-term project aimed at achieving major institutional change, things become a lot more complicated because other people start trampling their wellies all over the same space!

This paper stems from work with a range of institutions, including some JISC activities (particularly the four-year Curriculum Design projects) and a review of Enhancing Learning Through Technology in Wales. Almost without exception, the people who were working to enhance or transform learning were doing so against a backdrop of significant structural change that made it all the more difficult to know which factors were having the most impact.

Despite the external pressures, our institutions still have a core mission and are working towards a vision for where they want to be. The extent to which this vision is as dependent on external stimuli as you might think is something we’ll come back to later. 

I would argue that in this kind of situation it is fairly pointless to try to separate out all of the tracks on this journey. What we are really interested in is whether the institution as a whole has moved closer to its goals. We often talk about significant enhancement projects as change projects and we need to think about what it is we are actually trying to change.

I think what we are trying to change is institutional culture and I define culture as ‘the way things are done around here.’

What this post tries to do is:

• Look at a few models that have influenced the way I think about these issues

• Look at the experiences of some major change projects in the light of these models and

• Make some suggestions for things we could be trying to capture and measure that may bring us a little closer to understanding whether we are actually changing culture.

I don’t claim to have the answers but I hope the suggestions may lead to some interesting discussions on this blog or perhaps form the basis for an online session.

These notes represent a lightning run through models about people and change that I find interesting; there is a list of references at the end of the post.

The image isn’t really to do with people or change at all but Chaos Theory makes a nice backdrop for the other models. The underlying premise here is of course that even in systems where you might expect quite deterministic cause and effect relationships, small variations in initial conditions can cause quite major fluctuations in outcomes. As human behaviour is the ultimate in non-linear systems, let’s not start out with any hope that patterns of change are going to be readily predictable.

The first group are models that concern the adoption of innovation.

The work of Malcolm Gladwell is very well known: the idea that once an innovation reaches a ‘Tipping Point’, continued adoption is self-sustaining. I’ve also been influenced a lot by the work of Albert Angehrn from INSEAD (and also Conner and Patterson’s eight-stage model of the adoption of innovation). Angehrn has created an excellent simulation tool which is used in the JISC infoNet change management training courses. Without giving too much away for those who may be interested in trying it, you very quickly learn that you need to understand both the formal and informal networks that exist in the organisation in order to influence enough people to get a change embedded (to reach that Tipping Point).

The second set of influences is work on social networks. By this I don’t mean Facebook and Twitter but rather how social ties in general operate. Some interesting work on the subject by Christakis and Fowler explains all sorts of social phenomena in terms of how social networks operate – one example is clusters of obesity affecting people who are at some social distance from one another. They suggest there is no simple cause and effect but it is to do with people adopting behaviours that spread to others and individuals making changes that affect other people’s perception of what is the norm.

The final influence is some work, by Gerry Johnson of Manchester University, on how institutions evolve their strategic direction. His diagram of the cultural web is used a lot but is often separated from its original context, which looked at the ways in which strategy, often perceived to be a very logical planning process taking account of external factors, is actually much more heavily influenced by the institutional paradigm.

The paradigm is a generalised set of beliefs about the way the organisation is and how it operates, and it encompasses the politics, the rituals and the myths that make up organisational culture. External facts that don’t fit the paradigm can often be ignored in quite a surprising fashion. Johnson found strategy development was best understood by undertaking cognitive maps of key stakeholders’ views and triangulating these against events. He based his work in the retail industry, but it is interesting to reflect on the extent to which views about a university’s core market and strengths, and its response to the current climate, are based on the institutional paradigm rather than the facts.

The common thread in a lot of these models is that it is the connections between people and the shared beliefs and behaviours that are particularly important. A critical mass of connections can create traits in networks that persist over time whilst individuals come and go. This for me is institutional culture.

Which brings me back to how we undertake change projects using technology. The traditional way used to be: bid for some funding from JISC, think of an acronym for your project (the sillier the better) and get out there and evangelise about what you were doing.

A lot of recent projects have approached change quite differently:

Many of the JISC curriculum design projects have deliberately not branded and marketed their activities; instead they have concentrated very hard on showing how what they are doing aligns with broader institutional goals.

Welsh universities, when they reviewed their progress with technology enhanced learning, felt that ‘branded’ projects simply created new silos within the organization and that real change was effected by plugging into existing networks.

A number of projects in the JISC Flexible Service Delivery programme adopted what they termed a ‘guerilla’ approach to change and

A very successful JISC Curriculum Delivery project at Exeter University, that is already having considerable impact in terms of its work on Students as agents of Change, realised that ‘change happens one conversation at a time’.

Operating within the paradigm, rather than explicitly setting out to challenge it, seems to be achieving considerable success in terms of stakeholder engagement, but it makes it even more difficult for these projects to demonstrate their impact on institutional culture because they are so embedded. However, by looking at the bigger picture and by focusing on the glue rather than the building blocks, they seem to be coming up with a mixture of qualitative and quantitative ways of achieving this.

Finally, a run through some of the techniques that have been effective, with tips for measuring transformational and cultural change.

The first tip is to start from a baseline. You can’t measure the distance travelled unless you have some idea where you started. It was a requirement of funding that the curriculum design projects do this, and many of them were initially very sceptical about the value of trying to take a snapshot of such a moving target. They took various approaches to determining the baseline, and these are summed up in a report by Helen Beetham (see references).

Suffice it to say that the process mattered more than the specific techniques and within a very short space of time the project teams were convinced of the value of this work.

Some projects undertook quantitative analysis. The OU created a set of profiles looking at its courses from a range of perspectives, including financial and pedagogic ones. Bolton did some basic but powerful analytics after finding that many discussions relating to course approval and validation were based on assumptions rather than facts.

Producing statistics about learning outcomes and assessment types helped show where some modules appeared to be over-assessed or where the same learning outcome was assessed multiple times. Quite simply, it changed the conversation. Snapshots of these types of analytics will show changes in practice in particular disciplines over time.
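As a rough illustration of this kind of analysis (a sketch, not Bolton’s actual toolset), the snippet below assumes a flat CSV export with one row per module/learning outcome/assessment mapping; the file name and column names are invented.

```python
# Sketch: flag learning outcomes assessed more than once within a module,
# and modules carrying unusually many assessments. Column names are
# hypothetical; adapt to whatever the institutional export provides.
import pandas as pd

df = pd.read_csv("module_assessment_map.csv")  # hypothetical export

# Count how often each learning outcome is assessed within each module;
# counts above 1 flag outcomes assessed multiple times.
per_outcome = (df.groupby(["module", "learning_outcome"])
                 .size()
                 .rename("times_assessed")
                 .reset_index())
print(per_outcome[per_outcome["times_assessed"] > 1])

# Count distinct assessments per module; unusually high totals may
# indicate over-assessment.
per_module = df.groupby("module")["assessment"].nunique()
print(per_module.sort_values(ascending=False).head(10))
```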

Every bit as important as having some kind of quantitative evidence is capturing what Birmingham City University has termed the ‘lived experience’ of curriculum design.

This is similar to Johnson’s work on cognitive mapping. Regardless of what the facts may actually be, it is people’s perceptions and assumptions that affect day-to-day practice.

BCU has made extensive use of video to capture and analyse stakeholder experiences. Greenwich University has used techniques such as Rich Pictures very effectively to identify certain myths that needed to be exploded before the institution could make progress in changing some of its processes.

Finally, what is needed in terms of looking at culture change is finding ways to look at how the individuals who make up the organisation (and hence the culture) are connected and communicate. The most significant changes tend to happen when you get link-up and collaboration between parts of the organisation that didn’t previously talk to one another.

This idea that it is the connections rather than the entities that really create the structure doesn’t just apply to social networks. You will find that architects looking to innovate also look not so much at the structures as at the linkages, the connections, the spaces between them.

Some of this is actually quite easy to measure in the Web 2.0 connected world. As a very small example, this diagram shows the interconnectedness of people who were tweeting using a particular hashtag at a meeting. It would be interesting to look at a series of snapshots like this over time to track how the network expands or contracts and the frequency with which particular tags are used – in other words, who is talking to whom and what they are talking about. In some institutions there may only be a very small percentage of staff who are actually using tools like this at the moment, and a growth in this type of communication would represent a significant cultural change.
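As a minimal sketch of how such snapshots could be compared (assuming the tweets for a hashtag have already been harvested into author/mentioned-user pairs; the names and dates below are invented for illustration):

```python
# Sketch: build a simple social graph per time period from who-mentioned-whom
# pairs, then compare size and density across snapshots.
import networkx as nx

# Invented example data: (author, mentioned_user) pairs per snapshot.
snapshots = {
    "2011-03": [("alice", "bob"), ("bob", "carol")],
    "2011-09": [("alice", "bob"), ("bob", "carol"),
                ("carol", "dave"), ("dave", "alice"), ("erin", "bob")],
}

for period, mentions in snapshots.items():
    g = nx.Graph()
    g.add_edges_from(mentions)
    # Node count shows how many people are in the conversation at all;
    # density shows how interconnected they are.
    print(period,
          "people:", g.number_of_nodes(),
          "ties:", g.number_of_edges(),
          "density:", round(nx.density(g), 2))
```

The same structure would work for the formal committee networks mentioned below, with committees and organisational units as the nodes.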

In other cases it may be appropriate to track the formal networks in the institution i.e. the committees that exist and the parts of the organisation that are represented on the committees.

Many projects are finding that changes in the language used within the organisation reflect changes in perception and understanding of curriculum design and other processes. This language is manifest in policy and procedural documents and in guidance used for staff development purposes. Again, identifying key phrases and descriptions at the baselining stage and mapping how these change over time can give an important perspective on institutional transformation.
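A very simple sketch of what tracking this language shift might look like, assuming the baseline and current policy documents are available as plain text (the file names and the watch-list of phrases are illustrative):

```python
# Sketch: count occurrences of a watch-list of phrases in a baseline
# document and a later one, and report the shift.
import re
from collections import Counter

WATCH_PHRASES = ["learning outcomes", "employability",
                 "course approval", "curriculum design"]

def phrase_counts(path):
    """Count each watch-list phrase in the given plain-text file."""
    text = open(path, encoding="utf-8").read().lower()
    return Counter({p: len(re.findall(re.escape(p), text))
                    for p in WATCH_PHRASES})

baseline = phrase_counts("policy_baseline.txt")   # hypothetical files
current = phrase_counts("policy_current.txt")

for phrase in WATCH_PHRASES:
    print(f"{phrase}: {baseline[phrase]} -> {current[phrase]}")
```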

Tips for evaluating Culture Change

• Start from a Baseline

• Capture the ‘Lived Experience’

• Look at the Connections rather than the silos

• Find out about how others are doing this via the Design Studio: http://jiscdesignstudio.pbworks.com

References

Angehrn, A., Schönwald, I., Euler, D. and Seufert, S. (2005). Behind EduChallenge: An Overview of Models Underlying the Dynamics of a Simulation on Change Management in Higher Education. SCIL, Universität St Gallen, SCIL Report 7, December 2005. Retrieved 1 Sept 2011 from: http://www.scil.ch/fileadmin/Container/Leistungen/Veroeffentlichungen/2006-01-euler-seufert-behind-educhallenge.pdf

Bartholomew, P. (2010). Technology-Supported Processes for Agile and Responsive Curricula Project Interim Report. Retrieved 1 Sept 2011 from: http://www.jisc.ac.uk/media/documents/programmes/curriculumdesign/tsparcinterimreportoct2010.pdf

Beetham, H. (2009). Synthesis Report: Baselining the Institutional Processes of Curriculum Design. Retrieved 1 Sept 2011 from: http://www.jisc.ac.uk/whatwedo/programmes/elearning/curriculumdesign.aspx

Christakis, N. and Fowler, J. (2010). Connected: The Amazing Power of Social Networks and How They Shape Our Lives. HarperCollins.

Conner, D. R. and Patterson, R. B. (1982). Building Commitment to Organisational Change. Training and Development Journal, 36(4), 18-30.

Gladwell, M. (2000). The Tipping Point: How Little Things Can Make a Big Difference. Boston: Little, Brown.

Johnson, G. (1988). Rethinking Incrementalism. Strategic Management Journal, 9(1), 75-91. Retrieved 1 Sept 2011 from: http://onlinelibrary.wiley.com/doi/10.1002/smj.4250090107/abstract

Leslie, K., Dunne, E., Newcombe, M., Taylor, L. and Potter, D. (2010). Integrative Technologies Project, Final Report. Retrieved 1 Sept 2011 from: http://www.jisc.ac.uk/media/documents/programmes/curriculumdelivery/integrate_FinalReport.pdf

Debating our achievements

One of the activities that took place at the last programme meeting was a debate on the topic:

‘This house believes that this programme will not actually change the pedagogic practice of curriculum design’

The debate took place under the Chatham House Rule but, by popular demand, we have been persuaded to bend the rules slightly and summarise some of the points made for and against the motion.

The debate took a slightly different slant from what might have been expected, concentrating on the value of undertaking these activities as part of a formal programme as well as on whether the changes made to institutional processes did or did not change pedagogy. It was an interesting and heated discussion, and these are just a few of the key points.

Points for the motion:

  • Change only happens when real pressure builds up and this has little to do with either institutional strategy or funded initiatives.
  • The changes so far have been organisational and structural but we are not yet seeing systemic and transformational change that leads through into a different student experience.
  • The ‘engine room’ for curriculum change is not in the formal policy making bodies that exist in the sector.
  • We can’t really tell how many of these changes would have happened anyway and what extra the programme added.
  • Projects that started from a principled pedagogic stance have had more disappointments than those that started by looking at institutional processes.
  • Better forms filled in online are not enough to improve pedagogy in themselves – we still need to address poor course design.
  • We started with some naïve assumptions about institutional change.

Points against the motion:

  • Turmoil presents open doors that we can push against. Everyone now accepts that the status quo cannot go on so the external environment offers fertile ground in which to effect real change.
  • It was a masterstroke that the requirements of the programme obliged us to spend a long time in reviewing the existing baseline. This meant we arrived at solutions people were already signed up to.
  • The timing was exactly right – had we started any later we would have been too far into the problem zone and less ready with solutions.
  • We had a situation where people wanted to change and needed to change and the programme provided the opportunity to change.
  • The programme has impacted 25% of HEIs and about 20% of the overall student body and we are seeing some real cultural shifts such as academics ranking learning design tools in the top five things they wanted to learn about.
  • Changing process isn’t enough but it changes the rules and people have to respond to that.
  • The programme offered more than money – it was a licence to work in ways that weren’t just bottom up. The intervention speeded the process up and amplified the results.
  • Better designed processes can lead to better courses.
  • We may not have had a credible theory of institutional change at the start but what JISC set up was a fantastic living laboratory.

The result:

The motion was defeated by a significant majority although a slight shift in opinion shows that the team in favour of the motion did make their case very well and raised some important points that need to be considered when communicating the work of the programme.

Accreditation! A games based approach to supporting curriculum development

**NB this post has been amended from a post on my CETIS blog**

Earlier this week Rachel Forsyth and Nicola Whitton from the SRC (Supporting Responsive Curricula) project at MMU led a webinar titled “Models of Responsiveness”. The session focused on the ways the team have been working with staff across the institution on the complex internal and external issues and drivers involved in developing “responsive” curricula. The project has done a lot of work in developing a model for measuring responsiveness (see the screenshot below), and more information on this work is available in the Design Studio.

A Model of Course Responsiveness (SRC)

A core part of the SRC project has been developing ways to engage staff (technical, administrative and academic) not only in recognising the need for change, but also in making changes in an appropriate and timely manner. The team also recognised that certain aspects of the course approval process could be quite dry. So, alongside a series of traditional support materials, the team have developed a board game called Accreditation!, designed specifically to increase knowledge of course approval processes and make for a more engaging experience.

Accreditation!

Working in pairs, players have to move through three zones, and are faced with a series of course-approval-related dilemmas. Five “quality” stars are needed for players to move from zone to zone. Although we only had time to look at a couple of the dilemmas during the session, it was clear that they are based on very real experiences and are great discussion starters.

Of course, games don’t appeal to everyone, and Nicola did point out that at a recent conference some players got a bit carried away with the gaming element and just wanted to win. However, I do think this approach has a lot of potential to engage people and start discussions around the many aspects of curriculum design.

The game has been released under a CC licence and is available from the Design Studio; if you want to use it, you can also develop your own dilemmas. The team are keen to get feedback from anyone who has tried it.

A recording of the very engaging presentation (c. 1 hour in duration) is available here.

Overview of Design Bash 11

Last month CETIS held a Design Bash at the University of Oxford, and a number of colleagues from the Design and Delivery programmes were in attendance. The day is an opportunity for those involved in the design process to get together and share tools, experiences, designs and practice. Building on the work emerging from the Design programme, I was particularly keen to encourage people to share their workflows, i.e. how and when they are (or could be) using various tools, whether face-to-face or online.

I’ve written a number of pre- and post-event blog posts about the day, which I’ve combined using the Memolane service. Unfortunately the set-up of this blog doesn’t let me embed the timeline (you can see an example here); however, the screenshot below does give a view of it. You can view the story (and read the blog posts) by following this link.

I’d be interested to hear any thoughts or examples of workflows you may have.

Timeline of Design Bash 11 blog posts

Outputs, deliverables and other stuff

**NB this post has been amended from a post on my CETIS blog**

Sustaining and embedding changes to curriculum design practices and processes was the theme for the Design Programme meeting held last week in Nottingham.

The projects are now in their final year of a four-year funding cycle, and the focus of the activities and discussions was to:

“• Explore how projects can best ensure their activities result in real and sustained changes to curriculum design processes and practices and how to evidence this impact
• Showcase innovative practice from the Curriculum Design programme and explore and discuss how these outputs can assist in transforming curriculum design more widely in other institutions
• Further explore how projects can contribute to the programme level narrative around how institutions are changing the processes and practices relating to curriculum design and the role technology plays within this”

So that, by the end of the two days, projects would (hopefully) be able to:

“• outline a clear approach to sustaining their innovations and changes to the curriculum design practices and processes
• outline benefits realisation proposals for embedding their outputs to support institutional enhancement and realising the benefits of their projects more widely
• have a clearer understanding of the good practice, innovation and findings which have emerged from the programme and how this can enhance their own projects and practice.”

Unsurprisingly, all the projects have been on quite a journey over the past three and a half years. There have been changes to project staff; most projects have had at least one change of Vice-Chancellor and have had to deal with the various re-shuffles of senior management teams which that inevitably brings. For projects concerned with institutional-level change, and indeed for any project tasked with embedding a change in practice, these changes at senior management level have been particularly challenging. Set against the current political climate, we have to give credit to all the projects for managing to navigate their way through particularly choppy waters. But will projects leave a legacy which is actually able to sustain and embed changes to practice?

Paul Bailey and Peter Chatterton led a session on managing change and used a really nice visual metaphor of a snowball to represent the different push, pull and self-momentum forces that projects can often find themselves caught up in. I think it’s fair to say that most projects found, in their discussions and baselining activities, that the “curriculum design” space was ripe for conversations. A number of projects have had to deal with significant pressures of scope creep, and with being seen as the panacea for a whole host of related issues.

Stephen Brown and the projects from one of the programme cluster groups then led a session on sustaining change. This allowed for some very useful discussion around project identity, outputs and deliverables, and how to “hand on” (to use that great catch-all term) the “stuff” projects have produced. Helen Beetham has written up this session on the Programme Blog far more eloquently than I could. From the marketplace activity, where projects were given an opportunity to show off their wares, it is clear that there is a lot of great “stuff” coming out of this programme.

One of the high points of the meeting was the debate, where the quite challenging motion proposed was “This house believes that this programme will not actually change the pedagogic practice of curriculum design”. I won’t go into details on the substance of the debate here; however, one question that I should have raised (but of course didn’t) was: if this programme can’t, then what will? When JISC did fund a programme specifically around changing pedagogic practice (the Design for Learning programme), one of the clear messages that came out was that projects couldn’t make any sustained impact on practice if they weren’t embedded in wider institutional curriculum design processes. Whilst I can see that some projects maybe don’t see themselves as having a direct impact on practice, as they are more focused on the business process end of things, at a programme level I believe there is growing evidence that quite significant impacts are being made overall. I’m not sure if this was planned or just one of those serendipitous coincidences, but I think this post from Martin Weller, published whilst the meeting was in full swing, is a good example of precisely how the programme is changing the pedagogic practice of curriculum design.

Directions of travel

More than just pretty pictures, the outcomes from the ‘directions of travel’ exercise will be used to update the relevant pages on the Design Studio, and will be written up to support the development of high level messages and impact indicators in the final year of the programme. Thanks to everyone who took part in this activity.

Thoughts on sustainability

I thought I’d share some thoughts on the first session at the recent programme meeting in Nottingham. With the session being ably led by Stephen Brown, I was able to gather a number of key points from the discussion, which will be added to the Design Studio pages on sustainability and working for change.

Change can be hampered by institutional location (lack of status, association with specific agendas, etc.) but enhanced by having an ‘agile’ location, i.e. a project team that is ambiguously or multiply located across the institution, with project activities that are taken up at different sites and with different ownership. Having champions in departments and professional services is essential to this agility/hybridity.

Collaboration such as we have through the cluster groups can also help with this issue. People coming from ‘outside’ to talk about change do not carry any particular baggage about status, role or location. Also, the CAMEL-type approach of the clusters allows for ideas to be communicated across institutions at different levels – between managers, systems people, staff developers and so on – again making sure the message is multiply anchored.

In the interests of getting things done it can be helpful to have a clear project identity, perhaps based around a particular location, team members, acronym etc. We have seen evidence of project identities being used to open up particular kinds of conversation, to act as a shorthand for certain kinds of change (perhaps an alternative short-hand to that used by management). The space of a project can be helpfully neutral in terms of institutional baggage. It also carries the cachet of external funding and external interest. However, towards the end of a project’s lifetime it is important that the identity is diffused and subsumed into the areas of the institution where project outcomes are being taken up, or project goals being pursued, or the project’s direction of travel is being supported. The project should not continue to hang around like a ghost or a bad smell, but hand on its values and goals before making a quiet exit.

The main focus of this session was on how this ‘handing on’ is best managed, in other words how innovations initiated by projects are sustained after their funding ends. We were encouraged by Stephen to think about how the outputs of our projects align with the ongoing responsibilities and priorities of different institutional stakeholders. (An aside from me on the distinction between outputs and outcomes: outcomes are inevitably situated in a context i.e. this is what happened here. Outputs may inform change in other contexts.) One concern in this kind of conversation is that ‘outputs’ can be conceived of too narrowly as products, whereas the most valuable outputs can sometimes be the processes that have led to a particular outcome. The processes themselves – baselining, iterative evaluation and feedback, consultations, collaborations, team relationships – can become embedded. Representations of processes – guides, toolkits, process models, workflows – can also be shared to support similar changes in other contexts, or in the same context going forward into the future.

Another question that came up – from the Enable project – is whether projects should always aim to make stakeholders happy. One valuable outcome might be to make stakeholders less happy so they demand changes to processes and resources in the institution. So if outputs need to be good examples of a better world, an outcome may be stakeholder dissatisfaction with the world as it is.

A final question from me: can projects simultaneously make things better (perhaps by picking the most tractable issues within a general problem space that they have defined) and be the conscience for more radical and continuous change going into the future?

Link to JISC’s sustaining and embedding innovations good practice guide