Tracks in the snow: finding and making sense of the evidence for institutional transformation

The title ‘Tracks in the snow’ was designed to fit with the ALT-C 2011 conference theme (Thriving in a colder, more challenging climate).

The kinds of evaluation we often do to show the impact of our work have a lot of parallels with following tracks in the snow. We are trying to identify a specific footprint relating to our activity and looking back at the impression it has made. A very focused evaluation plan might serve us very well for small-scale interventions, however …

… in a long-term project aimed at achieving major institutional change, things become a lot more complicated because other people start trampling their wellies all over the same space!

This post stems from work with a range of institutions, including some JISC activities (particularly the four-year Curriculum Design projects) and a review of Enhancing Learning Through Technology in Wales. Almost without exception, the people who were working to enhance or transform learning were doing so against a backdrop of significant structural change, which makes it all the more difficult to know which factors are having the most impact.

Despite the external pressures, our institutions still have a core mission and are working towards a vision for where they want to be. The extent to which this vision is as dependent on external stimuli as you might think is something we’ll come back to later. 

I would argue that in this kind of situation it is fairly pointless to try to separate out all of the tracks on this journey. What we are really interested in is whether the institution as a whole has moved closer to its goals. We often talk about significant enhancement projects as change projects and we need to think about what it is we are actually trying to change.

I think what we are trying to change is institutional culture and I define culture as ‘the way things are done around here.’

What this post tries to do is:

• Look at a few models that have influenced the way I think about these issues

• Look at the experiences of some major change projects in the light of these models and

• Make some suggestions for things we could be trying to capture and measure that may bring us a little closer to understanding whether we are actually changing culture.

I don’t claim to have the answers but I hope the suggestions may lead to some interesting discussions on this blog or perhaps form the basis for an online session.

These notes are a lightning run through some models of people and change that I find interesting; there is a list of references at the end of the post.

The image isn’t really to do with people or change at all but Chaos Theory makes a nice backdrop for the other models. The underlying premise here is of course that even in systems where you might expect quite deterministic cause and effect relationships, small variations in initial conditions can cause quite major fluctuations in outcomes. As human behaviour is the ultimate in non-linear systems, let’s not start out with any hope that patterns of change are going to be readily predictable.
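For anyone who wants to see how quickly small differences blow up, here is a minimal sketch using the logistic map, a standard textbook example of a chaotic system; the parameter value and starting points are arbitrary choices for the illustration.

```python
# Logistic map: x_{n+1} = r * x_n * (1 - x_n); chaotic for r around 3.9.
def logistic_trajectory(x0, r=3.9, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000000)
b = logistic_trajectory(0.400000001)  # almost identical starting point

for n in (0, 10, 20, 30, 40, 50):
    print(f"step {n:2d}: {a[n]:.6f} vs {b[n]:.6f} (gap {abs(a[n]-b[n]):.6f})")
```

Two trajectories that start a billionth apart look identical for a while and then diverge completely.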

The first group comprises models that concern the adoption of innovation.

The work of Malcolm Gladwell is very well known: the idea that, once an innovation reaches a ‘Tipping Point’, continued adoption is self-sustaining. I’ve also been influenced a lot by the work of Albert Angehrn from INSEAD (and also by Conner and Patterson’s eight-stage model of the adoption of innovation). Angehrn has created an excellent simulation tool which is used in the JISC infoNet change management training courses. Without giving too much away for those who may be interested in trying it, you very quickly learn that you need to understand both the formal and the informal networks that exist in the organization in order to influence enough people to get a change embedded (to reach that Tipping Point).

The second set of influences is work on social networks. By this I don’t mean Facebook and Twitter but rather how social ties in general operate. Some interesting work on the subject by Christakis and Fowler explains all sorts of social phenomena in terms of how social networks operate; one example is clusters of obesity affecting people who are at some social distance from one another. They suggest there is no simple cause and effect; rather, people adopt behaviours that spread to others, and individual changes shift other people’s perception of what is the norm.

The final influence is some work by Gerry Johnson of Manchester University on how institutions evolve their strategic direction. His diagram of the cultural web is used a lot, but it is often separated from its original context, which looked at the ways in which strategy, often perceived to be a very logical planning process taking account of external factors, is actually much more heavily influenced by the institutional paradigm.

The paradigm is a generalized set of beliefs about the way the organization is and how it operates, and it encompasses the politics, the rituals and the myths that make up organizational culture. External facts that don’t fit the paradigm can often be ignored in quite a surprising fashion. Johnson found that strategy development was best understood by undertaking cognitive maps of key stakeholders’ views and triangulating these against events. He based his work in the retail industry, but it is interesting to reflect on the extent to which views about a university’s core market and strengths, and its response to the current climate, are based on the institutional paradigm rather than the facts.

The common thread in a lot of these models is that it is the connections between people and the shared beliefs and behaviours that are particularly important. A critical mass of connections can create traits in networks that persist over time whilst individuals come and go. This for me is institutional culture.

Which brings me back to how we undertake change projects using technology. The traditional way used to be: bid for some funding from JISC, think of an acronym for your project (the sillier the better) and get out there and evangelise about what you were doing.

A lot of recent projects have approached change quite differently:

• Many of the JISC curriculum design projects have deliberately not branded and marketed their activities; instead they have concentrated very hard on showing how what they are doing aligns with broader institutional goals.

• Welsh universities, when they reviewed their progress with technology-enhanced learning, felt that ‘branded’ projects simply created new silos within the organization and that real change was effected by plugging into existing networks.

• A number of projects in the JISC Flexible Service Delivery programme adopted what they termed a ‘guerilla’ approach to change.

• A very successful JISC Curriculum Delivery project at Exeter University, which is already having considerable impact through its work on Students as Agents of Change, realised that ‘change happens one conversation at a time’.

Operating within the paradigm, rather than explicitly setting out to challenge it, seems to be achieving considerable success in terms of stakeholder engagement, but it also makes it more difficult for these projects to demonstrate their impact on institutional culture precisely because they are so embedded. However, by looking at the bigger picture, and by focusing on the glue rather than the building blocks, they seem to be arriving at a mixture of qualitative and quantitative ways of doing so.

Finally, a run through some of the techniques that have been effective, with tips for measuring transformational and cultural change.

The first tip is to start from a baseline. You can’t measure the distance travelled unless you have some idea of where you started. It was a requirement of funding that the curriculum design projects did this, and many of them were initially very sceptical about the value of trying to take a snapshot of such a moving target. They took various approaches to determining the baseline, and these are summed up in a report by Helen Beetham (see references).

Suffice it to say that the process mattered more than the specific techniques and within a very short space of time the project teams were convinced of the value of this work.

Some projects undertook quantitative analysis. The OU created a set of profiles looking at its courses from a range of perspectives, including the financial and the pedagogic. Bolton did some basic but powerful analytics after finding that many discussions relating to course approval and validation were based on assumptions rather than facts.

Producing statistics about learning outcomes and assessment types helped show where some modules appeared to be over-assessed or where the same learning outcome was assessed multiple times. Quite simply, it changed the conversation. Snapshots of these types of analytics will show changes in practice in particular disciplines over time.
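To make that concrete, here is a minimal sketch of the kind of counting involved; the module codes, learning outcomes and assessment types are invented for illustration, not drawn from the Bolton data.

```python
from collections import Counter

# Hypothetical validation records: (module, learning outcome, assessment type).
# In practice these would come from course approval documents or a curriculum database.
records = [
    ("ENG101", "LO1", "essay"),
    ("ENG101", "LO1", "exam"),
    ("ENG101", "LO1", "presentation"),   # LO1 assessed three times
    ("ENG101", "LO2", "exam"),
    ("HIS205", "LO1", "essay"),
    ("HIS205", "LO2", "essay"),
]

# Where is the same learning outcome assessed more than once within a module?
per_outcome = Counter((module, lo) for module, lo, _ in records)
for (module, lo), n in per_outcome.items():
    if n > 1:
        print(f"{module} {lo} is assessed {n} times")

# How many assessment points does each module carry in total?
per_module = Counter(module for module, _, _ in records)
print(dict(per_module))
```

Comparing counts like these against a later snapshot is one simple way of showing whether assessment practice in a discipline has actually shifted.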

Every bit as important as having some kind of quantitative evidence is capturing what Birmingham City University has termed the ‘lived experience’ of curriculum design.

This is similar to Johnson’s work on cognitive mapping. Regardless of what the facts may actually be, it is people’s perceptions and assumptions that affect day-to-day practice.

BCU has made extensive use of video to capture and analyse stakeholder experiences. Greenwich University has used techniques such as Rich Pictures very effectively to identify certain myths that needed to be exploded before the institution could make progress in changing some of its processes.

Finally, what is needed in terms of looking at culture change is finding ways to look at how the individuals who make up the organization (and hence the culture) are connected and communicate. The most significant changes tend to happen when you get linkup and collaboration between parts of the organization that didn’t previously talk to one another.

This idea that it is the connections rather than the entities that really create the structure doesn’t just apply to social networks. You will find that architects looking to innovate also look not so much at the structures as at the linkages, the connections, the spaces between them.

Some of this is actually quite easy to measure in the Web 2.0 connected world. As a very small example, this diagram shows the interconnectedness of people who were tweeting using a particular hashtag at a meeting. It would be interesting to look at a series of snapshots like this over time to track how the network expands or contracts and the frequency with which particular tags are used: in other words, who is talking to whom and what they are talking about. In some institutions there may only be a very small percentage of staff who are actually using tools like this at the moment, and a growth in this type of communication would represent a significant cultural change.
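A rough sketch of how such a snapshot could be produced, assuming the mention data has already been collected; the handles and connections below are invented, and networkx is just one convenient library for this kind of analysis.

```python
import networkx as nx

# Hypothetical snapshot: who mentioned whom in tweets carrying an event hashtag.
mentions = [
    ("@alice", "@bob"), ("@bob", "@carol"), ("@alice", "@carol"),
    ("@dave", "@alice"), ("@erin", "@dave"),
]

G = nx.Graph()
G.add_edges_from(mentions)

# Simple measures worth comparing across snapshots taken at intervals:
print("people involved:", G.number_of_nodes())
print("connections:", G.number_of_edges())
print("density:", round(nx.density(G), 2))  # how joined-up the conversation is
print("most connected:", max(dict(G.degree()).items(), key=lambda kv: kv[1]))
```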

In other cases it may be appropriate to track the formal networks in the institution, i.e. the committees that exist and the parts of the organisation that are represented on them.
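The same approach works for formal structures. As a sketch, with invented committee and department names, committee membership can be treated as a bipartite graph and projected onto departments to show which parts of the organisation actually sit around a table together.

```python
import networkx as nx
from networkx.algorithms import bipartite

# Invented example: which departments are represented on which committees.
membership = {
    "Learning & Teaching Committee": ["Registry", "Library", "Engineering"],
    "Quality Committee": ["Registry", "Law"],
    "IT Strategy Group": ["Library", "IT Services", "Engineering"],
}

B = nx.Graph()
for committee, departments in membership.items():
    for dept in departments:
        B.add_edge(committee, dept)

# Project onto departments: an edge means two departments share a committee.
dept_nodes = {d for depts in membership.values() for d in depts}
dept_network = bipartite.projected_graph(B, dept_nodes)
print(sorted(dept_network.edges()))
```

Comparing this projection at baseline and again later would show whether new links have appeared between parts of the organisation that previously had no shared forum.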

Many projects are finding that changes in the language used within the organisation reflect changes in perception and understanding of curriculum design and other processes. This language is manifest in policy and procedural documents and in guidance used for staff development purposes. Again, identifying the phrases and descriptions that are key at the baselining stage, and mapping how these change over time, can give an important perspective on institutional transformation.
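A crude sketch of what that tracking could look like; the phrases and document snippets here are invented purely for illustration.

```python
import re

# Phrases identified as significant at the baselining stage (invented examples).
phrases = ["programme approval", "learning outcomes", "student partnership"]

# Snippets of policy text from two points in time (also invented).
docs = {
    "2009 validation handbook": "The programme approval process requires ...",
    "2011 validation handbook": "Learning outcomes are agreed in student "
                                "partnership, and programme approval now ...",
}

for name, text in docs.items():
    counts = {p: len(re.findall(re.escape(p), text, flags=re.IGNORECASE))
              for p in phrases}
    print(name, counts)
```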

Tips for evaluating Culture Change

• Start from a Baseline

• Capture the ‘Lived Experience’

• Look at the Connections rather than the silos

• Find out how others are doing this via the Design Studio: http://jiscdesignstudio.pbworks.com

References

Angehrn, A., Schönwald, I., Euler, D. and Seufert, S. (2005). Behind EduChallenge: An Overview of Models Underlying the Dynamics of a Simulation on Change Management in Higher Education. SCIL, Universität St Gallen, SCIL Report 7, December 2005. Retrieved 1 Sept 2011 from: http://www.scil.ch/fileadmin/Container/Leistungen/Veroeffentlichungen/2006-01-euler-seufert-behind-educhallenge.pdf

Bartholomew, P. (2010). Technology-Supported Processes for Agile and Responsive Curricula Project Interim Report. Retrieved 1 Sept 2011 from: http://www.jisc.ac.uk/media/documents/programmes/curriculumdesign/tsparcinterimreportoct2010.pdf

Beetham, H. (2009). Synthesis Report: Baselining the Institutional Processes of Curriculum Design. Retrieved 1 Sept 2011 from: http://www.jisc.ac.uk/whatwedo/programmes/elearning/curriculumdesign.aspx

Christakis, N. and Fowler, J. (2010). Connected: The Amazing Power of Social Networks and How They Shape Our Lives. HarperCollins.

Conner, D. R. and Patterson, R. B. (1982). Building Commitment to Organisational Change. Training and Development Journal, 36(4), 18-30.

Gladwell, M. (2000). The Tipping Point: How Little Things Can Make a Big Difference. Boston: Little, Brown.

Johnson, G. (1988). Rethinking Incrementalism. Strategic Management Journal, 9(1), 75-91. Retrieved 1 Sept 2011 from: http://onlinelibrary.wiley.com/doi/10.1002/smj.4250090107/abstract

Leslie, K., Dunne, E., Newcombe, M., Taylor, L. and Potter, D. (2010). Integrative Technologies Project, Final Report. Retrieved 1 Sept 2011 from: http://www.jisc.ac.uk/media/documents/programmes/curriculumdelivery/integrate_FinalReport.pdf
