In conversation with Dawn Wood – an organic approach to evaluation

Just before Christmas I spoke with Dawn Wood from the PC3 (Personalised Curriculum through Coaching) project. Dawn gave me an update on what is happening with the project, but we also talked about how their evaluation has moved away from the traditional research approach.

PC3 has been exploring ways in which the principles of development coaching can be used to give students a more personalised experience throughout their university course. The project’s initial focus was at course level, looking at how to enable students to design a personalised course by developing a deeper understanding of their learning needs through coaching. This did not prove as fruitful as hoped, so the project was refocused on embedding coaching directly within existing modules and courses and on enabling students to develop self- and peer-coaching skills, which help them focus on their immediate and long-term development needs.

As the PC3 Research Officer, Dawn undertakes the day-to-day administration of the project as well as working with the team to develop, implement and disseminate the project’s research and evaluation activities. At the moment, Dawn’s main focus is on providing coaching to students and running workshops on coaching practice for staff and students. She also spends a lot of time networking and having conversations about what coaching can provide. As the project moves into the final stages, Dawn tells me she’ll be concentrating more on data analysis and writing the final reports!

How would you describe the approach to evaluation that the PC3 project adopted initially?

The project started by focusing on the PLC (Personalised Learning through Coaching) module as a means for students to identify the direction they wanted their studies to take. As part of planning the evaluation of the PLC module, detailed ethics documentation was prepared. This included an outline of the demographics of the target population, how data would be captured and revisited over the remaining three years of the project, details of how consent would be sought, and so forth. Overall, it was a very traditional research approach with a clear structure for specific data capture points.

At that time, the students would only know me as a relatively distant ‘researcher’, even though I was involved in writing some of the module materials.

What happened to change that approach?

Mainly, it came down to structural changes within the institution, such that some of the pilot cohorts we had identified would no longer exist.

Another strong influence was the lack of success with the original module. Twelve individuals had agreed to take part, but the data we were able to collect was very scattered, with only two complete sets. There was a real lack of engagement with the evaluation process. Essentially, the twelve students were committed to enhancing their own learning via the PC3 coaching process, but not so committed to the underlying research evaluation process. As we had originally planned to encourage uptake of the PC3 process using the evidence gathered from this first pilot, this compounded the problems created by the institutional changes.

So this was the impetus to move to a more organic, flexible research approach?

Absolutely. We moved away from being primarily researcher-led and instead ask students, ‘What can you commit to for the purposes of evaluating the impact of PC3?’ We negotiate this with each student and agree the data capture method they will use, such as written or video reflections. The details are then added as an appendix to our now considerably thinner ethics documentation!

We have also been more flexible and creative about the evidence we use. For example, we repurpose assessed reflections as a source of evidence. This has meant we aren’t always able to design our own instruments, and we also have to engage our stakeholders in identifying possible sources of evidence and negotiating what they are prepared to share. However, this kind of approach has made us much more aware of the resources available to us, better placed to pick up on the unexpected, and quicker to identify issues.

How would you describe this revised approach and what does it mean for your own role?

We now have an approach grounded in action research. As part of that, I am more involved in the coaching process, rather than just looking in on it. Having direct experience of coaching students has meant I have been able to identify issues that had not emerged from other sources. For example, it became clear there was a conflict between what tutors wanted to achieve in tutorials and what was possible within the coaching process. This had not come out in our interviews with staff, but we are now looking at it more deeply and working on embedding the coaching process more effectively.

I feel this approach gives me a greater understanding of the student perspective. We have also worked with tutors to capture their views on how successful they have found the coaching process. Findings from interviews with one tutor team have been collated, anonymised and fed back to the team to help further improve how coaching works within their course.

Can you tell us some more about how students are more directly involved in designing the evaluation process, and the implications of this?

Yes, we now also have a group of student coaching ambassadors. Drawing on their experiences of coaching, the student ambassadors are looking at how best to promote the benefits of coaching to other students and staff. They have recently completed the same training the project provides for staff, and are in the process of developing a plan for promotional activities.

As for the implications, with the student ambassadors we had to balance our need to document their activities with our desire to hand genuine responsibility for the project over to them. After all, if this was to be their project, how could we impose a particular evidence-gathering approach on them? We resolved this by engaging them in designing the evaluation plan for the student ambassador project and asking them to propose ways in which they could provide the evidence we need. We hope this will lead to greater buy-in to the process.

Finally, are there any words of wisdom you’d like to pass on to other projects undertaking their own evaluations?

We’ve learned along the way, and part of that has been the realisation that a broader vision is sometimes better. Be prepared to change your approach and be willing to look at alternatives. The traditional research approach may not provide the evidence you had hoped for, so stay alert to opportunities.

The team are currently preparing additional coaching materials that will be made available on the Design Studio. In the meantime, there are various coaching resources on the PC3 blog, including videos demonstrating coaching conversations and coaching workshop resources.

