Twitter story from October Design Programme Meeting

The penultimate Curriculum Design Programme meeting took place earlier this week in Nottingham. Three and a half years into the funding cycle, the meeting focused on life after the programme. What are the most effective ways to share, embed and build on the changes instigated by projects, within and across institutions?

To give an overview of the two days, I’ve collated the “story” based on the #jisccdd twitter stream.

View “Sustaining and embedding changes to curriculum design practices and processes” on Storify

In conversation with Rebecca Galley – Using video to gather evidence

Earlier this year we heard from a number of projects as part of an Elluminate session on Using video within the Curriculum Design programme: from personal reflection to evaluation evidence. During the session, Rebecca Galley and her colleague Andrew Charlton-Perez talked about whole project use of video within the Open University Learning Design Initiative project (OULDI).

Picking up from that session, I spoke with Rebecca recently to further explore her views on the pros and cons of using video as part of the evaluation process.

To start can you tell us how video is being used as part of the evaluation of OULDI?

Rebecca: We’ve used video in three different ways:

Firstly, to capture expert evaluation of Cloudworks. Many of these experts are very busy, or we just happened to get talking about Cloudworks at a conference. We have found it much easier to grab experts for a quick, fairly informal video ‘chat’ than to book them for a more formal interview. Added to that, the results have been just as useful for informing development. On another project, the OU-OPAL project, we’ve used the same ‘on the spot expert video’ approach to capture barriers and enablers to Open Educational Practices (OEP) prior to the development of an OEP maturity matrix. We’ve also used video to promote wider engagement in the project consultation process through spotlight discussions.

Secondly, we offered a range of reporting options to academics trialling the OULDI tools and resources. One of the options was a reflective video log. One academic at Reading chose to keep a curriculum design video diary. He found it a very rich and interesting process. So much so that he felt it impacted positively on his design process by enabling him to reflect on and consolidate his design activity in a way that he wouldn’t have done otherwise. For the project team, this was especially interesting because the video log captures the learning design process from a personal perspective.

Finally, and more recently, within our Cluster we’ve been developing a series of video podcasts on the theme ‘Design problem, design solution’. These will be uploaded to YouTube in late September. In these four-minute videos we have interviewed a range of people involved in the project and asked them what they perceived their design problem was and how our tools, resources and methodology helped them solve it. We think that these videos authentically and compellingly show the degree and range of impact the projects have had. (This is an example of an early problem/solution interview.)

That’s a great range of uses for video. Now let’s talk about the practicalities: what approach have you taken to collecting video data?

Rebecca: We’ve gone for a very low-tech approach, mainly because the primary benefits of video for us are flexibility and authenticity. So, we generally use a Flip or iPhone camera and keep editing to a minimum. Where appropriate and agreed, we have made all interviews, whether audio, video or transcript, openly available, and have archived videos on YouTube or Vimeo and embedded them in Cloudworks.

Are there any practical issues that others looking to use video should be aware of?

Rebecca: Our biggest practical problem has been coping with the wide variety of file types we have been getting because all the partners use different cameras. In the end we have had to purchase specialist editing software. We bought Sony Vegas Movie Studio Platinum.

In your presentation you hinted at the resource implications of managing and analysing video data, and referred to light touch analysis. Could you tell us a bit more about that?

Rebecca: The OULDI project is quite complex, with a number of strands across five institutions; for this reason we are using NVivo to collate and code all our data. The NVivo software allows for the collation and coding of all sorts of data, which is really useful. For example, it means that we can do a search on data which relates to novice users of CompendiumLD and get a whole range of data back, including video excerpts, excerpts from blog postings, CompendiumLD designs and textual feedback, and compare this with data we have from expert users. For ease, everything has gone into NVivo, but we have had to be really disciplined and focused about what and how we code – and why – otherwise we could spend all our time on analysis that we can’t use.
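The collate-and-code workflow Rebecca describes (tag every piece of evidence with one or more codes, then query across media types and compare groups) can be sketched in a few lines. This is purely an illustrative toy, nothing to do with NVivo’s actual interface; the codes and item names below are invented.

```python
from collections import defaultdict

# Toy illustration of a coding-and-query workflow: each evidence item
# (video excerpt, blog post, design file, feedback) is stored once and
# tagged with codes; a query pulls back every item carrying a given code,
# regardless of media type.

class CodedDataSet:
    def __init__(self):
        self._codes = defaultdict(list)  # code -> list of evidence items

    def add(self, item, codes):
        """Register an evidence item under one or more codes."""
        for code in codes:
            self._codes[code].append(item)

    def query(self, code):
        """Return all items tagged with the given code."""
        return list(self._codes.get(code, []))

data = CodedDataSet()
data.add({"type": "video", "ref": "interview_03.mp4"}, ["novice-user", "CompendiumLD"])
data.add({"type": "blog", "ref": "post-2011-05"}, ["expert-user", "CompendiumLD"])
data.add({"type": "feedback", "ref": "survey-q7"}, ["novice-user"])

# Compare the novice and expert groups side by side
novice = data.query("novice-user")
expert = data.query("expert-user")
print(len(novice), len(expert))
```

The discipline Rebecca mentions maps directly onto the code list: the fewer, better-defined codes you use, the more meaningful each query becomes.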

In terms of presenting video findings, you mentioned archiving using YouTube or Vimeo, with more open presentation on Cloudworks. Why have you taken this approach?

Rebecca: The OULDI project is very much informed by the open education, OER and open scholarship movements. We see value in publicly sharing our conceptual ideas and thinking, research tools, methodologies, and findings, and opening these up for community discussion and critique. We hope that our use of Cloudworks has enabled a community of interest to develop around the project; for example, educators who have been involved in earlier project workshops and trials are able to stay in touch with developments and give us feedback from their own perspective, which is really valuable.

Finally, are there any caveats you would put on the use of video?

Rebecca: I think that the use of video for research and evaluation, especially where it is made publicly available, significantly changes the relationships between researchers, project participants (previously ‘subjects’), the audience and the video data itself. I’m not saying this is necessarily a bad thing but the impact of this on the reliability and validity of the data needs to be recognised and critically considered. My personal thinking is that it should only ever be used as a small part of a broader multi-methods approach.

Design Bash 2011

On 30 September, CETIS are once again running a Design Bash at the University of Oxford.

As in previous years, this event will be very hands-on, allowing people to share their learning designs, tools and systems and to explore potential collaborations. Once again, we’ll be using Cloudworks to share resources and activity on the day.

This year we hope to extend out from our core learning design community to involve those building and using tools and standards for course information, describing learning opportunities (XCRI) and competencies, e.g. the Co-genT project.

As with last year, I hope that projects will be able to send representatives to the event, which is free to attend. More information and registration is available by following this link.

Different routes to evidencing value

As well as encouraging curriculum innovation, the Transforming Curriculum Delivery Through Technology Programme had a strong focus on gathering and making effective use of evidence. This was intended to inform the 15 funded projects, but also to provide evidence to the wider sector of the potential benefits of using technology to support learning. It was very much part of the current context of striving for more evidence, particularly “evidence for ‘what works?’” (Sharpe, 2011) and of funded projects adding value by generating evidence (Draper & Nicol, 2006).

But how did the projects do it? What kinds of approach to evaluation did they adopt, and what methods did they use?

As evaluation support consultant to the programme, I have prepared a report that summarises the evaluation methods and techniques used by projects. While this is not a comprehensive guide to evaluation, it does give an insight into the wide variety of approaches that were applied. Hopefully, this will spark ideas in those setting out on the process of project evaluation.

The Curriculum Delivery Programme did not advocate a particular approach to evaluation, although projects were expected to undertake their own evaluation or engage an evaluator. This self-directed evaluation was perhaps unusual when the programme started, and it’s fair to say it was uncertain what direction different projects would take. After all, in this diverse programme, project teams brought a wide mix of skills, and for some evaluation was entirely new.

Many of the standard data collection methods were applied, including focus groups, interviews, observation and questionnaires. Technology was also employed, with many projects undertaking video interviews that were later compiled and made available on the web (such as this overview of student interviews from Making Assessment Count). The more unusual methods included time-and-motion studies, which were used to forecast efficiency savings; cognitive mapping (here’s an example causal map from DUCKLING); and social network analysis using Gephi. Projects also found innovative ways of engaging with students, such as employing them as researchers.

On the basis of 15 very different projects, it is not possible to draw conclusions on the ‘best’ approach to evaluating curriculum innovations. There were, however, a number of projects that applied or adapted action research, and a typical evaluation cycle based on action research is described in the report. Other approaches that feature in the report include appreciative inquiry, balanced scorecard, and engaging an independent evaluator, as well as more traditional formative evaluation approaches.

Draper, S. and Nicol, D. (2006) Transformation in e-learning. Short paper delivered at ALT-C 2006.

Sharpe, R. (2011) Evidence and evaluation in learning technology research. Research in Learning Technology, 19(1), pp. 1–4.

The big picture…

I will be writing a series of posts about the Institutional Approaches to Curriculum Design programme as a guest on the JISC CETIS ‘other voices’ blog.

The first post Curriculum Design: The Big Picture is published today and sets the scene for the series which will be looking at the range of institutional processes and systems that the projects are considering and changing to support their curriculum design activities. This first post highlights the approaches they’ve taken to illustrate and map some of their processes and where they connect.

There are some really interesting approaches to capturing the range of processes and highlighting where they connect. This has revealed where there is potential to share data between systems to simplify activities, improve efficiency by reducing duplication, and enhance the student and staff experience. After speaking with a few of the project teams, I am struck most by the value of these mapping activities in stimulating new conversations and changing the way staff view the systems that support their activities. It is this cultural change that will ultimately be transformational and make the project activities sustainable long after project funding ends.

CogenT tool: creating, using and sharing learning outcomes

**NB this post has been amended from a post on my CETIS blog**

How do you write learning outcomes? Do you really ensure that they are meaningful to you, to your students, to your academic board? Do you sometimes cut and paste from other courses? Are they just something that has to be done, a bit opaque but doing the job?

I suspect for most people involved in the development and teaching of courses, it’s a combination of all of the above. So, how can you ensure your learning outcomes are really engaging with all your key stakeholders?

Creating meaningful discussions around developing learning outcomes with employers was the starting point for the CogenT project (funded through the JISC Lifelong Learning and Workforce Development Programme). Last week I attended a workshop where the project demonstrated the online toolkit they have developed. Initially designed to help foster meaningful and creative dialogue during collaborative course development with employers, as the tool has developed and others have started to use it, a range of uses and possibilities have emerged.

As well as fostering creative dialogue and common understanding, the team wanted to develop a way to evidence discussions for QA purposes which showed explicit mappings between the expert employer language and academic/pedagogic language and the eventual learning outcomes used in formal course documentation.

Early versions of the toolkit started with the inclusion of a number of relevant (and available) frameworks and vocabularies for level descriptors, from which the team extracted and contextualised key verbs into a list view.

List view of the CogenT toolkit


(Ongoing development aims to include the import of competency frameworks and the use of XCRI CAP.)

Early feedback found that the list view was a bit off-putting, so the developers created a cloud view.

Cloud view of the CogenT toolkit


and a Bloom’s view (based on Bloom’s Taxonomy).

Bloom’s view of the CogenT toolkit


By choosing verbs, the user is directed to a set of recognised learning outcomes and can start to build and customise these for their own specific purpose.

CogenT learning outcomes

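The verb-to-outcome flow described above (pick a contextualised verb at a given level, get a recognised outcome stem, then customise it) can be illustrated with a small sketch. The verb lists below are a tiny invented sample loosely modelled on Bloom’s Taxonomy, not the actual frameworks bundled in the Co-genT toolkit.

```python
# Illustrative sketch only: a minimal verb-to-outcome builder in the spirit
# of the toolkit's workflow. Verb lists are invented samples per level.

BLOOMS_VERBS = {
    "remember":   ["define", "list", "recall"],
    "understand": ["explain", "summarise", "classify"],
    "apply":      ["demonstrate", "implement", "use"],
    "evaluate":   ["critique", "justify", "appraise"],
}

def outcome(verb, topic, context=""):
    """Build a learning-outcome stem from a chosen verb and subject topic."""
    stem = f"On completion, the learner will be able to {verb} {topic}"
    return stem + (f" {context}." if context else ".")

# A designer picks a verb from the 'evaluate' level and customises the stem:
print(outcome("critique", "a research paper", "using recognised quality criteria"))
```

Keying the verbs to levels is what lets the tool make the level differences visible to students, as in the Bedfordshire case study mentioned below.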

As the tool uses standard frameworks, early user feedback started to highlight its potential for other uses, such as: APEL; HEAR reporting; working with adult returners to education to help identify experience and skills; writing new learning outcomes; and an almost natural progression to creating learning designs. Another really interesting use of the toolkit has been with learners. A case study at the University of Bedfordshire has shown that students found the toolkit very useful in helping them understand the differences and expectations of learning outcomes at different levels. To paraphrase student feedback after using the tool: “I didn’t realise that evaluation at level 4 was different from evaluation at level 3.”

Unsurprisingly, it was the learning design aspect that piqued my interest, and as the workshop progressed and we saw more examples of the toolkit in use, I could see it becoming another part of the curriculum design tools and workflow jigsaw.

A number of the Design projects now have revised curriculum documents, e.g. PALET and SRC, which clearly define the type of information that needs to be entered. The design workshops the Viewpoints project is running are proving very successful in getting people started on the course (re)design process (and, like Co-genT, use key verbs as discussion prompts).

So, for example, I can see potential for course design teams, after taking part in a Viewpoints workshop, using the Co-genT tool to progress those outputs into specific learning outcomes (validated by the frameworks in the toolkit and/or ones they wanted to add) and then completing institutional documentation. I could also see the toolkit being used in conjunction with a pedagogic planning tool such as Phoebe or the LDSE.

The Design projects could also play a useful role in helping to populate the toolkit with any competency or other recognised frameworks they are using. There could also be potential for using the toolkit as part of the development of XCRI to include more teaching- and learning-related information, by helping to identify common education fields through surfacing commonly used and recognised level descriptors and competencies, and potentially developing identifiers for them.

Although JISC funding is now at an end, the team are continuing to refine and develop the tool and are looking for feedback. You can find out more from the project website. Paul Bailey has also written an excellent summary of the workshop.

I’d be interested in hearing any thoughts from projects.

Transforming curriculum delivery through technology: New JISC guide and radio show launched

A new JISC guide, “Transforming curriculum delivery through technology: Stories of challenge, benefit and change”, has been launched today.

“A mini-guide to the outcomes of the JISC Transforming Curriculum Delivery Through Technology programme, it summarises the headline benefits of technology in curriculum delivery made evident by the work of the 15 projects in the programme. The outcomes of these projects provide a rich insight into the ways in which institutions and individual curriculum areas can make use of technology to respond more robustly to the demands of a changing world.”

You can access PDF and text-only versions of the guide, or order a print copy, by following this link.

The latest instalment of the JISC on Air series, “Efficiencies, enhancements and transformation: how technology can deliver”, includes interviews with two projects involved in the programme (Making the New Diploma a Success and eBioLabs), discussing the impact achieved in two very different contexts and disciplines.

From challenge to change: how technology can transform curriculum delivery

**NB this post has been amended from a post on my CETIS blog**

A recording of the online presentation “From challenge to change: how technology can transform curriculum delivery” by Lisa Gray (JISC Programme Manager), Marianne Sheppard (Researcher/Analyst, JISC infoNet and project co-ordinator for the Support and Synthesis project) and myself is now available online.

Session Synopsis:
During 2008–2010, the JISC Transforming Curriculum Delivery through Technology Programme investigated the potential of technology to support more flexible and creative models of curriculum delivery in colleges and universities. The 15 projects within the programme sought to address a wide range of challenges such as: improving motivation, achievement and retention; managing large cohorts; supporting remote and distance learners; engaging learners with feedback; responsiveness to changing stakeholder needs; delivering resource efficiencies which enhance the quality of the learning experience. Through the various project investigations, the programme has learned how and where technology can not only add value but can transform the way in which the curriculum is delivered in different contexts.

This session summarized the key messages and findings emerging from the work of the projects and demonstrated some of the outputs from the projects available from the Design Studio.

Communicating technical change – the trojan horse of technology

**NB this post has been amended from a post on my CETIS blog**

As the Curriculum Design Programme is now entering its final year, the recent Programme meeting focused on effective sharing of outputs. The theme of the day was “Going beyond the obvious, talking about challenge and change”.

In the morning there were a number of breakout sessions around different methods/approaches of how to effectively tell stories from projects. I co-facilitated the “Telling the Story – representing technical change” session.

Now, as anyone who has been involved in a project implementing or changing technology systems will know, one of the keys to success is actually not to talk too much about the technology itself, but to highlight the benefits of what it does or will do. Of course there are times when projects need to have in-depth technical conversations, but in terms of the wider project story, the technical details don’t need to be at the forefront. What is vital is that the project can articulate change processes in both technical and human workflow terms.

Each project in the programme undertook an extensive baselining exercise to identify the processes and systems (human and technical) involved in the curriculum design process (the PiP Process workflow model is a good example of the output of this activity).

Most projects agreed that this activity had been really useful in allowing wider conversations around the curriculum design and approval process, as there actually weren’t any formal spaces for these types of discussions. In the session there was also the feeling that technology was actually the trojan horse around which the often trickier human process issues could be discussed. As with all educational technology projects, the projects have had issues with language and common understandings.

So what are the successful techniques or “stories” around communicating technical change? Peter Bird and Rachael Forsyth from the SRC project shared their experiences of using an external consultant to run stakeholder engagement workshops around the development of a new academic database. They have also written a comprehensive case study on their experiences. The screenshot below captures some of the issues the project had to deal with – and I’m sure it could represent views in practically any institution.

MMU have now created their new database and have documentation which is being rolled out. You can see a version of it in the Design Studio. There was quite a bit of discussion in the group about how they managed to get a relatively minimal set of fields (5 learning outcomes, 2 assessments) – some of that was down to that well-known BOAFP (back of a fag packet) methodology . . .

Conversely, the PALET team at Cardiff are now having to add more fields to their programme and module forms, now that they are integrating with SITS and have more feedback from students. Again, you can see examples of these in the Design Studio. The T-Sparc project have also undertaken extensive stakeholder engagement (in which they used a number of techniques, including video, which was part of another breakout session) and are now starting to work with a dedicated SharePoint developer to build their new webforms. To aid collaboration, the user interface will have discussion tabs, and the system will then create a definitive PDF for a central document store; it will also be able to route the data to other relevant places such as course handbooks, KIS returns etc.

As you can see from the links in the text, we are starting to build up a number of examples of course and module specifications in the Design Studio, and this will only grow as more projects start to share their outputs in this space over the coming year. One thing the group discussed, and which the support team will work with the projects to try to create, is some kind of checklist for course documentation creation based on the findings of all the projects. There was also a lot of discussion around the practical issues of course information management and general data management, e.g. data creation, storage, workflow, versioning and instances.

As I pointed out in my previous post about the meeting, it was great to see such a lot of sharing going on in the meeting and that these experiences are now being shared via a number of routes including the Design Studio.

Talking about change at the May Design Programme Meeting

“Going beyond the obvious, talking about challenge and change” was the title of yesterday’s Design Programme meeting in Birmingham. The activities provided a range of very engaging and thought-provoking discussions from all the project teams. To give a flavour of the day, I’ve created a twitter story:
[View the story “Talking about challenge and change” on Storify].

One other thing that struck me yesterday was the spirit of community sharing; the projects seemed genuinely interested in sharing experiences and approaches with each other. Using Tony Hirst’s twitter connectedness visualisation tool, you can see the interconnectedness of the people using the programme hashtag (#jisccdd).

Visualisation of twitter connections, 11 May, #jisccdd
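For anyone curious how such a connectedness view is derived, the basic idea can be sketched in a few lines: pull the tweets carrying the hashtag, extract the @mentions, treat each mention as an undirected connection, and rank users by how many people they are connected to. The tweets and usernames below are invented examples, not the real #jisccdd stream, and this is a simplification rather than Tony Hirst’s actual tool.

```python
import re
from collections import defaultdict, Counter

# Invented sample tweets carrying the hashtag (author, text)
tweets = [
    ("alice", "Great session on design baselining #jisccdd @bob @carol"),
    ("bob",   "@alice agreed - the workflow mapping was useful #jisccdd"),
    ("carol", "@alice @bob sharing our storify link #jisccdd"),
    ("dave",  "Lurking on the #jisccdd stream today"),
]

# Each @mention becomes an undirected edge between author and mentioned user
edges = defaultdict(set)
for author, text in tweets:
    for mention in re.findall(r"@(\w+)", text):
        edges[author].add(mention)
        edges[mention].add(author)

# Degree (number of distinct connections) picks out the core group
degree = Counter({user: len(peers) for user, peers in edges.items()})
print(degree.most_common(3))
```

Users with no mentions either way (like "dave" here) never acquire an edge, which is why lurkers sit outside the core cluster in the visualisation.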

Of course, this only gives one view of connections/community for the programme, but imho it is an important one – as it shows the concentration of activity of the day within a core group. If you look at the graph today there are probably more people and connections outwith the core connections.