Glossary

Some words or phrases have a specific meaning within the context of Active Implementation or refer to particular theories or practices. Definitions and explanations can be found in a separate glossary document when a word or phrase is {highlighted like this}.


Effective Implementation

The HOW – this refers to how effectively an innovation is implemented: paying attention to the stages of implementation, and creating and maintaining the supporting infrastructure needed to ensure that the innovation is in place, is used as intended, and leads to the intended outcomes.

Alongside the articulation of the Effective Practices (‘the What’), it was important to consider how to implement them effectively. Guided by the Active Implementation frameworks, particularly the Implementation Stages and the Implementation Drivers, CELCIS supported practitioners in Dundee to understand how to change practice and to drive forward the development and implementation of the new practice. In this section, we provide insight into what was required to do this and hear from some of the people in Dundee who were involved.

The Implementation Stages

Active Implementation helps us to see implementation not as a short-term event, but as a long-term process composed of four stages:

  • Exploration – Understand and Decide: This first stage involves exploring and understanding the system, identifying needs and assets, creating readiness for change across the system, assessing how the programme or practice fits with the needs of children and families, and assessing the feasibility of implementation.
  • Installation – Get Ready: This stage involves building the infrastructure necessary to implement the innovation, which includes continuing to build readiness for change, focusing on local sites and building practitioner and organisational capacity.
  • Initial Implementation – Get started to get better: This stage includes the initial efforts of staff to use the programme or practice, with attention to using data for continuous improvement.
  • Full Implementation – Sustain and Grow: This final stage focuses on ensuring that the new practice continues to be delivered to a high standard by the majority of the workforce (as demonstrated through routine data collection), and that the outcomes are socially significant.

Whilst Exploration is the starting point of implementation, the progression from one stage to another is not necessarily linear, as this video explains.

When implementation is sustained by strong Enabling Contexts and is focused on a well-defined and clear innovation, such as an evidence-based programme (EBP), it might take up to four or five years to reach Full Implementation. It will likely take longer where the innovation is complex or not yet operationalised, or where the system is complex. That has been the case in the ANEW programme, where both the system and the innovation were complex, and no off-the-shelf evidence-based programme or practice was available to use (see the ‘What’ section).

In the process of establishing a well-defined innovation, the ANEW work invested time in testing out new elements of practice and a range of practice and data tools across a cohort of sites. In practice, this has required an iterative interplay between the stages, predominantly the Exploration and Installation stages, while continuing to articulate an evidence-informed approach.

Focusing on the long-term impact of complex change

Because the ANEW programme was working within a complex system and needed to create new and innovative ways to address the challenges in Dundee, our team knew that the implementation process might take several years to come to fruition. It was therefore important to build commitment and buy-in from practitioners and leadership in Dundee, knowing that the ANEW programme was a long-term process.

Pressures were felt at times because of the perceived slow pace of the work, but the ANEW programme resisted the temptation of ‘quick wins’ (small changes to practice, happening in small pockets rather than across the system) and remained committed to advancing the operationalisation and implementation of a complex innovation. Dundee City Council and NHS Tayside are continuing the journey towards Full Implementation.

The Implementation Drivers – How practice was changed within the Dundee children’s services workforce

To make the ‘What’ a reality, practitioners need to understand the changes and be supported to make them; this is where the Implementation Drivers provide a helpful framework for guiding implementation efforts.

[Image: ANEW Reliable Benefits diagram]
© Fixsen & Blasé, 2008 
  • Competency drivers help to develop, improve and sustain practitioners’ confidence and competence to deliver high-quality practice, focusing on their selection, training and ‘coaching to practice’.
  • Organisational drivers help to ensure sustainability at both system and organisational levels. They ensure that the roles, functions and structures support the innovation, and that a system is in place to enable decisions to be informed by data (a Decision Support Data System).
  • Leadership drivers refer to the decisions and actions of leaders to resolve the adaptive challenges and technical problems that arise when supporting changes in ways of working and complex change.

Find out more about the Implementation Drivers from this video.

A number of approaches were taken as part of the ANEW programme to change practice among Dundee’s health visiting, early years and school staff. Importantly, these approaches are interrelated and mutually reinforcing. The main approaches adopted have been:

Strengthening the workforce through training and ‘coaching to practice’:

In order to successfully implement any change to practice, it is important to ensure that the workforce is supported to develop the new knowledge that underpins skills development. However, research shows that when we go beyond training alone and also provide coaching for practitioners involved in a change to practice, we see the greatest changes on the ground. This coaching may take the form of ongoing opportunities for feedback and reflection, and is most impactful when it focuses on the specific practice, experiences and learning needs of individual practitioners (‘coaching to practice’). Examples of coaching that helped to implement the effective practices identified in the ‘What’ section include:

  • The ‘Meeting Buddies’ were offered initial training by Children 1st, followed by individualised coaching to help embed the practice.
  • Coaching and ongoing support was offered to school and nursery staff in the Named Person role by other members of the Dundee Implementation Team. Our learning shows that Educational Psychologists appear to be well placed to step into a coaching role for Named Persons, given their skillset and focus on improving multi-agency assessment, the planning of interventions, and their overall contribution to the strategic and operational delivery of GIRFEC – read more here.
  • The coaching approach in relation to the home visits carried out by health visitors was strengthened through the introduction of data tools and more regular time for debrief and reflection within the health visiting team.
  • CELCIS facilitated the changes to the training and coaching approaches by working closely with the local implementation teams, bringing learning from theory and research, and modelling key skills focused on supporting reflective spaces and offering and receiving positive and developmental feedback.

Regular observation of practice

Active Implementation stresses that observations of practice should be carried out regularly in order to assess whether the practice is being delivered to a high quality and as intended (with fidelity). Regular observations enable practitioners to hear feedback on their current practice and quickly make adjustments with support from their coach; annual or biannual observations are not sufficient.

In the ANEW work, we applied the Active Implementation learning and identified two opportunities for observing practice – Team Around the Child meetings, and the home visits conducted by health visitors when joined by their supervisor, practice teacher or experienced colleagues (joint visits). CELCIS supported the local implementation teams to create readiness for observation of practice, strengthen the coaching culture and develop a set of observational data tools for this purpose.

Experienced GIRFEC practitioners (e.g., health visiting team leads or educational psychologists) carry out the observations and use these as a basis to highlight high-quality practice and to identify practitioners’ learning and development needs, which are then addressed through training and coaching.


“As a newly qualified health visitor, I was quite nervous about chairing my first Team Around the Child meeting. I wasn’t sure what to expect, and I knew the family had had previous meetings which hadn’t gone too well.

The Observation Tool was so helpful. I used it to help me prepare for the meeting: making sure I met with the mum before the meeting, introduced everyone at the start, and checked in with mum throughout to see that she understood and agreed with what was being said. Using it gave me confidence, like a checklist: I knew that if I followed it, then the meeting would go ok.”

ANEW Champion Health Visitor.


“I wasn't the most confident in my practice, asking myself ‘What could we do better? Was that really good enough?’, because sometimes I'm quite hard on myself. So, one of our educational psychologists videotaped me, just me, and my practice in the meeting. We could then look back and, from a strengths base, say ‘Okay, this is what was really good, and this is what we can move forward on’.”

Headteacher involved in ANEW.


Regular planning and support meetings

CELCIS and the Dundee Implementation Team met regularly with the schools, the nurseries and the health visitors involved in the programme, to plan, run tests of change, discuss progress and address organisational and system barriers. Read more about the cascading model of support by visiting the Enabling Contexts section.

Developing a Decision Support Data System (DSDS) for the innovation:

In line with the Active Implementation methodology, we describe the range of ANEW programme measurements as a Decision Support Data System: a set of measures that collectively enable an understanding of the extent to which practice is delivered as intended (with fidelity), the capacity for change, the scale of implementation and associated process indicators, and the outcomes of the innovation.

This approach to data challenges mental models that rely primarily on quantitative data collected and analysed at long intervals. The function of the Decision Support Data System is to provide timely, reliable and relevant data to support better-informed decisions, including in relation to training and coaching and the removal of organisational and systemic barriers. It includes both qualitative and quantitative indicators, with a special focus on the experiences of practitioners, children, and parents or carers.

As mentioned in the previous sections, CELCIS worked closely with the local teams and facilitated the development of a suite of measurements and tools as part of the Decision Support Data System in Dundee, including:

  • Collecting parental feedback on their Team Around the Child meeting experience via a Parent/Carer Experience Questionnaire that provides information to practitioners on how parents/carers found the Team Around the Child meeting process, from the preparation stage through to the post-meeting debrief. 
  • Conducting Early Concerns Mapping exercises with each school, nursery or health visiting team involved in the programme, to better understand their specific context, the wellbeing concerns among the children they work with, and how they respond to and record those concerns.
  • Practice observation tools for Team Around the Child meetings and the home visits conducted by health visitors.

[To add: link to the Global Implementation Conference video, with the ending edited out – where we invite people to our session]

[We can add the overview of tools in the Resource section]

The intention behind each of these approaches is to be constructive rather than critical. They are all focused on improvement and supporting individual practitioners to consistently deliver high-quality practice for children and families.    

Key Learning: 

  • Changing practice should not fall to the practitioner alone. Support must be put in place for practitioners to develop their competence and confidence, including through training, coaching and reflection opportunities informed by timely and meaningful data.
  • Having a practice profile is essential but not sufficient – we learnt that building capacity for regular coaching within existing resources can be a challenge when agencies are juggling legitimate competing priorities. However, this condition was identified as a key leverage point, with the potential to reduce the demand for more intensive or crisis-response services at a later point and improve outcomes for children and their families.
  • Observations of practice must be carried out frequently, to inform and support training, coaching and decision-making.
  • Data capacity is a key condition, and the measurements should be able to draw a complex picture: assessing the quality and fidelity of practice, the process and the capacity for scale-up, and exploring how the change is impacting outcomes for children and their families. A paradigm shift is thus required in relation to data. Tools and data used sporadically, in isolation, and not followed up by removing barriers and supporting practitioners’ competence and confidence will have limited impact.
  • Patience is needed from all involved (including practitioners, leadership and funders) in recognising that achieving sustainable change and impact takes time, particularly in complex systems and where innovations are not well-defined.
  • Throughout the ANEW programme, significant efforts were needed to disrupt mental models of how practice change happens, for example by challenging the overreliance on training in the absence of coaching. We learnt that those at the forefront of the change, particularly implementation teams, leadership and funders, must remain intentional and persistent in promoting and using a sustainable and scalable approach informed by research evidence. In our experience in the ANEW programme, Educational Psychologists appear to be well placed to provide coaching and support to Named Persons, given both their skillset and their overall contribution to the strategic and operational delivery of GIRFEC.
Link to ANEW home page
Link to Enabling Contexts page
Link to Effective Practices page
Link to Significant outcomes page