I CAN's Early Talk Programme:


Independent evaluation of the impact of Early Talk on addressing speech, communication & language needs in Sure Start Children's Centre settings

Dr Judy Whitmarsh, Dr Michael Jopling, Prof Mark Hadfield


Methodology

The original research brief was based on a two-phase research study that compared seven settings implementing ET (post-intervention) with seven settings not implementing ET (pre-intervention) over six months.

Meetings with I Can senior management, originally held to identify children's centres for participation, revealed that ET was not designed as a simple six-month intervention project and that LAs were able to buy in different aspects of ET, with different approaches to mentoring; these factors made a pre- and post-intervention design problematic. Therefore, with the support of the DCSF (now DfE), the research methodology was altered to become a study of children's centres at various stages and levels of implementation of ET. Thus, we adopted a case study approach, with each children's centre treated as an evaluative case, allowing cross-case analysis to contribute to the findings of the evaluation.

Constructing the sample

We used purposive sampling to identify children's centres at different stages of involvement in ET. Initially, the challenge was finding children's centres involved in ET; some I Can advisers told us that it would have been more straightforward to find participating private, voluntary and independent settings. Considerable effort went into recruiting children's centres in a range of locations in England, as identifying and accessing settings for the sample proved to be a complex and time-consuming process. In the first instance, I Can regional advisers contacted LAs using ET with details of the research and a request that they provide the CeDARE research team with their name and contact details. When this yielded few results, I Can made direct contact. The research team then contacted the LA but had to wait while the LA contacted the children's centre for consent to give its name to the research team. At the same time, the research team used their professional contacts and networks to locate other LAs and children's centres using ET. It proved particularly difficult to identify centres that were engaged in ET but pre-accreditation, or centres that were intending to undertake ET. We also had to exclude some centres that had undergone an earlier form of accreditation. This process of sample construction took five months.

For inclusion in the research, a centre had to have children aged 3 to 4 years old on site or within its immediate location. As it had proved so difficult to locate children's centres engaged in ET, we took a relatively simple approach to sampling. Our purpose was to recruit up to 15 centres according to their stage in the implementation of the ET programme:

  • Stage 1 centres: at least 6 months post ET accreditation;
  • Stage 2 centres: approaching accreditation or up to 6 months post-accreditation;
  • Stage 3 centres: in the early stages of, or considering, implementation.

By applying a staged approach, we could build a varied sample of children's centres across the different stages of implementing ET. In addition, we ensured that the centres were located in at least three different areas of England. Eventually, 19 settings agreed to participate in the research, which allowed for some reserve settings as a contingency. Of these 19 settings, 14 were visited for the research: five Stage 1 centres; five Stage 2 centres; and four Stage 3 centres. Nine of the children's centres in the final sample (64 per cent) were located in the 30 per cent most deprived areas in England (see Appendix 1). Accessing Stage 3 centres was the most problematic, as it depended on the goodwill of key contacts in LAs and relatively few LAs or children's centres had firm plans to begin ET at the time we were recruiting (from December 2009). Appendix 1 contains demographic details of the 14 children's centres visited for the research.
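Purely as an illustrative cross-check of the figures above, the sample composition can be summarised as follows. The stage counts, the 19 recruited settings and the nine centres in deprived areas come from the text; the code itself is only a sketch and is not part of the study.

```python
# Illustrative summary of the final sample described above.
# Figures are taken from the text; this is only a cross-check.
stage_counts = {"Stage 1": 5, "Stage 2": 5, "Stage 3": 4}

visited = sum(stage_counts.values())        # 14 centres visited
recruited = 19                              # settings that agreed to participate
reserves = recruited - visited              # settings held in reserve as contingency

deprived = 9                                # centres in the 30% most deprived areas
share_deprived = deprived / visited         # ~0.64, i.e. the 64 per cent reported

print(f"Visited {visited} of {recruited} recruited settings ({reserves} in reserve)")
print(f"Centres in 30% most deprived areas: {share_deprived:.0%}")
```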

Research design and methods

Each children's centre was visited for a day by a researcher between May and July 2010. Table 1 outlines the methods used, which were designed to gather the data needed to address the research objectives for the project; further details about the research design and the tools developed can be found in Appendices 2 and 3. Interviews were held with the children's centre manager, the ET lead and a range of practitioners available on the day. The interview with the manager explored how ET fitted with the manager's strategic vision for the centre; how ET was translated into practice; and its fit with the continuing professional development (CPD) needs of the centre and its staff. The interview with the ET lead, which took the form of a learning conversation (see Appendix 2), lasted a maximum of one hour and used the overarching themes of the evaluation's theoretical framework to explore how concerns about SLCN were identified and managed. Focus groups involving a total of 55 practitioners explored the learning environment; working with parents; CPD; and the impact of ET.

Table 1: Research methods

Method | Participant(s) in each setting
Telephone or face-to-face interview (Manager) | Manager of each participating children's centre
Interview (Practitioner) | ET lead in each participating children's centre
Rating of the environment | Setting, rated for a language-rich environment (based on ECERS-E and ECERS-R [1])
Documentation | Gathered from existing documentation in the setting
Focus group | 6 practitioners
Observation of practitioner-child interaction (PCI) | 1 Level 3 practitioner
Post-PCI observation interview | Level 3 practitioner observed
Questionnaire survey | 4-6 parents of children aged 3-4
Mapping of other SLC programmes | Research team
Video recording | Combination of interviews with practitioners and observations of practice in 5 consenting children's centres

Perspectives were collected from 62 parents via a short questionnaire. We also drew on observations of interactions between practitioners and children (PCI) and post-observation interviews with the practitioners observed; environmental rating scales; and analysis of a range of centre documentation. The observations and rating scales were used to explore current practice in children's centres beyond practitioners' personal perspectives on change and progress, and thus increase the validity of the findings relating to the current position of SLC in the centres researched. LA officers from three participating LAs were interviewed by telephone to gain a broader perspective on the implementation of ET. Finally, five centres were visited a second time by a specialist film company, Soundhouse Media, which created video case studies incorporating interviews with practitioners and footage of practice. The research objectives also structure the findings section of this report.

Data analysis

All data was recorded digitally, then reduced by individual researchers using standardised data reduction templates for each research tool. These were then analysed thematically by this report's authors using an iterative and evolving process consistent with a grounded theory approach (Strauss & Corbin, 1998), focusing in particular on evidence relating to impact, depth, and differences between the implementation stages. The observation and environmental rating data was processed using exploratory statistical analysis. At the same time, the data reduction process enabled us to undertake cross-case analysis to identify additional themes from the data. At an early stage, in September 2010, we held a data analysis day with the full research team to test out initial themes and findings and gauge whether they were consistent with their experiences in settings. The outcomes of this day then fed back into the next stage of the analysis. Finally, we used the observation data and the video case studies to validate and triangulate the analysis of the qualitative data.
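The report does not detail the exploratory statistical analysis applied to the observation and environmental rating data. As a purely hypothetical sketch of the kind of descriptive comparison such an analysis might involve, rating scores could be grouped by implementation stage along these lines; all column names and scores below are invented for illustration.

```python
# Hypothetical sketch of exploratory analysis of environmental rating data,
# grouped by ET implementation stage. Names and scores are invented;
# the report does not specify the actual procedure used.
import pandas as pd

ratings = pd.DataFrame({
    "centre": ["A", "B", "C", "D", "E", "F"],
    "stage": [1, 1, 2, 2, 3, 3],                        # ET implementation stage
    "ecers_language": [5.4, 6.1, 4.8, 5.2, 4.1, 4.5],   # illustrative subscale scores
})

# Simple descriptive comparison across stages (mean, spread, range).
summary = ratings.groupby("stage")["ecers_language"].agg(["mean", "std", "min", "max"])
print(summary)
```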

Ethics

Ethical approval for the research was gained from the University of Wolverhampton School of Education ethics committee. Research participants signed a form giving their informed consent and were informed that they would not be identified in the report; identification was unavoidable in the video filming, and the participants concerned gave their consent to this. Soundhouse Media negotiated ethical consent from the five children's centres that were video recorded, from the practitioners participating in the videos and from the parents of the children filmed.

Piloting and training

The observation and environmental ratings tools were piloted in settings not involved in the research. The research team were trained in their use and, having trialled the tools, met to discuss the outcomes and process of using them and to ensure that a high level of inter-researcher reliability had been achieved.
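The report does not state how inter-researcher reliability was quantified after the pilot. One common approach, offered here only as a hypothetical sketch, would be to compare two researchers' ratings of the same pilot setting using simple agreement and Cohen's kappa; the item ratings below are invented.

```python
# Hypothetical check of inter-researcher reliability on pilot ratings.
# The two researchers' item scores are invented; Cohen's kappa is one
# common agreement statistic, not necessarily the one used in the study.
from collections import Counter

researcher_a = [5, 6, 4, 5, 7, 3, 5, 6]   # item ratings by researcher A
researcher_b = [5, 6, 4, 4, 7, 3, 5, 6]   # item ratings by researcher B

n = len(researcher_a)
observed = sum(a == b for a, b in zip(researcher_a, researcher_b)) / n

# Expected agreement by chance, from each rater's marginal distribution.
counts_a, counts_b = Counter(researcher_a), Counter(researcher_b)
expected = sum(counts_a[k] * counts_b.get(k, 0) for k in counts_a) / (n * n)

kappa = (observed - expected) / (1 - expected)
print(f"Observed agreement: {observed:.2f}, Cohen's kappa: {kappa:.2f}")
```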

Internal reference group

Initial plans to hold two semi-formal internal reference group meetings were affected by the delays in recruiting centres for the research. Instead, the internal reference group, which included members of the research team and Professors Tony Bertram and Christine Pascal from the Centre for Research in Early Childhood (CREC), commented at key intervals on the research tools, design and early findings using the secure website established for the project.


1. ECERS-R (Early Childhood Environment Rating Scale – Revised; Harms et al., 2005) and ECERS-E (Early Childhood Environment Rating Scale – Extension; Sylva et al., 2006) are standardised rating tools for measuring and improving the quality of early years provision. They focus on areas such as space and furnishing; language and reasoning; interaction; literacy; and diversity. Further details can be found in Appendix 2.
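To illustrate how such ratings are typically aggregated (a sketch only; the item names and scores below are hypothetical and not taken from the study), ECERS-style scales score each observed item from 1 (inadequate) to 7 (excellent) and average the items within each subscale.

```python
# Hypothetical illustration of how an ECERS-style subscale score is derived:
# each item is rated 1 (inadequate) to 7 (excellent) and items are averaged
# within a subscale. Item names and scores are invented for illustration.
subscales = {
    "Language and reasoning": {
        "Books and pictures": 5,
        "Encouraging children to communicate": 6,
        "Using language to develop reasoning": 4,
        "Informal use of language": 5,
    },
}

for name, items in subscales.items():
    mean_score = sum(items.values()) / len(items)
    print(f"{name}: mean item score {mean_score:.2f} across {len(items)} items")
```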