Evaluation
Evaluation helps make the best use of resources and disseminate good practice. Supervised toothbrushing programmes must be evaluated to assess whether the programme was implemented as planned and whether targets for coverage and uptake have been achieved. Evaluation can also capture the learning and reflections of key stakeholders and inform recommendations for the future.
Key evaluation questions and potential sources of information
Were the supervised toothbrushing programme activities delivered as intended?
- Quantitative (numerical) data from termly monitoring reports on coverage and uptake, including the number of sites involved and the number of children brushing daily.
- Quality audits.
Were the short and medium-term outcomes achieved? This can be assessed using interviews or questionnaires with:
- Early years/school staff, about whether the supervised toothbrushing programme achieved its objectives, including aspects such as the suitability of training, day-to-day implementation, and feedback from children and parents.
- Parents/carers, on their perspectives on supervised toothbrushing programmes within nurseries/schools, including how toothbrushing in nurseries/schools influences home toothbrushing. A template questionnaire for parents/carers to gain their views on supervised toothbrushing programmes can be downloaded from here.
Also consider what key learning and insights about supervised toothbrushing programme implementation have been gained from other stakeholders, such as local oral health improvement staff, any dental practices involved, and family hubs.
Long-term outcomes may be evaluated through routinely collected survey or clinical data:
- National oral health surveys of five-year-olds (e.g. dmft – decayed, missing, filled teeth), which are typically conducted every four years.
- Number of fillings and extractions carried out by NHS dentists.
- Number of child dental referrals requiring a general anaesthetic.
However, using these clinical outcomes to evaluate a supervised toothbrushing programme requires consideration of the time it takes for changes in tooth decay to be observed (around 24-36 months), the size of the survey sample used and the limitations of existing data sets.
Key Performance Indicators
It is important that monitoring reports, produced by the provider, include information that relates specifically to the supervised toothbrushing programme and not to any other oral health intervention that may be running alongside it. The provider will need to report on the Key Performance Indicators (KPIs) agreed in the service specification. The following are some examples of KPIs that should be included in a monitoring report; an illustrative sketch of how headline coverage and uptake figures might be derived follows the Coverage and Participation list below.
Coverage and Participation
- Number of eligible settings (according to local authority information).
- Number of eligible children within eligible settings.
- Number of children with consent to participate.
- Frequency of supervised brushing (e.g. 5 days per week, 3 days per week).
- Number of participating settings.
- Number and percentage of participating settings and children in each Index of Multiple Deprivation (IMD) decile.
- Number of setting staff receiving training each school term.
- Narrative report of efforts made to engage eligible settings that have declined to participate.
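As an illustration only, the sketch below shows one way these figures might be turned into headline coverage and uptake percentages for a termly monitoring report. The definitions used here (coverage as participating settings as a proportion of eligible settings; uptake as consented children as a proportion of eligible children), and all field names and numbers, are assumptions made for the example and should follow whatever is agreed in the local service specification.

```python
# Illustrative sketch only: the definitions, field names and figures below are
# hypothetical and should mirror the local service specification.
from dataclasses import dataclass

@dataclass
class TermlyReturn:
    eligible_settings: int       # eligible settings, per local authority information
    participating_settings: int  # settings actively delivering supervised brushing
    eligible_children: int       # children registered in eligible settings
    consented_children: int      # children with consent to participate

def coverage_and_uptake(r: TermlyReturn) -> dict:
    """Headline percentages for a termly monitoring report (assumed definitions)."""
    coverage = 100 * r.participating_settings / r.eligible_settings
    uptake = 100 * r.consented_children / r.eligible_children
    return {
        "setting coverage (%)": round(coverage, 1),
        "child uptake (%)": round(uptake, 1),
    }

# Hypothetical termly figures for one area
example = TermlyReturn(
    eligible_settings=120,
    participating_settings=84,
    eligible_children=3600,
    consented_children=2950,
)
print(coverage_and_uptake(example))
# {'setting coverage (%)': 70.0, 'child uptake (%)': 81.9}
```

The same approach could be extended to the IMD decile breakdown, for example by grouping settings and children by decile before calculating the percentages.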
Quality assurance
- Number of quality audits undertaken in participating settings, and the method used (in person, virtual, self-assessment).
- Number of settings where the programme has been suspended, and the reasons given.
Reflections
- Note any challenges or feedback, and efforts made to address these.
Challenges
Obtaining information for KPIs can sometimes be challenging due to changes in:
- The total number of eligible settings in each area (for example, because of closures).
- The total number of registered children within settings.
To mitigate this challenge, providers of supervised toothbrushing programmes should engage with local authority partners who may already have access to up-to-date information.
Logic Model for Supervised Toothbrushing Programmes
A logic model is provided below which shows the intended outcomes of the programme in more detail. It serves as a visual roadmap of the relationships between the programme's activities and its intended effects, making it a useful tool for planning, implementing, monitoring and evaluating a programme. A logic model presents the relationships between the inputs or resources (what you need to provide), activities (what you do), outputs (what you produce), outcomes (the results) and the overarching ultimate goal or impact. It helps stakeholders think about the activities they are involved in, what they hope to achieve and what needs to be done to make it happen.