Talk Test Treat Trace

Chapter 6: Health service data, continuous quality improvement and program evaluation

In this chapter

Key points

  • High quality health service data can be used to plan, implement and evaluate sexually transmitted infection (STI) and blood-borne virus (BBV) programs and improve health outcomes.
  • Health information systems (HIS) templates, prompts and recalls should be reviewed and updated regularly to support best practice guidelines for STI and BBV testing and management.
  • Key performance indicators (KPIs) are measurable values used by funding bodies and health services to report against the progress and outcomes of strategies and measure the success of programs over time.
  • Continuous quality improvement (CQI) refers to a process whereby a systematic and cyclical approach is used to improve health outcomes.
  • To be effective, CQI programs need to engage appropriate staff in the process and use a team approach to develop plans and to identify and implement actions.
  • KPIs may be used to measure the success of CQI programs, and action plans can assist with their planning, implementation and evaluation.
  • Evaluation of the process, impact and outcomes can be simplified and guided by the aims of the program as well as the time and resources available.

Background

Health service data is used for many functions relating to clinical care, health service programs and public health activities. Access to timely and high quality STI and BBV health service data enables:

  • improved continuity of clinical care, case management and follow-up
  • reporting against national and state-based sexual health and BBV strategies
  • reporting against KPIs required by funders and managers
  • measuring of CQI activities
  • feedback to stakeholders such as funders, health service management and staff, other relevant organisations, community members, representatives and leaders
  • monitoring and evaluation of short, medium and long-term goals
  • public health requirements and functions such as:
    • reporting of notifiable diseases
    • identifying emerging infections and outbreaks
    • monitoring antibiotic resistance
    • identifying patterns and distribution of diseases within populations
  • informed resource allocation at both public health and clinical service level
  • data-driven planning, implementation and evaluation of research projects.

Health information systems (HIS)

Digital HIS are integral to accessing and managing quality data. They should enable the following:

  • security and confidentiality of personal information
  • data such as KPIs to be entered and extracted in a way that is straightforward, standardised and not reliant on individual users
  • monitoring, evaluation and reporting of program activities
  • templates that support the integration of STI and BBV testing into routine health screening and are aligned with current recommendations regarding who and how often to test
  • prompts and recalls to enhance appropriate case management, continuity of care and follow-up.

Information technology is a dynamic area and health services should regularly review and update systems to ensure they support best practice and are used effectively to improve health outcomes. Changes to HIS should be consistent with current guidelines and made with the involvement and approval of health service management and staff.
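
As an illustration of how a recall might be generated from HIS data, the minimal sketch below flags patients whose positive STI result was 60 to 120 days ago and who have no retest recorded since. The record structure and field names are hypothetical and for illustration only; a real recall would use the recall functions and fields of the service’s own HIS.

```python
from datetime import date
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical, simplified record for a patient with a positive STI result,
# as it might be extracted from a HIS. Field names are illustrative only.
@dataclass
class PositiveCase:
    patient_id: str
    positive_result_date: date        # date of the positive STI result
    last_retest_date: Optional[date]  # date of any STI test since that result, if one exists

def due_for_retest(cases: List[PositiveCase], today: date) -> List[str]:
    """Return IDs of patients 60-120 days past a positive result with no retest recorded."""
    due = []
    for c in cases:
        days_since_positive = (today - c.positive_result_date).days
        not_retested = c.last_retest_date is None or c.last_retest_date <= c.positive_result_date
        if 60 <= days_since_positive <= 120 and not_retested:
            due.append(c.patient_id)
    return due

# Example: one patient falls inside the 60-120 day recall window and has not been retested.
cases = [
    PositiveCase("A001", date(2023, 3, 1), None),
    PositiveCase("A002", date(2023, 1, 5), date(2023, 4, 10)),
]
print(due_for_retest(cases, today=date(2023, 5, 15)))  # ['A001']
```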

Notification of infectious diseases

The notification of certain infectious diseases and related conditions is required by law and enables important public health functions, such as the monitoring, control and prevention of diseases, identifying and responding to outbreaks, monitoring antibiotic sensitivities and identifying the pattern and distribution of diseases within populations.

In WA, notifications are made under the Public Health Act 2016 and the Public Health Regulations 2017, which require STIs and BBVs to be notified by laboratories and by medical officers or nurse practitioners on the basis of case definitions, which may include laboratory test results and clinical findings. Notifications are entered into the WA Notifiable Infectious Diseases Database (WANIDD), which is managed by the Communicable Disease Control Directorate (CDCD). The CDCD also publishes regular epidemiological reports on the Department of Health’s website. Access to WANIDD is restricted to specific CDCD and regional public health unit (PHU) staff. More information about notification of STIs and BBVs is available at: https://ww2.health.wa.gov.au/Silver-book

Key performance indicators (KPIs)

KPIs are measurable values that can be used by funding bodies, health services and organisations to:

  • assist with monitoring, evaluating and measuring the success of programs over time
  • report against progress and outcomes of national and state strategies, and regional and local action plans
  • ensure the appropriate allocation of resources and cost-effectiveness of programs.

At a health service level, KPIs can provide important information that can be used effectively as part of continuous quality improvement processes and strategies to improve primary healthcare delivery. These indicators should not only capture key information but enable quality data to be extracted and collated easily and in a timely manner. They need to align with current programs and strategies, but also be dynamic in response to outcomes and emerging health issues.

At a funding level, KPIs are used increasingly to support progress on national strategies such as the Council of Australian Governments (COAG) Closing the Gap targets and national health goals set out in the implementation plan for the National Aboriginal and Torres Strait Islander Health Plan 2013–2023.

While a nationally agreed set of KPIs for STIs and BBVs is currently lacking, work is progressing in WA (in partnership with various stakeholders) to develop consistent and effective indicators. KPIs have been developed as part of research projects, such as Test, Treat, ANd GO 2 (TTANGO2) and STRIVE*, that have been implemented in WA in recent years. Research projects are usually funded at a level that allows the time and expertise needed for detailed analysis and reporting of project outcomes. While it may be impractical for health services to conduct the same level of evaluation, KPIs can be adapted so that outcomes can be measured and monitored in a way that is practical, achievable and sustainable.

*STI in Remote communities: ImproVed & Enhanced primary health care (STRIVE) was an STI quality improvement research project conducted by the Kirby Institute between 2011 and 2013 with participation from government and the Aboriginal Community Controlled Health Services (ACCHS) in the NT, Qld and the Kimberley region of WA. As part of STRIVE, STI templates were developed and made available for use beyond the life of the study in HIS such as Communicare.

TTANGO2 Project

TTANGO2 builds on a research trial, Test, Treat, ANd GO (TTANGO), conducted in participating remote primary healthcare services between 2011 and 2016 to determine the acceptability, performance and short-term health impacts of point of care testing (POCT) for chlamydia and gonorrhoea. TTANGO2 extends this work by adding testing for trichomonas and wider implementation, and will be evaluated to determine the uptake, sustainability and impact of POCT in settings with high STI prevalence.

The key performance and process indicators developed for TTANGO2 have been adapted and used beyond the research project in primary healthcare services in Western Australia (WA). They are outlined below and provide an example of indicators that services could use to evaluate a CQI project.

KPIs for STIs, adapted from TTANGO2 (a simple computational sketch of selected indicators follows the lists below):

  • Age is reported in five-year age brackets (15 to 19, 20 to 24, 25 to 29, 30 to 34, 35 and older).
  • Proportion of clinic attendees tested for STIs (STI testing rate).
  • Proportion of current patients tested for STIs once or twice in a 12-month period (STI testing coverage).
  • Proportion of clinic attendees with at least one positive STI test in a 12-month period (unique STI test positivity).
  • Completeness of testing: Proportion of clinic attendees with a positive Chlamydia trachomatis (CT) and/or Neisseria gonorrhoeae (NG) and/or Trichomonas vaginalis (TV) result tested for syphilis and human immunodeficiency virus (HIV) within three months of the date of initial specimen collection.
  • Treatment interval: Time (days) from date of positive STI investigation request to date of treatment.
  • Proportion of clinic attendees retested at three months (60 to 120 days) after an initial positive STI result (STI retesting rate).
  • Proportion of STI retests that were positive (STI repeat positivity rate).

Key process indicators to measure quality of data recording:

  • Among those with an STI test performed, location in the patient management system where STI testing was documented (STI test documentation) e.g. adult health check, STI check, antenatal check, or other.
  • The proportion of laboratory STI tests with a POCT performed at the same consultation (POC test uptake).
  • The proportion of patients with positive STI results who had treatment information recorded in prescription or relevant patient management system/clinical item (not progress notes) (treatment record).
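
To illustrate how a couple of these indicators might be calculated from routinely collected data, the minimal sketch below computes an STI testing rate and a treatment interval from a small, hypothetical, de-identified extract. The field names (sti_tested, positive_request_date, treatment_date) and the extract format are assumptions for illustration only; actual definitions should follow the adapted TTANGO2 indicators above and the fields available in the service’s HIS.

```python
from datetime import date
from statistics import median

# Hypothetical de-identified extract: one row per clinic attendee in the reporting period.
# The two dates are recorded only for attendees with a treated positive STI result.
attendees = [
    {"patient_id": "A001", "sti_tested": True,
     "positive_request_date": date(2023, 2, 1), "treatment_date": date(2023, 2, 8)},
    {"patient_id": "A002", "sti_tested": True,
     "positive_request_date": None, "treatment_date": None},
    {"patient_id": "A003", "sti_tested": False,
     "positive_request_date": None, "treatment_date": None},
]

# STI testing rate: proportion of clinic attendees tested for STIs in the period.
testing_rate = sum(a["sti_tested"] for a in attendees) / len(attendees)

# Treatment interval: days from positive STI investigation request to treatment.
intervals = [(a["treatment_date"] - a["positive_request_date"]).days
             for a in attendees
             if a["positive_request_date"] and a["treatment_date"]]

print(f"STI testing rate: {testing_rate:.0%}")                  # 67%
print(f"Median treatment interval: {median(intervals)} days")   # 7 days
```

The same approach extends to the other indicators, provided the relevant dates, results and documentation locations can be extracted consistently from the HIS.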

Continuous quality improvement (CQI) programs

CQI refers to a process whereby a systematic and cyclical approach is used to improve health outcomes. Health services use CQI to help improve many areas of health, including STI and BBV programs.

While CQI programs may use the same or similar indicators as the KPIs required by funding bodies, they often examine broader indicators that give health services more detail with which to fine-tune programs. This additional detail can be time-consuming to extract and analyse, but can help identify and address gaps in service delivery. To be effective, CQI programs should align with the objectives of national and state strategies, engage appropriate staff in the process, use a team approach to develop plans and implement actions, and focus on goals that will lead to improved health outcomes.

There are different ways of planning for CQI programs but it is helpful to start with an action plan that can identify key aims, strategies and outcomes. An action plan could be developed to encompass the whole program or may be used as a starting point for one part of a program that can be built on over time. An example of an action plan can be found on the National Aboriginal Community Controlled Health Organisation (NACCHO) website. It uses the following five components:

  • Aims: what are you trying to achieve?
  • Strategies: how will you do this?
  • Performance indicators: how will you measure performance?
  • Targets: what are your targets?
  • Timeframe: when will this be delivered?

https://www.naccho.org.au

CQI is an ongoing process that uses a plan, do, study, act (PDSA) cycle to assist with identifying and enacting changes that will lead to improvements in outcomes. CQI programs are dynamic and should be reviewed and refined over time to ensure that actions are being implemented and new goals are being set to ensure ongoing improvements. While an overall CQI program is an ongoing process, there may be components of the program that will be a one-off activity (outlined in the action plan).

CQI programs use a cycle that has the following or similar components:

  • Plan: identify goals or an opportunity for change
  • Do: implement the change on a small scale
  • Study or check: use data to analyse whether the change has been effective
  • Act or adjust: if successful, implement the change on a wider scale and continue to assess results. If not successful, review or update the plan and begin the cycle again.

Diagram 1. Cycle of continuous quality improvement programs (Source: SiREN toolkit)

When developing CQI programs, consider whether the goals and outcomes are feasible and achievable given time and resource constraints, and whether assistance is available from other organisations to help achieve the goals of the program. A team approach is critical to the success of CQI programs, and action plans can be used to clarify the roles and responsibilities of individual team members and to provide timeframes for delivering outcomes. While CQI involves a team approach, it is important to identify one or two staff members to be responsible for driving the program to ensure timelines are met and actions are implemented.

Developing an action plan

Things to consider when developing an action plan include the following (a simple structured example is sketched after the list):

  • What are the aims of the project?
  • Do the aims align with the broad objectives of national and WA strategies?
  • What are the strategies and actions needed to meet the aims?
  • What is the timeframe for the overall project and milestones?
  • What key performance or other indicators will be used to measure or evaluate the project?
  • What are the targets of the program?
  • How will data be entered into the HIS and will data entry and extraction be straightforward, consistent and reliable?
  • How will information be fed back to relevant staff and other stakeholders e.g. written reports, verbal feedback?
  • Will the project be done in a cost-effective way?
  • How will milestones or outcomes be actioned?
  • Who is responsible for the different components of the program and action plan? That is:
    • Who will manage or drive the program?
    • Who will do the data entry, extraction and analysis?
    • Who will provide written and verbal feedback to stakeholders?
    • Who will implement the actions and outcomes of the project?

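To make these questions concrete, the sketch below shows one way a service might record the answers as a simple, structured action plan built around the five components listed earlier, with a responsibilities field reflecting the questions above. The project described (lifting STI testing among 15 to 29 year olds, echoing the hypothetical example in Appendix 2) and all of its values are illustrative only.

```python
# Hypothetical action plan recorded as structured data, using the five components
# described above. All content is illustrative only.
action_plan = {
    "aims": "Increase STI testing among 15 to 29 year olds attending the service",
    "strategies": [
        "Update HIS templates and recalls to prompt opportunistic STI testing",
        "Provide staff training on offering STI and BBV testing",
    ],
    "performance_indicators": [
        "STI testing rate for 15 to 29 year olds, by five-year age group and gender",
    ],
    "targets": "Agreed local target for the testing rate at 12 months",
    "timeframe": "Review at 6 and 12 months",
    "responsibilities": {
        "program lead": "Nominated clinic staff member",
        "data extraction and analysis": "Practice nurse or health information officer",
        "feedback to stakeholders": "Program lead, via written report and team meetings",
    },
}
```
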
Evaluation

Evaluating programs is important to ensure that the aims were met and that the outcomes were worth the time and resources used. The Sexual Health and Blood-borne Virus Applied Research and Evaluation Network (SiREN) Sexual Health and Blood-borne Virus Program planning toolkit provides detail about how to conduct an evaluation that includes evaluating the process, impact and outcomes. While this toolkit provides a comprehensive overview, evaluation of programs can be simplified, guided by the aims as well as the time and resources available. For example, it may be feasible for services to measure some key outcomes but it may not be simple to measure the impact of a program on the priority population. Failure to meet some key goals does not necessarily mean that the program was not successful overall, but it does mean that parts of the program may need to be reviewed and refined before repeating.

Evaluating the process, impact and outcomes of programs (adapted from the SiREN Sexual Health and Blood-borne Virus Program planning toolkit)

Process evaluation is used to measure program planning, delivery and progress. It can be conducted throughout the program and assesses its quality, appropriateness and cost-effectiveness. Questions to ask include: did the program reach and satisfy the needs of the target audience, what did or didn’t work well, and what could be done differently to improve the program?

Impact evaluation measures the immediate, short-term effects of the program. It is related to the program objectives and can measure short-term changes in behaviour, knowledge, participation, policy and risk factors. It is undertaken on completion of certain stages during implementation, on completion of the program, or both. It could involve questions such as: What proportion of the target group has heard of the program strategies? Has there been a change in behaviour, such as an increase in the uptake of STI or BBV testing? Have more condoms been dispensed? Did the program increase STI and BBV knowledge, skills or management? What changed as a result of the program?

Outcome evaluation measures long-term program effects and assesses whether, or to what extent, the program goal has been achieved. It may be conducted from a few weeks to several years after the completion of a program. Long-term changes may include decreases in the incidence or prevalence of STIs or BBVs and sustainable behaviour change.

Evaluation may involve the use of both qualitative and quantitative methods of data collection:

  • Qualitative evaluation methods can use interviews and questionnaires to assess or describe the thoughts or feelings of participants about the intervention or program.
  • Quantitative evaluation methods use numbers, frequencies, percentages and statistics to measure change.

Remember that for CQI programs that aim to address gaps in systems and health service delivery, the target audience may be the clinic staff rather than the priority population who will ultimately benefit from those improvements or outcomes. The target audience could also be both staff and the priority population, as with programs that aim to improve access to services, increase the uptake of testing and improve the management of STIs and BBVs.

Data analysis

Analysis of programs can involve analysing qualitative or quantitative data, or both. Large, well-funded research programs often use highly skilled staff to conduct the detailed analysis needed for their evaluations. While the evaluation and analysis of a project might sound daunting, it doesn’t have to be difficult. Evaluation can be as simple as conducting interviews with participants or asking them to fill out a short evaluation form to get their feedback. Quantitative data analysis can look at simple indicators that can be extracted from HIS, such as the proportion of 15 to 29 year olds, by five-year age group and gender, who attended the service and had STI or BBV testing in a given timeframe before and after an intervention.
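
As a minimal sketch of that kind of before-and-after comparison, the code below groups a small, hypothetical, de-identified attendance extract into five-year age bands and reports the proportion tested in each period, by age band and gender. The row format and field names are assumptions for illustration; an actual analysis would use whatever the service’s HIS can export.

```python
from collections import defaultdict

# Hypothetical de-identified rows: one per attendee per period ("before" or "after" the intervention).
rows = [
    {"period": "before", "age": 17, "gender": "F", "sti_tested": False},
    {"period": "before", "age": 22, "gender": "M", "sti_tested": True},
    {"period": "after",  "age": 18, "gender": "F", "sti_tested": True},
    {"period": "after",  "age": 27, "gender": "M", "sti_tested": True},
]

def age_band(age: int) -> str:
    """Five-year age bands, e.g. 15 to 19, 20 to 24, 25 to 29."""
    lower = (age // 5) * 5
    return f"{lower} to {lower + 4}"

# Count attendees and tests for each period / age band / gender combination (15-29 year olds only).
counts = defaultdict(lambda: {"attended": 0, "tested": 0})
for r in rows:
    if 15 <= r["age"] <= 29:
        key = (r["period"], age_band(r["age"]), r["gender"])
        counts[key]["attended"] += 1
        counts[key]["tested"] += int(r["sti_tested"])

for (period, band, gender), c in sorted(counts.items()):
    proportion = c["tested"] / c["attended"]
    print(f"{period:6} {band:8} {gender}: {c['tested']}/{c['attended']} tested ({proportion:.0%})")
```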

From the outset of a CQI project, it is important to think about which KPIs would be useful to measure, how feasible it will be to obtain or extract that information, and how to conduct a simple but meaningful analysis. Ensure that the information or data is appropriately managed with regard to how it is reported, how it is fed back to participants and stakeholders, and how it is stored, so that it can be used as a baseline or interim measure against which to assess future progress.

Act or adjust to implement change

Evaluation of the program should help to identify whether it was worthwhile and cost-effective, and whether it identified gaps that could be addressed or successes that could be expanded or built upon. Taking further action to adjust, progress or implement changes is an important part of the CQI process. This often involves team discussion to determine what action is needed or feasible, assign tasks and responsibilities to appropriate staff, and set timeframes in which to progress or complete actions. Appendix 2 provides a hypothetical example of how a health service could develop a CQI program and action plan in response to findings of low STI testing rates among 15 to 29 year olds attending the service.
