Use of real-world data to evaluate user engagement and resource use for a digital health technology


Context

Chronic obstructive pulmonary disease (COPD) is a common condition in adults. Symptoms include breathing difficulties and persistent cough. People with COPD usually get worse over time, have frequent chest infections, and can have periods of sudden worsening, known as exacerbations.

A NICE medical technology evaluation considered myCOPD, a digital tool for people with COPD and their healthcare professionals that supports self-management and remote monitoring. It is accessible online via devices such as smartphones and computers. Through the app, users can keep track of their symptoms and prescriptions. The app also contains educational modules to support symptom scoring and inhaler technique, as well as a virtual pulmonary rehabilitation course. When users give permission, data can be shared with their clinical teams for remote monitoring.

The final guidance on myCOPD, published in March 2022, recommended further research because of remaining uncertainties around its clinical benefits and its impact on healthcare resource use.

Key evidence

The evidence consisted of three small randomised controlled trials (each with between 41 and 90 participants and follow-up of 3 months or less) and local service evaluations. The committee highlighted the need for real-world evidence, particularly to understand the uptake of myCOPD in routine practice. However, the methodology, patient numbers and characteristics, clinical outcomes and follow-up periods were often not fully reported in these local service evaluations.

One real-world evidence study, a service evaluation in NHS Highland, was reported in a peer-reviewed journal (Cooper et al. 2022). This report showcases several elements of high-quality reporting, as specified in NICE’s real-world evidence framework.


NHS Highland service evaluation

This was a pragmatic real-world feasibility study conducted in 2019. The target population was people with COPD and the setting was routine community-based care.

Engagement data was collected through the app and included educational module use and the frequency of symptom scoring.

Health service use data was collected from patient electronic health records and included hospital admissions, inpatient bed days, home visits, and out-of-hours contacts with health services.

Data collection covered the 12 months before enrolment and up to 12 months after activating myCOPD.
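
To make the before-and-after design concrete, the following is a minimal sketch in Python (pandas) of how health service events could be assigned to pre- and post-activation windows. The column names, dates and event types are invented for illustration, and the sketch simplifies the design by using a single index date per patient rather than separate enrolment and activation dates.

  import pandas as pd

  # Hypothetical event-level data: one row per health service contact.
  events = pd.DataFrame({
      "patient_id": [1, 1, 2],
      "event_date": pd.to_datetime(["2018-11-02", "2019-06-15", "2019-01-20"]),
      "event_type": ["admission", "admission", "home_visit"],
  })

  # Hypothetical index dates: one activation date per patient.
  activation = pd.DataFrame({
      "patient_id": [1, 2],
      "activation_date": pd.to_datetime(["2019-03-01", "2019-02-10"]),
  })

  df = events.merge(activation, on="patient_id")
  offset = df["event_date"] - df["activation_date"]

  # Keep events within 12 months either side of the index date, then label
  # each one as falling before or after activation.
  in_window = offset.abs() <= pd.Timedelta(days=365)
  df = df.loc[in_window].copy()
  df["period"] = (offset.loc[in_window] < pd.Timedelta(0)).map(
      {True: "before", False: "after"})

  # Event counts per patient, period and type: the raw material for a
  # before-and-after comparison of resource use.
  print(df.groupby(["patient_id", "period", "event_type"]).size())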


Reporting patient flow through the study

NICE’s real-world evidence framework recommends using diagrams to visualise the number of participants at each stage of a study, from the total number of potential participants to the number included in the final analytical dataset, with reasons for any exclusions explained. This can help NICE committees assess the risk of bias, for example, where patients who were not included are likely to differ from those included on important characteristics.

The participant flow through the service evaluation was easy to follow from the reported participant attrition chart (see Figure 1), which showed the numbers of people who:

  • were approached
  • enrolled in the study
  • activated myCOPD
  • declined to participate
  • dropped out of the study.

The reasons for not participating or dropping out were also reported clearly.
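
As a purely illustrative aside, the short Python sketch below shows how the counts and reasons behind a flow chart like Figure 1 might be tallied from a recruitment log. The statuses, reasons and column names are assumptions made for the example, not data from the study.

  import pandas as pd

  # Hypothetical recruitment log: one row per person approached, with assumed
  # 'status' and 'reason' columns.
  log = pd.DataFrame({
      "status": ["activated", "declined", "enrolled_not_activated",
                 "activated", "dropped_out", "declined"],
      "reason": [None, "not interested", "no compatible device",
                 None, "too unwell", "prefers usual care"],
  })

  n_approached = len(log)
  n_declined = (log["status"] == "declined").sum()
  n_enrolled = n_approached - n_declined
  # Assumes that people who dropped out had activated the app first.
  n_activated = log["status"].isin(["activated", "dropped_out"]).sum()
  n_dropped_out = (log["status"] == "dropped_out").sum()

  print(f"approached {n_approached}, enrolled {n_enrolled}, "
        f"activated {n_activated}, dropped out {n_dropped_out}")

  # Reasons for declining or dropping out, to report alongside the counts.
  print(log.loc[log["reason"].notna()].groupby(["status", "reason"]).size())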

Figure 1: participant attrition flow chart

Adapted from: Cooper R, Giangreco A, Duffy M et al. (2022) Evaluation of myCOPD Digital Self-management Technology in a Remote and Rural Population: Real-world Feasibility Study. JMIR mHealth and uHealth 10(2): e30782

Licensed under CC BY 4.0.

Figure 2: age distribution for myCOPD adopters compared to national COPD patient data

Adapted with permission from unpublished data provided by the authors of Cooper et al. (2022).

What else did this study do well?

The study also demonstrated other aspects of good design and reporting of results. The following were aligned with the principles outlined in NICE’s real-world evidence framework:

  • a clear, detailed description of study participant characteristics was provided
  • using available national data, study participant characteristics were compared to the wider COPD population to understand generalisability (see Figure 2)
  • key outcomes for decision making were captured, for example, data on ongoing engagement with the technology were reported, not just initiation of use
  • comparative effectiveness was assessed by comparing resource use in the 12 months before and after enrolment, including confidence intervals (a simple illustration of this kind of analysis is sketched after this list)
  • important subgroups were analysed, for example, exploring characteristics and outcomes of more engaged users.
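
For readers less familiar with this kind of analysis, the following is a minimal sketch of a paired before-and-after comparison with a confidence interval. The per-patient admission counts are invented and the use of a paired t interval is an assumption for illustration only; the study’s own methods may differ.

  import numpy as np
  from scipy import stats

  # Invented per-patient hospital admission counts for the 12 months before
  # and after activation (paired by patient).
  before = np.array([2, 0, 1, 3, 1, 0, 2, 1])
  after = np.array([1, 0, 0, 2, 1, 0, 1, 0])

  diff = after - before
  mean_change = diff.mean()

  # 95% confidence interval for the mean change, using a paired t interval.
  ci_low, ci_high = stats.t.interval(0.95, len(diff) - 1,
                                     loc=mean_change, scale=stats.sem(diff))
  print(f"mean change: {mean_change:.2f} admissions per patient "
        f"(95% CI {ci_low:.2f} to {ci_high:.2f})")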

What else could have been done?

Although this study demonstrated aspects of good reporting, the committee decided that the overall evidence remained uncertain. For example, uncertainty remained around the uptake of myCOPD across different regions and settings, and around its use and effectiveness for both self-management and pulmonary rehabilitation.

Developers of this real-world evidence study could have further improved the transparency of their evidence to the decision-making committee by providing:

  • access to a study protocol prepared before conducting the study
  • a more detailed assessment of data suitability, including provenance, quality, and relevance
  • complete reporting of results from all analyses.

Detailed best practice principles relating to the conduct, reporting, and presentation of real-world evidence studies can be found in NICE’s real-world evidence framework.
