Update on Patient Matching Activities
I have written several times about patient matching in the US, both in a blog entry and a published article. On December 11, 2017, the Office of the National Coordinator for Health Information Technology (ONC) sponsored a half-day “Interoperability in Action” webinar on Patient Matching Milestones at ONC (see slides). The webinar covered four ONC projects from the past year. Here’s a quick run-down of what they presented.
The first section focused on the Patient Matching Algorithm Challenge (PMAC), which ran earlier in the year and culminated in the announcement of a set of winners. The purpose of the challenge was to let vendors compete for the highest performance metrics for their matching algorithms by testing their software against a large set of test data provided by ONC. Cash prizes were awarded in a number of categories, and the winning vendors were featured in the discussion on the webinar. One of the main purposes of the challenge was to promote the use of standard metrics to evaluate algorithm products.
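To make the metrics piece concrete, here is a minimal Python sketch (not the challenge’s actual scoring code) of how an algorithm’s match decisions are typically scored against known truth using precision, recall, and F-score; the record IDs and pairs are invented for illustration.

```python
# Illustrative sketch of standard record-matching metrics (precision, recall,
# F-score). Not the scoring code used in the Patient Matching Algorithm Challenge.

def matching_metrics(predicted_matches, true_matches):
    """Compare an algorithm's predicted match pairs against known true pairs.

    Both arguments are sets of frozensets, each containing two record IDs.
    """
    true_positives = len(predicted_matches & true_matches)
    false_positives = len(predicted_matches - true_matches)
    false_negatives = len(true_matches - predicted_matches)

    precision = true_positives / (true_positives + false_positives) if predicted_matches else 0.0
    recall = true_positives / (true_positives + false_negatives) if true_matches else 0.0
    f_score = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
    return {"precision": precision, "recall": recall, "f_score": f_score}


# Hypothetical example: three predicted pairs, two of which are correct.
predicted = {frozenset({"A1", "B7"}), frozenset({"A2", "B9"}), frozenset({"A3", "B4"})}
truth = {frozenset({"A1", "B7"}), frozenset({"A3", "B4"}), frozenset({"A5", "B2"})}
print(matching_metrics(predicted, truth))  # precision, recall, and F-score each ~0.67
```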
I certainly applaud ONC’s interest in developing and promoting standard metrics, but I was a little concerned that the winners, by their own admission, “analyzed patterns in the data.” This calls into question, to me at least, the applicability of their results to the “real world,” where you don’t get to see the data set in advance and must adjudicate records as they come in. In other words, these particular test runs were “tuned” to the data set, and the measured results might not hold up on other data sets.
The second section of the webinar focused on the Gold Standard and Algorithm Testing (GSAT) Pilot, a project funded by ONC and executed by a consortium of organizations including OCHIN, the Kaiser Permanente Center for Health Research, and the MITRE Corporation. The project used records in the OCHIN-hosted EHR database to develop a set of record pairs adjudicated by specialists, establishing a gold standard data set that could be used for algorithm testing. When it is known with certainty whether the two records in a pair match, an algorithm can be objectively tested and its matching results compared to the known answer (such a data set is sometimes referred to as “training data” for a matching algorithm). Much of the presentation was then spent on how the team used this data set to better understand OCHIN’s own data. ONC also confirmed that it did not use this gold standard data set for the Patient Matching Algorithm Challenge described above, and it is unclear whether the data set will be made available to others for their own testing.
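As a concrete illustration of how an adjudicated gold standard gets used to test an algorithm, here is a small Python sketch; the records, field names, and the deliberately naive exact-match rule are my own assumptions, not the GSAT Pilot’s actual data or methods.

```python
# Illustrative sketch: testing a matching algorithm against adjudicated
# "gold standard" record pairs. All data and rules below are invented.

def simple_matcher(rec_a, rec_b):
    """A deliberately naive rule: call it a match only if last name,
    first name, and date of birth all agree exactly."""
    return all(rec_a[f] == rec_b[f] for f in ("last_name", "first_name", "dob"))

# Each gold-standard entry is (record_a, record_b, adjudicated_is_match).
gold_standard = [
    ({"last_name": "Smith", "first_name": "Ann", "dob": "1980-02-14"},
     {"last_name": "Smith", "first_name": "Ann", "dob": "1980-02-14"}, True),
    ({"last_name": "Smith", "first_name": "Ann", "dob": "1980-02-14"},
     {"last_name": "Smyth", "first_name": "Ann", "dob": "1980-02-14"}, True),   # same person, typo
    ({"last_name": "Jones", "first_name": "Bob", "dob": "1975-07-01"},
     {"last_name": "Jones", "first_name": "Robert", "dob": "1990-03-30"}, False),
]

correct = sum(simple_matcher(a, b) == truth for a, b, truth in gold_standard)
print(f"Agreement with adjudicated answers: {correct}/{len(gold_standard)}")
# The naive rule misses the typo pair, which is exactly why real algorithms
# need fuzzy comparison and why a trustworthy gold standard matters for testing.
```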
The third section of the webinar described the Patient Demographic Data Quality (PDDQ) initiative and the toolkit that was developed to help provider sites improve the quality of the demographic data in their EHRs. This work grew out of the ONC Patient Matching Community of Practice, in which I participated back in 2015 (see the group’s final white paper and testing guidelines). Frankly, I had never heard of PDDQ until the webinar. The toolkit is detailed – excruciatingly detailed – and covers such topics as data governance, data quality, data operations, and platform & standards. Each section has an online evaluation tool that allows a provider site to score its progress toward best practice in that area.
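For readers who want a sense of what automated checking of demographic data quality might look like alongside a framework like PDDQ, here is a small, purely illustrative Python sketch; the fields, validity rules, and sample records are assumptions on my part, not anything drawn from the PDDQ toolkit itself.

```python
# Illustrative sketch of simple demographic data quality checks.
# Fields, rules, and records are assumptions, not part of the PDDQ toolkit.
import re
from datetime import datetime

def field_issues(record):
    """Return a list of data quality problems found in one registration record."""
    issues = []
    for field in ("last_name", "first_name", "dob", "sex", "zip"):
        if not record.get(field):
            issues.append(f"missing {field}")
    if record.get("dob"):
        try:
            datetime.strptime(record["dob"], "%Y-%m-%d")
        except ValueError:
            issues.append("dob not in YYYY-MM-DD format")
    if record.get("zip") and not re.fullmatch(r"\d{5}(-\d{4})?", record["zip"]):
        issues.append("zip not a valid US ZIP code")
    return issues

registrations = [
    {"last_name": "Smith", "first_name": "Ann", "dob": "1980-02-14", "sex": "F", "zip": "19104"},
    {"last_name": "JONES", "first_name": "", "dob": "07/01/1975", "sex": "M", "zip": "191"},
]

for rec in registrations:
    print(rec["last_name"], field_issues(rec) or "clean")
```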
The fourth and final section of the webinar described the Data Quality Framework (DQF) Pilot, a collaboration between OCHIN and the Kaiser Permanente Center for Health Research whose goal was to test the advice offered by the PDDQ initiative by conducting a careful intervention in three health center organizations that are part of the OCHIN network. After conducting a literature review to better understand the issues, the project team assessed the patient registration workflows at the target sites and then trained the sites in new workflow techniques based on PDDQ concepts. They then scored each site on the effectiveness of its revised workflow using the evaluation tools included in PDDQ. While there was a modest decrease in duplicate records and a moderate increase in the sites’ PDDQ scores, the complexity and time-intensive nature of the intervention seemed to be real limitations.
The studies and projects described in this webinar were actually quite impressive, though there was no time allocated for audience Q&A. I also had a sense that, despite some obvious connections between the projects (like the use of PDDQ by the final presenters), there was not as much coordination or cross-learning among these activities as I might have liked.
ONC continues to shy away from any direct engagement on the topic of a national patient identifier, even though it is certainly permitted to discuss (as opposed to implement) such a strategy under Congressional instructions. The 21st Century Cures Act instructs the GAO to conduct a study on patient matching, including a review of ONC and other stakeholders in these activities, with particular emphasis on private-sector efforts. Interestingly, in November 2017 the College of Healthcare Information Management Executives (CHIME) announced that it was suspending its $1 million national patient ID challenge. An additional useful resource recently published by ONC is a strategic implementation guide for Identity Management developed by the ONC-SIM Health IT Resource Center.
- Tags:
- 21st Century Cures Act
- College of Healthcare Information Management Executives (CHIME)
- Data Quality Framework (DQF)
- DQF Pilot
- Gold Standard and Algorithm Testing (GSAT)
- GSAT Pilot project
- Kaiser Permanente Center for Health Research
- MITRE Corporation
- Noam H. Arzt
- OCHIN
- Office of the National Coordinator for Health Information Technology (ONC)
- ONC Patient Matching Community of Practice
- ONC-SIM Health IT Resource Center
- Patient Demographic Data Quality (PDDQ) initiative
- patient matching
- Patient Matching Algorithm Challenge (PMAC)